
vignes
Joined September 25, 2024
How do I change this default prompt? The only way I know of right now is to modify the llama source directly... is this the right way?
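There is usually no need to patch the library source: recent llama_index.core releases expose get_prompts() and update_prompts() on most higher-level modules (query engines, response synthesizers), so a custom PromptTemplate can be swapped in at runtime. Below is a minimal sketch, assuming the prompt in question belongs to a default query engine; the key "response_synthesizer:text_qa_template" is the usual one there, but print get_prompts().keys() to confirm the exact key for your module (a tree_summarize synthesizer, for instance, uses a summary template rather than a QA template).
Python
        from llama_index.core import Document, PromptTemplate, VectorStoreIndex

        # build a small index just to obtain a query engine whose prompts we can inspect
        index = VectorStoreIndex.from_documents([Document(text="Sample text.")])
        query_engine = index.as_query_engine()

        # list the prompt keys this engine currently uses
        print(query_engine.get_prompts().keys())

        # replacement template; {context_str} and {query_str} are the
        # variables the default QA template expects
        custom_qa_tmpl = PromptTemplate(
            "Context information is below.\n"
            "---------------------\n"
            "{context_str}\n"
            "---------------------\n"
            "Answer the query using only the context above.\n"
            "Query: {query_str}\n"
            "Answer: "
        )

        # swap in the custom prompt without touching library source
        query_engine.update_prompts(
            {"response_synthesizer:text_qa_template": custom_qa_tmpl}
        )

The same get_prompts()/update_prompts() pattern works directly on a response synthesizer as well; only the key names change.
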
How to use a "chat_engine" with "NodeWithScore"?
I can pass a list of NodeWithScore objects when calling "response_synthesizer.synthesize", but how do I do the same with a "chat engine"? Any pointer would be greatly appreciated 🙏😁
Python
        from llama_index.core.schema import NodeWithScore
        from llama_index.core import get_response_synthesizer

        response_synthesizer = get_response_synthesizer(
            response_mode="tree_summarize"
        )

        # kwargs is supplied by the surrounding handler: "query" holds the
        # user's question, "input_documents" is a list of NodeWithScore
        user_question = kwargs["query"]

        # get a response that uses the nodes as context
        response = response_synthesizer.synthesize(
            f"{user_question}, use only text in context, context is a person's life description",
            nodes=kwargs["input_documents"]
        )

But how do I use those same nodes, kwargs["input_documents"], with an instance of a "chat engine"? That's where I'm stuck 🤔
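One pattern that bridges the gap, sketched under the assumption that kwargs["input_documents"] is the same list of NodeWithScore used above: wrap the preselected nodes in a tiny custom retriever and hand it to a ContextChatEngine, which inserts whatever the retriever returns into the chat context on every turn. StaticNodeRetriever below is a hypothetical name, not a library class.
Python
        from typing import List

        from llama_index.core.chat_engine import ContextChatEngine
        from llama_index.core.retrievers import BaseRetriever
        from llama_index.core.schema import NodeWithScore, QueryBundle


        class StaticNodeRetriever(BaseRetriever):
            """Retriever that ignores the query and returns a fixed node list."""

            def __init__(self, nodes: List[NodeWithScore]):
                self._nodes = nodes
                super().__init__()

            def _retrieve(self, query_bundle: QueryBundle) -> List[NodeWithScore]:
                # the nodes were already selected upstream, so just return them
                return self._nodes


        # same nodes that were passed to response_synthesizer.synthesize above
        retriever = StaticNodeRetriever(kwargs["input_documents"])

        chat_engine = ContextChatEngine.from_defaults(
            retriever=retriever,
            system_prompt=(
                "Use only the text in the context; the context is a "
                "person's life description."
            ),
        )

        response = chat_engine.chat(kwargs["query"])
        print(response)

Because the chat engine re-sends the retrieved nodes on every turn, the node list plus the chat history has to stay within the model's context window.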