
Updated 2 years ago

Hi guys! How do I return sources from the index when using ConversationBufferMemory & Tool? The run method on the agent_chain returns only the result of the query, but no sources / nodes.
8 comments
You are using LlamaIndex as a tool in LangChain, right? How did you set up your tools?
I defined it like this:

Plain Text
tools = [
    Tool(
        name="GPT Index",
        func=lambda q: str(index.as_query_engine().query(q)),
        description="useful for when you want to answer questions about the author. The input to this tool should be a complete english sentence.",
        return_direct=True,
    ),
]

memory = ConversationBufferMemory(memory_key="chat_history")
llm = ChatOpenAI(temperature=0)
agent_chain = initialize_agent(tools, llm, agent="conversational-react-description", memory=memory)
Cool! So instead of a lambda, you could use a wrapper function. That way you can get access to the source nodes:

Plain Text
def query_with_sources(query_str, query_engine):
    response = query_engine.query(query_str)
    print(response.source_nodes)
    return str(response)

...
func=lambda q: query_with_sources(q, query_engine=index.as_query_engine()),
...
Just a very basic example
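A slightly fuller sketch of the same idea, which appends source snippets to the returned string instead of just printing them, so the agent's final answer carries its citations. The `FakeNode` / `FakeResponse` / `FakeEngine` classes below are hypothetical stand-ins for a real query engine (in LlamaIndex, `response.source_nodes` holds the retrieved nodes); treat this as an illustration, not the library's API:

```python
class FakeNode:
    """Stand-in for a retrieved source node with text and a relevance score."""
    def __init__(self, text, score):
        self.text = text
        self.score = score

    def get_text(self):
        return self.text


class FakeResponse:
    """Stand-in for a query engine response: str() yields the answer,
    .source_nodes holds the nodes it was built from."""
    def __init__(self, answer, source_nodes):
        self.answer = answer
        self.source_nodes = source_nodes

    def __str__(self):
        return self.answer


def query_with_sources(query_str, query_engine):
    """Run the query and fold source snippets into the returned string."""
    response = query_engine.query(query_str)
    sources = "\n".join(
        f"- (score={node.score:.2f}) {node.get_text()[:80]}"
        for node in response.source_nodes
    )
    return f"{response}\n\nSources:\n{sources}"


class FakeEngine:
    """Stand-in engine returning a canned response for demonstration."""
    def query(self, query_str):
        return FakeResponse(
            "The author grew up in a small town.",
            [FakeNode("The author was raised in Springfield before moving away.", 0.87)],
        )


print(query_with_sources("Where did the author grow up?", FakeEngine()))
```

Because the tool above uses `return_direct=True`, whatever string this function returns is what the user sees, sources included.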
Oh perfect, this is much clearer. Thank you very much!
Hello @Logan M, what is the purpose of description in the Tool object? It seems to be very sensitive regarding the returned result.
So the LLM reads all the tool names and descriptions, and decides which tool to use (if any) based on that
So any change to the description will change how it picks tools.
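To make that concrete, here is a rough sketch of how react-style agent prompts typically stitch tool names and descriptions into the text the LLM sees (one "name: description" line per tool). The function and template here are hypothetical, modeled loosely on that pattern rather than copied from LangChain:

```python
# Hypothetical sketch: render the tool list the way react-style agent
# prompts commonly do, to show why the description text directly steers
# which tool the LLM picks.
def render_tool_section(tools):
    lines = [f"{name}: {description}" for name, description in tools]
    return "You have access to the following tools:\n" + "\n".join(lines)


tools = [
    ("GPT Index", "useful for answering questions about the author."),
    ("Calculator", "useful for math questions."),
]
print(render_tool_section(tools))
```

Since the LLM only ever sees these rendered lines, rewording a description effectively rewires the agent's routing logic without touching any code.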