
Updated 2 years ago

Langchain

At a glance
like, instead of using the query_engine.query() method call
10 comments
Yes there is an option for this! Let me find the notebook
Wait, are you using create_llama_agent, or just using llama index as a custom tool?
Whichever way you do things changes my answer lol
I believe the latter (in https://gpt-index.readthedocs.io/en/latest/guides/tutorials/building_a_chatbot.html, the "Setting up the Tools + Langchain Chatbot Agent" part)
If I use llama index as a custom tool, do I then lose the flexibility to retrieve the sources for each query?
I just looked into create_llama_agent: https://github.com/jerryjliu/llama_index/blob/main/examples/chatbot/Chatbot_SEC.ipynb is this notebook a good reference? (once we get the response, we can do response.get_formatted_sources())
Yea you'll notice in that notebook there's a return_sources kwarg

Then the response from the agent is a json you can parse
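Parsing that JSON output is plain `json.loads`. A minimal sketch, assuming the payload has `"answer"` and `"sources"` keys; the exact key names returned with `return_sources` are an assumption here, not confirmed API, so check the actual agent output first:

```python
import json

# Hypothetical agent tool output with return_sources enabled;
# the "answer"/"sources" key names are assumptions for illustration.
raw = '{"answer": "Uber reported ...", "sources": [{"doc_id": "uber_10k", "text": "..."}]}'

parsed = json.loads(raw)
answer = parsed["answer"]    # the text the agent should see
sources = parsed["sources"]  # the source metadata you can surface separately
```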

The other option is using llama index as a custom tool in langchain without our wrappers

In that case, in the func of each tool, you can use a wrapper function around query that still calls query, but then also checks response.source_nodes to do whatever you want with that 💪
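The wrapper idea above can be sketched like this. `query_engine.query()` and `response.source_nodes` follow LlamaIndex's response API; the factory name and the `on_sources` callback are made up for illustration:

```python
# Sketch: wrap a LlamaIndex query engine so every call also hands you
# the source nodes before returning the answer text to the agent.
# (make_query_with_sources / on_sources are hypothetical names.)
def make_query_with_sources(query_engine, on_sources=None):
    def query_with_sources(query_str):
        response = query_engine.query(query_str)
        sources = getattr(response, "source_nodes", [])
        if on_sources is not None:
            on_sources(sources)  # e.g. log them, or stash them for display
        return str(response)     # the agent only needs the answer text
    return query_with_sources
```

You'd then pass `query_with_sources` as the tool's `func` instead of calling `query_engine.query` directly.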
would you mind elaborating on the second case? I got a little confused about where to put the wrapper function, sorry 😅
No worries!

Check out this notebook https://github.com/jerryjliu/llama_index/blob/main/examples/langchain_demo/LangchainDemo.ipynb

Instead of the lambda for a func, you can do a function that calls query, as well as doing something with the source nodes 💪
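Concretely, swapping the notebook's lambda for a named function might look like this. Only `query` and `source_nodes` come from LlamaIndex; the function name, the `seen_sources` list, and the Tool fields are hypothetical, and the LangChain import path varies by version, so it's shown as a comment:

```python
# Sketch: a named function to use as the LangChain tool's `func`
# instead of a bare lambda. Collecting into a module-level list is
# just one illustrative way to keep the sources around.
seen_sources = []

def llama_index_tool_func(query_str, query_engine):
    response = query_engine.query(query_str)
    seen_sources.extend(getattr(response, "source_nodes", []))
    return str(response)

# Then, roughly (import path depends on your LangChain version):
# from langchain.agents import Tool
# tool = Tool(
#     name="LlamaIndex",
#     func=lambda q: llama_index_tool_func(q, query_engine),
#     description="Useful for answering questions about the indexed docs.",
# )
```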
this is actually awesome! Tysm!