
Updated 4 months ago

Chat engine

Hello! Like many others, I'm trying to build a "chatbot" to query a local data source, but I'm a little lost as to which approach would be best with LlamaIndex. I have tried the retriever and citation query engines with text-davinci-003, and they work quite well, except that I don't get the chat experience since they don't remember the conversation history. So I tried using a chat engine (index.as_chat_engine), but then I don't get any sources. Does anyone have any suggestions?
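For context on why the condense-question mode discussed below returns sources every turn: it rewrites the latest user message into a standalone question using the chat history, then runs that question through the retriever. Here is a toy, self-contained sketch of that loop; the condenser and keyword retriever are trivial stand-ins, not the LlamaIndex implementation.

```python
# Toy sketch of a "condense question" chat loop: each turn, the latest
# message is folded together with the history into a standalone question,
# that question is run through a retriever, and the retrieved nodes are
# attached to the response -- which is why sources appear on every turn.
# The condenser and retriever here are trivial stand-ins, not LlamaIndex code.

class Response:
    def __init__(self, text, source_nodes):
        self.text = text
        self.source_nodes = source_nodes  # nodes retrieved for this turn

class ToyCondenseQuestionChatEngine:
    def __init__(self, documents):
        self.documents = documents        # stand-in for an index
        self.history = []                 # list of (role, message) tuples

    def _condense(self, message):
        # Stand-in for the LLM call that folds history into one question.
        context = " ".join(m for _, m in self.history)
        return (context + " " + message).strip()

    def _retrieve(self, question):
        # Stand-in keyword retriever: docs sharing a word with the question.
        words = set(question.lower().split())
        return [d for d in self.documents if words & set(d.lower().split())]

    def chat(self, message):
        question = self._condense(message)
        nodes = self._retrieve(question)  # fresh retrieval every turn
        answer = f"Based on {len(nodes)} source(s): {question}"
        self.history.append(("user", message))
        self.history.append(("assistant", answer))
        return Response(answer, nodes)

engine = ToyCondenseQuestionChatEngine(
    ["llama index supports chat", "bananas are yellow"]
)
r1 = engine.chat("what does llama index support?")
r2 = engine.chat("tell me more")  # history keeps the condensed question retrievable
```

In the actual library, the replies below use `index.as_chat_engine(chat_mode='condense_question')` and read `response.source_nodes`; the sketch above only illustrates the mechanism behind that behavior.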
12 comments
I'm having the same problem. I feel like it's either I get a new node every time or not at all.
Have you tried "react" mode?
When you say source, do you mean the source text used to generate the final response? @Senna @potami
By getting a new source, I mean retrieving new nodes instead of just reusing the response from the previous chat.
@Senna Yes, actually with chat_mode='condense_question' I do get content in response.source_nodes. When I use chat_mode='react' the response.source_nodes is just an empty list. However, the synthesized answer is better when I use react chat mode.
@WhiteFang_Jr I do get relevant context retrieved from my index, and the synthesized answer is fine, but response.source_nodes is an empty list.
You could try not passing the chat history when creating the default engine instance. And if the conversations belong to different users and you don't want one user's history to affect another's, you'll have to create a new engine instance per user.

I may be understanding your query wrong. lol
If so please tell me again πŸ˜…
Last I checked the code, LlamaIndex uses LangChain under the hood for react mode. Maybe that's why you are not getting anything in there.
I have not checked the React mode. Will have to look into that
@WhiteFang_Jr I think you may be right that the issue lies in the LangChain agent integration. Maybe the source nodes get lost somewhere along the way.
Hopefully soon we can replace the react mode with our own agent implementation πŸ™ Definitely in the pipeline
@Logan M That's great to hear, looking forward to it πŸ™‚
Something to try then, will look into react mode