
Which components are best for building a RAG chat that handles short follow-up inquiries after the first answer has been given?
I am currently using ContextChatEngine, which works fine for the first interaction. But if a user poses a follow-up question that is short, it is more related to the previous interaction than to the content in the vector store, so retrieval goes wrong.

To make this concrete:
  1. Vector store content: [" ... Granny Smith Apples ... ", " ... Green Frog ... ", ...]
  2. Question: What colors can apples have?
  3. Answer by ContextChatEngine: Apples can be red, green (like Granny Smith) or yellow.
  4. Question: Give me some more green varieties?
  5. Answer by ContextChatEngine: ... Green Frog ... <= The short follow-up is most similar to the "Green Frog" entry, so the answer ends up being about frogs.
You probably want an agent or condense question chat engine for that
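The condense-question approach suggested above rewrites a short follow-up into a standalone question (using the chat history) before it is sent to the retriever, so "more green varieties" no longer matches "Green Frog". In LlamaIndex this is roughly `index.as_chat_engine(chat_mode="condense_question")`; the snippet below is only a dependency-free toy of the mechanism, with `stub_llm` standing in for a real LLM call.

```python
def condense_question(history, follow_up, rewrite_llm):
    """Rewrite a short follow-up into a standalone question using chat history."""
    prompt = (
        "Given the conversation below, rewrite the last question so it "
        "can be understood without the history.\n\n"
        + "\n".join(f"{role}: {text}" for role, text in history)
        + f"\nuser: {follow_up}\nStandalone question:"
    )
    # The rewritten question, not the raw follow-up, is what gets embedded
    # and matched against the vector store.
    return rewrite_llm(prompt)

def stub_llm(prompt):
    # Stand-in for an actual LLM; a real one would paraphrase the history.
    if "green varieties" in prompt:
        return "What are some more green apple varieties?"
    return prompt

history = [
    ("user", "What colors can apples have?"),
    ("assistant", "Apples can be red, green (like Granny Smith) or yellow."),
]
standalone = condense_question(history, "Give me some more green varieties?", stub_llm)
print(standalone)  # the condensed question now mentions apples explicitly
```

With the condensed question, a similarity search lands on the Granny Smith entry instead of the frog entry.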
@Logan M Thanks a lot. I was already looking into the Condense Question Chat Engine, but wasn't sure if it would help here.

Do you have particular agents in mind?
I think just an openai agent with a query engine tool or two would be fine

https://docs.llamaindex.ai/en/stable/examples/agent/openai_agent_with_query_engine.html
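In the linked example, an OpenAIAgent is given one or more QueryEngineTool instances, and the LLM itself decides which tool to call and formulates a self-contained query for it. The snippet below is not the LlamaIndex API, just a minimal sketch of that routing idea, with a keyword-overlap scorer standing in for the LLM's tool-selection step.

```python
def pick_tool(tools, question):
    """Pick the tool whose description overlaps most with the question.

    A real agent delegates this decision to the LLM via function calling;
    the word-overlap score here is only an illustrative stand-in.
    """
    words = set(question.lower().split())
    def score(tool):
        return len(words & set(tool["description"].lower().split()))
    return max(tools, key=score)

# Hypothetical tools, mirroring QueryEngineTool's name/description fields.
tools = [
    {"name": "fruit_docs", "description": "facts about apples and other fruit varieties"},
    {"name": "animal_docs", "description": "facts about frogs and other animals"},
]

chosen = pick_tool(tools, "more green apple varieties")
print(chosen["name"])  # → fruit_docs
```

Because the agent composes the tool query from the whole conversation, a short follow-up is routed to the right query engine instead of being embedded verbatim.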
Okay nice! I will give it a try! Thanks a lot for all your contributions and effort you put into this project πŸ‘