Updated 3 months ago

Hello, I'm using a query engine to fetch data from Qdrant and then generate a response. I've found that the user input is sometimes very poor, so I try to improve it by asking the LLM for a list of semantic keywords for the user's input. That works well as such. Now I wonder: how can I use the semantic keywords as search parameters while still using the user's input as the query? Or is there a way to prepare the context myself and then do the LLM response in a second step?
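The two-step approach is certainly possible. A minimal plain-Python sketch of the pattern (the names `expand_keywords`, `vector_search`, and `build_prompt` are hypothetical placeholders, not LlamaIndex or Qdrant APIs): search with the keyword-enriched terms, but hand the retrieved chunks plus the *original* user question to the LLM in a second step.

```python
def expand_keywords(user_input: str) -> list[str]:
    # Placeholder: in practice you would ask the LLM for semantic
    # keywords here instead of a crude word-length filter.
    return [w.strip("?!.,") for w in user_input.lower().split() if len(w) > 3]

def vector_search(query_terms: list[str], corpus: list[str], top_k: int = 2) -> list[str]:
    # Placeholder for a Qdrant similarity search: rank documents by
    # how many of the expanded query terms they contain.
    scored = sorted(corpus, key=lambda doc: -sum(t in doc.lower() for t in query_terms))
    return scored[:top_k]

def build_prompt(user_input: str, context_chunks: list[str]) -> str:
    # Second step: the context was prepared by us, but the question
    # sent to the LLM is the user's original wording.
    context = "\n---\n".join(context_chunks)
    return f"Context:\n{context}\n\nQuestion: {user_input}\nAnswer:"

corpus = [
    "Qdrant stores vectors and supports payload filtering.",
    "LlamaIndex query engines combine retrieval and synthesis.",
    "Bananas are yellow.",
]
user_input = "how do I filter vectors in qdrant?"
keywords = expand_keywords(user_input)          # search parameters
chunks = vector_search(keywords, corpus)        # context prepared ourselves
prompt = build_prompt(user_input, chunks)       # original input as the query
# `prompt` would then go to the LLM as the second step.
```

In LlamaIndex terms, the same split can be done by retrieving nodes with a retriever first and then passing them to a response synthesizer together with the original question, rather than letting a single query engine do both with one string.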
2 comments
Check the chat_engine with condense + context mode.
It first prepares an updated question based on the user conversation and then queries the index.

https://docs.llamaindex.ai/en/stable/examples/chat_engine/chat_engine_condense_plus_context/
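For intuition, the "condense" step that chat engine performs can be illustrated in a few lines: a prompt template folds the chat history and the latest user message into one standalone question, which is then used to query the index. This is a simplified sketch of the idea, not LlamaIndex's actual implementation:

```python
# Simplified sketch of the "condense" step: combine chat history plus
# the latest user message into one standalone question for the index.
CONDENSE_TEMPLATE = (
    "Given the following conversation and a follow-up question, "
    "rephrase the follow-up into a standalone question.\n\n"
    "Chat history:\n{history}\n\nFollow-up: {question}\nStandalone question:"
)

def condense_prompt(history: list[tuple[str, str]], question: str) -> str:
    lines = [f"{role}: {msg}" for role, msg in history]
    return CONDENSE_TEMPLATE.format(history="\n".join(lines), question=question)

prompt = condense_prompt(
    [("user", "What vector store does the app use?"),
     ("assistant", "It uses Qdrant.")],
    "How do I filter results there?",
)
# An LLM call on `prompt` would produce a standalone question such as
# "How do I filter results in Qdrant?", which is then sent to the index.
```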
It did not do the trick; I now talk to the LLM directly, and that way I got it working. But thank you 🙂