ChatEngine

I'm using index.as_chat_engine(...). It doesn't only use data from my vector store; it also pulls in knowledge from elsewhere. Can I avoid that? I use chat_mode="condense_plus_context" and this prompt: "\nInstruction: Use the previous chat history, or the context above, to interact and help the user. Don't use any other informations." But it doesn't work.
condense_plus_context allows the bot to answer from your data as well as handle general interaction queries.

For your use case I think: https://docs.llamaindex.ai/en/stable/examples/chat_engine/chat_engine_condense_question.html

Should be good.

Rest you can check out all the chat modes here for more understanding: https://docs.llamaindex.ai/en/stable/module_guides/deploying/chat_engines/root.html#modules
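To make the switch concrete, here's a minimal sketch of using the condense_question mode, assuming you already have a built index (the `index` variable and the helper function name are illustrative, not from the original thread). In this mode every user message is condensed into a standalone question and answered through the query engine, so responses stay grounded in the vector store:

```python
def build_condense_question_engine(index):
    """Sketch: `index` is assumed to be an already-built VectorStoreIndex.

    condense_question rewrites each chat message into a standalone query
    and answers it via the index's query engine, keeping answers grounded
    in your own data.
    """
    return index.as_chat_engine(
        chat_mode="condense_question",
        verbose=True,  # log the condensed question for debugging
    )
```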
But with chat_mode="context" it works better.
You can also modify the prompt for the condense_plus_context chat engine.
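A sketch of overriding that prompt, assuming the engine's `context_prompt` keyword (which formats retrieved nodes via the `{context_str}` placeholder) and an already-built `index`; the prompt wording and helper name here are illustrative:

```python
# Template the engine fills with retrieved text via {context_str};
# the instruction tries to restrict answers to that context only.
CONTEXT_ONLY_PROMPT = (
    "Context from the user's documents:\n"
    "{context_str}\n"
    "Instruction: Use ONLY the chat history and the context above to answer. "
    "If the answer is not in the context, say you don't know."
)

def build_chat_engine(index):
    # `index` is assumed to be an already-built VectorStoreIndex.
    return index.as_chat_engine(
        chat_mode="condense_plus_context",
        context_prompt=CONTEXT_ONLY_PROMPT,
    )
```

Stricter "answer only from context" wording like this tends to reduce, though not fully eliminate, answers drawn from the model's general knowledge.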