
ChatEngine

At a glance

The community member is using index.as_chat_engine(...) and finds that the chat engine draws on data from sources other than their vector store. They are using chat_mode="condense_plus_context" with a prompt instructing the model to rely only on the chat history and retrieved context, but it is not working as expected.

The comments explain that the condense_plus_context chat mode lets the bot answer from the indexed data as well as handle general conversational queries. A community member recommends the documentation for the condense_question chat engine and notes that the prompt for the condense_plus_context chat engine can be modified.

There is no explicitly marked answer in the comments, but the community members provide suggestions and links to the relevant documentation to help the original poster resolve the issue.

I'm using index.as_chat_engine(...). It does not only use data from my vector store; it also uses data from elsewhere. Can I avoid that? I use chat_mode="condense_plus_context" and this prompt: "\nInstruction: Use the previous chat history, or the context above, to interact and help the user. Don't use any other information." But it is not correct.
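
For reference, a minimal sketch of the setup being described (a hedged reconstruction, not the original poster's exact code: the ./data path and document loading are placeholders, and the imports assume a recent llama_index.core package layout):

```python
# Hypothetical reconstruction of the setup from the question.
from llama_index.core import SimpleDirectoryReader, VectorStoreIndex

# Placeholder data loading; the original vector store setup is not shown.
documents = SimpleDirectoryReader("./data").load_data()
index = VectorStoreIndex.from_documents(documents)

# condense_plus_context condenses the chat history into a standalone
# question, retrieves context for it, and answers with both available.
chat_engine = index.as_chat_engine(
    chat_mode="condense_plus_context",
    system_prompt=(
        "Use the previous chat history, or the context above, to interact "
        "and help the user. Don't use any other information."
    ),
)
print(chat_engine.chat("What does my data say about this topic?"))
```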
5 comments
condense_plus_context allows the bot to answer from your data as well as handle general interaction queries.

For your use case I think: https://docs.llamaindex.ai/en/stable/examples/chat_engine/chat_engine_condense_question.html

Should be good.
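
A hedged sketch of that suggestion, reusing the index object from above: condense_question rewrites each message into a standalone question and answers it through the index's query engine, so responses stay grounded in the vector store.

```python
# Every message is condensed into a standalone question and sent to the
# query engine, so answers are always synthesized from retrieved context.
chat_engine = index.as_chat_engine(chat_mode="condense_question")
response = chat_engine.chat("Can you summarize what my documents cover?")
print(response)
```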

Rest you can check out all the chat modes here for more understanding: https://docs.llamaindex.ai/en/stable/module_guides/deploying/chat_engines/root.html#modules
But with "context" mode it is better
You can modify the prompt for the condense_plus_context chat engine
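
A hedged sketch of that approach, constructing CondensePlusContextChatEngine (the class behind chat_mode="condense_plus_context") directly: the template wording here is illustrative, but context_prompt must include the {context_str} placeholder.

```python
from llama_index.core.chat_engine import CondensePlusContextChatEngine

# Illustrative template; a stricter instruction can discourage the LLM
# from answering outside the retrieved context.
context_prompt = (
    "The following is a conversation between a user and an AI assistant.\n"
    "Here are the relevant documents for the context:\n"
    "{context_str}\n"
    "Instruction: Answer ONLY from the context above or the chat history. "
    "If the answer is not there, say you don't know."
)

chat_engine = CondensePlusContextChatEngine.from_defaults(
    retriever=index.as_retriever(),
    context_prompt=context_prompt,
)
```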