The community member is using index.as_chat_engine(...) and finds that the chat engine draws on data from sources other than their vector store. They use chat_mode="condense_plus_context" with a custom prompt, but it is not working as expected.
The comments explain that the condense_plus_context chat mode lets the bot answer from the retrieved context as well as handle general conversational queries. A community member recommends checking the documentation for the condense_plus_context chat engine and adjusting the prompt to address the issue.
There is no explicitly marked answer in the comments, but the community members provide suggestions and references to the relevant documentation to help the original poster resolve their issue.
I'm using index.as_chat_engine(...). It does not only use data from my vector store but also uses data from elsewhere. Can I avoid that? I use chat_mode="condense_plus_context" and this prompt: "Instruction: Use the previous chat history, or the context above, to interact and help the user. Don't use any other information." But it is not working correctly.
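A minimal sketch of the suggested direction, assuming a LlamaIndex VectorStoreIndex built from local documents; the "./data" path and the exact wording of the context prompt are illustrative, and the context_prompt keyword is assumed to be forwarded by as_chat_engine to the condense_plus_context chat engine:

```python
from llama_index.core import SimpleDirectoryReader, VectorStoreIndex

# Build (or load) the index backing the chat engine.
documents = SimpleDirectoryReader("./data").load_data()
index = VectorStoreIndex.from_documents(documents)

# Custom context prompt; {context_str} is replaced with the retrieved nodes.
context_prompt = (
    "Here are the relevant documents for the context:\n"
    "{context_str}\n"
    "Instruction: Answer ONLY from the context above or the previous chat history. "
    "If the answer is not contained there, say that you do not know. "
    "Do not use any other knowledge."
)

chat_engine = index.as_chat_engine(
    chat_mode="condense_plus_context",
    context_prompt=context_prompt,
)

response = chat_engine.chat("What does the ingested data say about X?")
print(response)
```

Note that prompting alone only discourages the LLM from using outside knowledge; stricter wording such as the above, plus instructing it to refuse when the answer is not in the retrieved context, is the lever the comments point to.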