Updated last year

I am using
```python
chat_engine = index.as_chat_engine(
    chat_mode="context",
    system_prompt=(
        "You are a chatbot with access to extra context information about dogs. "
        "Only answer questions if you can find the answer in the context; "
        "otherwise, kindly inform the user that you cannot find the answer."
    ),
    chat_history=custom_chat_history,
)
```
All is working well and the LLM only answers questions about "dogs". My issue is that when I introduce the recent chat history, if someone asks a question about "cats", the LLM properly refuses the question on the first try, but any follow-up question mentioning cats will be answered. Any recommendations for handling this situation?
11 comments
@Logan M I could use another bail out on this one πŸ˜‰ My guess is that context mode is counting the chat history as part of the context?
Yea, the chat history is part of the context. But isn't that the main use case for using a chat engine?
I want the chat to have context of what we're discussing, so if there's a follow-up question like "Where do THEY live?" or "Who is SHE?", the chat can answer with that knowledge.
My problem is that once the topic is introduced into the chat history, the system prompt's instruction to not answer questions about X gets ignored.
Maybe the system prompt just needs more verbose instructions? πŸ˜…
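Along the lines of that suggestion, a more explicit system prompt might look like the sketch below. The exact wording is an assumption on my part, not from the thread; the key idea is to explicitly tell the model that prior messages are not a source of answers:

```python
# Hypothetical, more verbose system prompt: it explicitly separates the
# retrieved context (the allowed knowledge source) from the chat history
# (conversational continuity only), so off-topic subjects that leak into
# the history are still refused.
system_prompt = (
    "You are a chatbot with access to extra context information about dogs. "
    "Only answer a question if the answer can be found in the retrieved "
    "context. Prior chat messages exist for conversational continuity only; "
    "they are NOT a source of answers. If a question is about anything other "
    "than dogs, kindly say you cannot find the answer, even if that topic "
    "was discussed earlier in the conversation."
)

print(system_prompt)
```

Whether this fully fixes the leak depends on the model; smaller models tend to weigh recent chat turns heavily regardless of instructions.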
I'm trying I promise! πŸ˜‰
Under the hood, is there a way I can reference "context" vs. "chat history" in a way that context mode can interpret?
Hmm... I think chat history could be referenced as "previous messages"? And context is already directly called "context" under the hood 👀
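Building on that: in llama_index, context mode builds its system message from a context template containing a `{context_str}` placeholder, and my understanding is that `as_chat_engine` forwards a `context_template` keyword through to `ContextChatEngine` (worth verifying against your installed version). A sketch of a custom template that names the two sources explicitly:

```python
# Hypothetical custom context template. ContextChatEngine substitutes the
# retrieved node text for {context_str}; the surrounding wording is an
# assumption, chosen to distinguish retrieved context from chat history.
context_template = (
    "Context information from documents is below.\n"
    "---------------------\n"
    "{context_str}\n"
    "---------------------\n"
    "Answer ONLY from this context. Treat previous messages as conversation "
    "history, not as a source of facts.\n"
)

# The call itself needs llama_index, so it is shown here as a comment:
# chat_engine = index.as_chat_engine(
#     chat_mode="context",
#     context_template=context_template,
#     chat_history=custom_chat_history,
# )
```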