Updated 3 months ago

Hey team
I'm using the chat engine in context mode to do Q&A over my context. Even if I specify that the model should use only the context, it still answers some general questions and adds extra information, not available in the context, to answers about specific questions.
Is there a way to use only the context? I've tried different prompts.
Did you try providing the system_prompt?

from llama_index.memory import ChatMemoryBuffer

memory = ChatMemoryBuffer.from_defaults(token_limit=1500)

chat_engine = index.as_chat_engine(
    chat_mode="context",
    memory=memory,
    system_prompt=(
        "You are a chatbot, able to have normal interactions, as well as talk"
        " about an essay discussing Paul Graham's life."
    ),
)
Yep, tried it
Instruction should be clear. Can you share the instruction if possible?
Sure. Some examples:
Answer the questions based on the context below, and if the question can't be answered based on the context, say "I don't know" .

Don't answer the question if the context isn't helpful.
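For reference, instructions like these can be combined into one system_prompt string and passed to as_chat_engine as in the snippet above. A plain-Python sketch (the helper name build_strict_system_prompt is made up for illustration; the rule wording is from the messages in this thread):

```python
def build_strict_system_prompt(extra_rules=None):
    """Compose a context-only system prompt from the instructions tried above."""
    rules = [
        "Answer the questions based on the context below, and if the question "
        "can't be answered based on the context, say \"I don't know\".",
        "Don't answer the question if the context isn't helpful.",
    ]
    if extra_rules:
        rules.extend(extra_rules)  # e.g. extra guardrails you want to experiment with
    return "\n".join(rules)

# the resulting string would be passed as system_prompt=... to index.as_chat_engine(...)
prompt = build_strict_system_prompt(["Quote the relevant passage when you answer."])
```

Note that even with a strict prompt like this, context mode does not hard-enforce grounding; the model can still fall back on general knowledge, which matches the behavior described above.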
You are using GPT, right, not an open-source model?
Yes, tried both 3.5 and 4
Just read this in the context chat mode docs:

This approach is simple, and works for questions directly related to the knowledge base and general interactions.

So the mode allows general interaction as well.

I guess you could give the openai chat mode and the condense_question chat mode a try