Hi there, is there a way to set some sort of strictness for my context mode chat engine? i.e. how strict the engine is about returning only information that is in the context?
4 comments
What chat_mode are you trying?
Context mode works on the knowledge base plus the external conversation as well. So you could try a chat engine mode that does not draw on general conversation and replies solely based on the given context, or, if you want to stay with context mode, try modifying the system_prompt: https://docs.llamaindex.ai/en/stable/examples/chat_engine/chat_engine_context.html
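A minimal sketch of what a stricter system_prompt could look like. The `as_chat_engine(chat_mode="context", system_prompt=...)` call is shown as a comment and assumes you already have a LlamaIndex index object; the prompt-assembly helper below is purely illustrative, not part of the LlamaIndex API:

```python
# A restrictive system prompt that tells the model to answer only
# from retrieved context and to admit when the context is insufficient.
STRICT_SYSTEM_PROMPT = (
    "You are an assistant that answers ONLY from the provided context. "
    "If the context does not contain the answer, say you don't know. "
    "Do not use outside knowledge."
)

# Hypothetical wiring with LlamaIndex (assumes an existing `index`):
# chat_engine = index.as_chat_engine(
#     chat_mode="context",
#     system_prompt=STRICT_SYSTEM_PROMPT,
# )

def build_context_prompt(context: str, question: str) -> str:
    """Illustrative helper: combine the strict instructions,
    the retrieved context, and the user question into one prompt."""
    return f"{STRICT_SYSTEM_PROMPT}\n\nContext:\n{context}\n\nQuestion: {question}"
```

How strictly the model actually follows this depends on the underlying LLM; the prompt only biases it toward context-grounded answers.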
Okay, but for example, I see there's a faithfulness evaluator: https://docs.llamaindex.ai/en/stable/examples/evaluation/faithfulness_eval.html Is there an option to set that up with the context chat engine?
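One way to combine the two is to run the evaluator after each chat turn rather than inside the engine. The real evaluator lives in `llama_index.core.evaluation` and needs an LLM, so that wiring is shown only as comments (an assumption, not tested here); the toy word-overlap check below merely stands in for the idea of a faithfulness score:

```python
def toy_faithfulness(answer: str, context: str, threshold: float = 0.5) -> bool:
    """Crude stand-in for a faithfulness check: the fraction of answer
    words that also appear in the retrieved context."""
    answer_words = {w.lower().strip(".,") for w in answer.split()}
    context_words = {w.lower().strip(".,") for w in context.split()}
    if not answer_words:
        return True
    overlap = len(answer_words & context_words) / len(answer_words)
    return overlap >= threshold

# Hypothetical post-hoc wiring with LlamaIndex (assumes `llm` and
# `chat_engine` already exist; not executed here):
# from llama_index.core.evaluation import FaithfulnessEvaluator
# evaluator = FaithfulnessEvaluator(llm=llm)
# response = chat_engine.chat("What does the document say about X?")
# result = evaluator.evaluate_response(response=response)
# if not result.passing:
#     answer = "I don't know based on the provided context."
```

The design point is that the chat engine and the evaluator stay decoupled: the engine generates, the evaluator gates the answer afterwards.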