Hi there, is there a way to set some sort of strictness for my context mode chat engine? i.e. how strict the engine is about returning information that is in the context or not?
Context mode works on the knowledge base plus general conversation, so answers aren't restricted to the retrieved context. You could try a chat engine that doesn't fall back to general conversation and replies solely from the given context, or, if you want to stay with context mode, try tightening the system_prompt: https://docs.llamaindex.ai/en/stable/examples/chat_engine/chat_engine_context.html
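As a rough sketch of the second option: you can pass a restrictive `system_prompt` when building the context chat engine. The exact wording of the prompt below is just an illustration, and the import path assumes a recent `llama-index` (older versions import from `llama_index` instead of `llama_index.core`); the directory name is hypothetical.

```python
# A strict instruction telling the LLM to answer only from retrieved context.
# The wording here is illustrative; tune it for your use case.
STRICT_SYSTEM_PROMPT = (
    "You are an assistant that answers questions ONLY using the provided "
    "context. If the answer is not contained in the context, say "
    "\"I don't know based on the provided documents.\" Do not use outside "
    "knowledge."
)

def build_strict_chat_engine(docs_dir: str):
    """Build a context-mode chat engine with a strict system prompt.

    Assumes llama-index is installed and an LLM API key (e.g. OPENAI_API_KEY)
    is configured; 'docs_dir' is a hypothetical folder of your documents.
    """
    from llama_index.core import SimpleDirectoryReader, VectorStoreIndex

    documents = SimpleDirectoryReader(docs_dir).load_data()
    index = VectorStoreIndex.from_documents(documents)
    # chat_mode="context" selects the context chat engine from the linked docs;
    # system_prompt overrides its default instructions.
    return index.as_chat_engine(
        chat_mode="context",
        system_prompt=STRICT_SYSTEM_PROMPT,
    )
```

The engine will still use the LLM's judgment, so this reduces (rather than eliminates) out-of-context answers; for harder guarantees you'd post-check responses against the retrieved nodes.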