Hi there, I am testing a local LLM with Ollama, using the phi3:medium / phi3.5 model. Now I want to restrict the model to only respond from the local database. How should I achieve this?
You can do this by setting the prompts and then testing the behavior.
Thanks, is there any example I can refer to?
Try adding a context prompt to your chat engine or query engine.
Explicitly state in it that the model should only answer based on the provided context, like the sketch below.
@SumitPandit
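Here is a minimal sketch of that approach. It assumes the LlamaIndex framework (suggested by the chat engine / query engine terminology above), a placeholder `./data` folder of documents, and the `nomic-embed-text` embedding model pulled in Ollama; swap in your own paths and model names.

```python
from llama_index.core import Settings, SimpleDirectoryReader, VectorStoreIndex
from llama_index.embeddings.ollama import OllamaEmbedding
from llama_index.llms.ollama import Ollama

# Point LlamaIndex at local Ollama models (names are assumptions; use
# whatever you have pulled, e.g. phi3:medium or phi3.5).
Settings.llm = Ollama(model="phi3:medium", request_timeout=120.0)
Settings.embed_model = OllamaEmbedding(model_name="nomic-embed-text")

# Build an index over your local documents (./data is a placeholder path).
documents = SimpleDirectoryReader("./data").load_data()
index = VectorStoreIndex.from_documents(documents)

# "context" chat mode retrieves from the index on every turn; the system
# prompt explicitly restricts answers to that retrieved context.
chat_engine = index.as_chat_engine(
    chat_mode="context",
    system_prompt=(
        "Answer ONLY using the context provided. If the answer is not "
        "in the context, reply exactly: 'I don't know.'"
    ),
)

response = chat_engine.chat("What does our documentation say about X?")
print(response)
```

A query engine can be restricted the same way with a custom `text_qa_template`. Note that a system prompt reduces but does not fully eliminate off-context answers; smaller models like phi3 may still drift, so it is worth testing with a few questions the documents cannot answer.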