SumitPandit
3 months ago
Hi there, I am testing a local LLM with Ollama, where I am using the model phi3:medium/phi3.5. Now I want to restrict the model to only respond from the local database. How should I achieve this?
WhiteFang_Jr
3 months ago
You need to try this by setting prompts and then testing it out.
SumitPandit
3 months ago
thanks, is there any example I can refer to?
Jake
3 months ago
try adding a context prompt to your chat engine or query engine
Jake
3 months ago
Explicitly state in there that the model should only answer based on the context.
Jake
3 months ago
@SumitPandit
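For reference, a minimal sketch of what Jake is suggesting, assuming LlamaIndex with the llama-index-llms-ollama and llama-index-embeddings-ollama packages installed and an Ollama server running locally; the model names, the "data" directory, and the prompt wording are illustrative assumptions, not taken from the thread:

```python
from llama_index.core import (
    VectorStoreIndex,
    SimpleDirectoryReader,
    Settings,
    PromptTemplate,
)
from llama_index.llms.ollama import Ollama
from llama_index.embeddings.ollama import OllamaEmbedding

# Local LLM and embedding model served by Ollama (model names are examples).
Settings.llm = Ollama(model="phi3:medium", request_timeout=120.0)
Settings.embed_model = OllamaEmbedding(model_name="nomic-embed-text")

# Index the local documents that should be the only source of answers.
documents = SimpleDirectoryReader("data").load_data()
index = VectorStoreIndex.from_documents(documents)

# Option 1: chat engine with a system prompt that restricts answers to the
# retrieved context, as Jake describes.
chat_engine = index.as_chat_engine(
    chat_mode="context",
    system_prompt=(
        "Answer strictly from the provided context. "
        "If the context does not contain the answer, say you do not know."
    ),
)
print(chat_engine.chat("What does the local database say about X?"))

# Option 2: query engine with a custom text QA prompt carrying the same
# restriction.
qa_prompt = PromptTemplate(
    "Context information is below.\n"
    "---------------------\n"
    "{context_str}\n"
    "---------------------\n"
    "Using ONLY the context above, answer the query. If the answer is not "
    "in the context, reply that you do not know.\n"
    "Query: {query_str}\n"
    "Answer: "
)
query_engine = index.as_query_engine(text_qa_template=qa_prompt)
print(query_engine.query("What does the local database say about X?"))
```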
SumitPandit
3 months ago
thanks
SumitPandit
3 months ago
also, got one example:
https://medium.com/rahasak/build-rag-application-using-a-llm-running-on-local-computer-with-ollama-and-llamaindex-97703153db20