Lokesh
last year
Once a response is received from OpenAI, how can I differentiate between an answer drawn from the provided context and one that comes from the model's public knowledge?
6 comments
Teemu
last year
The prompt templates by default specify that the model should answer only based on the given context information. GPT-4 in particular does a good job of following that.
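For context, such a context-only instruction looks roughly like this — a minimal sketch in plain Python; the template wording and the `build_prompt` helper are illustrative, not LlamaIndex's actual default template or API:

```python
# Illustrative sketch of a "context-only" QA prompt, similar in spirit to
# LlamaIndex's default text QA template. The exact wording is an assumption.
CONTEXT_ONLY_QA_TEMPLATE = (
    "Context information is below.\n"
    "---------------------\n"
    "{context_str}\n"
    "---------------------\n"
    "Given the context information and not prior knowledge, answer the query. "
    "If the answer is not in the context, reply exactly: I don't know.\n"
    "Query: {query_str}\n"
    "Answer: "
)

def build_prompt(context_str: str, query_str: str) -> str:
    """Fill the template with retrieved context and the user's question."""
    return CONTEXT_ONLY_QA_TEMPLATE.format(
        context_str=context_str, query_str=query_str
    )

print(build_prompt("LlamaIndex indexes your documents.", "What does LlamaIndex do?"))
```

Because the instruction pins the model to the context and gives it an explicit "I don't know" escape hatch, an "I don't know" reply is a usable signal that the answer was not in your documents.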
Lokesh
last year
But if the answer is not in the indexed documents and I want an answer from public knowledge, what should be done?
Teemu
last year
You can also specify that in the prompt templates, or you can use a separate query engine for those questions.
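The "separate query engine" idea can be sketched as a simple router — plain Python, not the actual LlamaIndex `RouterQueryEngine` API; the engine callables, the retrieval-score function, and the 0.7 threshold are all invented for illustration:

```python
# Conceptual sketch: route a question to a context-based engine when
# retrieval looks relevant, otherwise fall back to a "public knowledge"
# engine. All names and the threshold value are assumptions.
from dataclasses import dataclass
from typing import Callable

@dataclass
class Routed:
    source: str   # "context" or "public"
    answer: str

def route_query(
    query: str,
    retrieve_score: Callable[[str], float],   # top retrieval similarity, 0..1
    context_engine: Callable[[str], str],     # answers from indexed docs
    public_engine: Callable[[str], str],      # answers from general knowledge
    threshold: float = 0.7,                   # assumed cutoff; tune per corpus
) -> Routed:
    """Answer from indexed docs when retrieval is confident, else go public."""
    if retrieve_score(query) >= threshold:
        return Routed("context", context_engine(query))
    return Routed("public", public_engine(query))

# Toy usage with stub engines:
result = route_query(
    "What is our refund policy?",
    retrieve_score=lambda q: 0.9,
    context_engine=lambda q: "30-day refunds (from docs).",
    public_engine=lambda q: "General answer.",
)
print(result.source, "->", result.answer)
```

A nice side effect of routing is that the `source` field answers the original question directly: you always know whether the reply came from your documents or from public knowledge.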
Lokesh
last year
Can you please point me to any examples for this use case, if you know of any?
Teemu
last year
You can use these:
https://docs.llamaindex.ai/en/stable/module_guides/models/prompts.html
Or build separate query engines:
https://docs.llamaindex.ai/en/stable/examples/query_engine/RouterQueryEngine.html
I think this is a ready-made implementation for this sort of use case, but it's for the chat engine:
https://docs.llamaindex.ai/en/stable/examples/chat_engine/chat_engine_react.html
Lokesh
last year
Thank u!