Is there a way with llama index of splitting out general background knowledge of the LLM versus the context you are feeding in?
aioverdrivefriend · 2 years ago
Is there a way with llama index of splitting out general background knowledge of the LLM versus the context you are feeding in?
3 comments
jerryjliu0 · 2 years ago
this might be a bit of prompt engineering on the query part. we already have outer instructions telling the LLM to ignore prior knowledge and only use the context
jerryjliu0 · 2 years ago
https://github.com/jerryjliu/gpt_index/blob/main/gpt_index/prompts/default_prompts.py
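[Editor's note] The linked default_prompts.py defines the default QA template, which already instructs the model to answer "given the context information and not prior knowledge." Below is a minimal sketch of overriding that template with a stronger separation instruction, assuming the gpt_index-era API (GPTSimpleVectorIndex, QuestionAnswerPrompt, and the text_qa_template keyword of index.query); the query string and the "data" directory are illustrative:

```python
# Sketch: override the default text QA prompt so the LLM answers only
# from the retrieved context, and declines rather than falling back on
# its background knowledge. Assumes the gpt_index-era API.
from gpt_index import GPTSimpleVectorIndex, SimpleDirectoryReader, QuestionAnswerPrompt

# Custom template modeled on DEFAULT_TEXT_QA_PROMPT_TMPL from
# default_prompts.py, with a stricter grounding instruction added.
CUSTOM_QA_TMPL = (
    "Context information is below.\n"
    "---------------------\n"
    "{context_str}\n"
    "---------------------\n"
    "Using ONLY the context above and not prior knowledge, answer the "
    "question. If the context does not contain the answer, say that it "
    "does not, instead of answering from memory: {query_str}\n"
)
qa_prompt = QuestionAnswerPrompt(CUSTOM_QA_TMPL)

# Build an index over local documents ("data" is a hypothetical directory).
documents = SimpleDirectoryReader("data").load_data()
index = GPTSimpleVectorIndex(documents)

# Pass the custom prompt at query time to replace the default QA template.
response = index.query(
    "What does the report conclude?",  # hypothetical query
    text_qa_template=qa_prompt,
)
print(response)
```

If you instead want the model's background knowledge surfaced alongside the grounded answer, you could invert the idea and ask for two labeled sections (one from the context, one from prior knowledge) in the template, though that is purely prompt engineering rather than anything the library enforces.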
aioverdrivefriend · 2 years ago
Thanks Jerry!