Hi, I see index.as_chat_engine now takes two prompt parameters: system and context
SeaCat
9 months ago
Hi, I see index.as_chat_engine now takes two prompt parameters: system and context. We have everything in one prompt; which one should be used? The documentation search is broken; it can't find any information on this method anymore.
WhiteFang_Jr
9 months ago
If you look here:
https://github.com/run-llama/llama_index/blob/107b37e878cf4ebb798b2fec0ad08439d0d717da/llama-index-core/llama_index/core/chat_engine/condense_plus_context.py#L213
The system prompt and the context prompt are combined to form the final system prompt, so even if you have everything in one prompt, it is fine.
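For reference, a minimal sketch (not from the thread) of passing both prompts to the condense-plus-context chat engine. The `./data` path and the prompt texts are placeholders, and the `system_prompt` / `context_prompt` parameter names are taken from the chat engine linked above:

```python
# Hypothetical example: both prompts passed to as_chat_engine.
# Internally (see the linked condense_plus_context.py) the two are
# concatenated into the final system prompt, so putting everything
# into system_prompt alone also works.
from llama_index.core import SimpleDirectoryReader, VectorStoreIndex

documents = SimpleDirectoryReader("./data").load_data()  # placeholder data dir
index = VectorStoreIndex.from_documents(documents)

chat_engine = index.as_chat_engine(
    chat_mode="condense_plus_context",
    system_prompt="You are a helpful assistant for our product documentation.",
    context_prompt=(
        "Here are the relevant documents:\n"
        "{context_str}\n"
        "Use them to answer the question below."
    ),
)

print(chat_engine.chat("How do I configure the service?"))
```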
SeaCat
9 months ago
Thank you!