Andrei
9 months ago
Hello, I have one question. If I'm using Claude 3 from AWS Bedrock, when configuring the prompts, do qa_prompt, refine_prompt, and chat_prompt all need to have "\n\nHuman: <prompt> \n\nAssistant:", or only the qa_prompt? Thanks
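For context, the "wrapped" style the question refers to would look roughly like this in LlamaIndex (the template text below is illustrative, not from the thread):

```python
# Illustrative sketch of a manually wrapped template -- the question is
# whether every template must carry these Human/Assistant markers.
from llama_index.core import PromptTemplate

wrapped_qa_prompt = PromptTemplate(
    "\n\nHuman: Context information is below.\n"
    "---------------------\n{context_str}\n---------------------\n"
    "Given the context, answer the query: {query_str}"
    "\n\nAssistant:"
)
```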
5 comments
Logan M
9 months ago
Only the prompt; that should get handled under the hood (I think)
Andrei
9 months ago
Do you mean only the qa_prompt, and that it's not needed for refine_prompt and chat_prompt?
https://github.com/run-llama/llama_index/blob/main/llama-index-integrations/llms/llama-index-llms-bedrock/llama_index/llms/bedrock/utils.py#L40
Logan M
9 months ago
That comment is out of date. We send message dicts to Anthropic.
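In other words, the older Claude text-completion format needed the manual wrapper, while the Messages API that newer integrations target takes structured role/content dicts. A rough comparison (both payloads are illustrative, not copied from llama_index):

```python
# Old text-completion request body for Claude on Bedrock -- the prompt
# had to be wrapped in Human/Assistant markers by the caller:
legacy_body = {
    "prompt": "\n\nHuman: What is AWS Bedrock?\n\nAssistant:",
    "max_tokens_to_sample": 512,
}

# Messages-style request body for Claude 3 on Bedrock -- roles are
# structured dicts, so no manual wrapping is needed:
messages_body = {
    "anthropic_version": "bedrock-2023-05-31",
    "max_tokens": 512,
    "messages": [
        {"role": "user", "content": "What is AWS Bedrock?"},
    ],
}
```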
Andrei
9 months ago
I don't understand. So we no longer need to add this "\n\nHuman: <prompt> \n\nAssistant:" for any prompt used with Anthropic?
Logan M
9 months ago
Nope
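So the practical takeaway, as a minimal sketch (assuming a local "data" directory of documents and Bedrock credentials already configured; the model ID and template text are illustrative): pass plain templates and let the integration build the Claude message dicts under the hood.

```python
from llama_index.core import PromptTemplate, SimpleDirectoryReader, VectorStoreIndex
from llama_index.llms.bedrock import Bedrock

llm = Bedrock(model="anthropic.claude-3-sonnet-20240229-v1:0")

# Plain templates -- no "\n\nHuman:" / "\n\nAssistant:" markers anywhere.
qa_prompt = PromptTemplate(
    "Context information is below.\n"
    "---------------------\n{context_str}\n---------------------\n"
    "Answer the query using only the context.\n"
    "Query: {query_str}\nAnswer: "
)
refine_prompt = PromptTemplate(
    "The original answer is: {existing_answer}\n"
    "We have new context below.\n{context_msg}\n"
    "Refine the original answer to the query: {query_str}\n"
)

index = VectorStoreIndex.from_documents(SimpleDirectoryReader("data").load_data())
query_engine = index.as_query_engine(
    llm=llm,
    text_qa_template=qa_prompt,
    refine_template=refine_prompt,
)
print(query_engine.query("What does the document say about pricing?"))
```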