Thomas1234
11 months ago
Hello everyone,
How can you give a system prompt to the instruct version of Mixtral in a chat setting?
7 comments
Logan M
11 months ago
Like the name "instruct" might imply, it wasn't trained with the concept of a system prompt. The best you can do is modify the instruction.
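A rough illustration of "modify the instruction": since there is no system role in the training format, one option is to prepend the system-style text to the first user turn before wrapping it in Mixtral's [INST] ... [/INST] tags. The helper below is only a sketch; the exact spacing and template details are assumptions and may differ from the tokenizer's own chat template.
```python
def build_mixtral_prompt(system_text: str, turns: list[tuple[str, str]]) -> str:
    """Fold a 'system prompt' into the first user instruction.

    `turns` is a list of (user_message, assistant_reply) pairs; the last
    assistant_reply may be "" for the turn the model should complete.
    Sketch only -- spacing/template details vary between tokenizer versions.
    """
    prompt = "<s>"
    for i, (user_msg, assistant_msg) in enumerate(turns):
        if i == 0 and system_text:
            # No dedicated system role: prepend the system text to the instruction.
            user_msg = f"{system_text}\n\n{user_msg}"
        prompt += f" [INST] {user_msg} [/INST]"
        if assistant_msg:
            prompt += f" {assistant_msg}</s>"
    return prompt


# The 'system' text ends up inside the single [INST] ... [/INST] block.
print(build_mixtral_prompt(
    "You are a terse assistant.",
    [("What is the capital of France?", "")],
))
```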
Thomas1234
11 months ago
@Logan M, so it's best not used for chat purposes? Or how can you modify the instruction?
Logan M
11 months ago
Best not used for chat purposes.
What LLM class are you using for this?
Logan M
11 months ago
Not sure if this was you, but I also saw this on GitHub:
https://github.com/run-llama/llama_index/issues/10072#issuecomment-1894115576
Thomas1234
11 months ago
I was trying some stuff with vLLM and their OpenAI-like API. I guess I'll start the history with the system prompt and see what it does.
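A minimal sketch of what "start the history with the system prompt" could look like against a vLLM OpenAI-compatible server; the base URL, API key placeholder, and model id are assumptions about the local setup.
```python
from openai import OpenAI

# vLLM's OpenAI-compatible server typically listens under /v1; adjust to your deployment.
client = OpenAI(base_url="http://localhost:8000/v1", api_key="not-needed")

response = client.chat.completions.create(
    model="mistralai/Mixtral-8x7B-Instruct-v0.1",  # assumed model id served by vLLM
    messages=[
        # First history entry acts as the "system prompt"; how it is rendered
        # depends on the chat template vLLM applies for this model.
        {"role": "system", "content": "You are a terse assistant."},
        {"role": "user", "content": "What is the capital of France?"},
    ],
)
print(response.choices[0].message.content)
```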
Logan M
11 months ago
Ah yeah, vLLM handles the formatting for you.
Logan M
11 months ago
no idea what it will do for that lol
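One way to find out is to render the prompt locally with the same tokenizer's chat template and look at the result. Whether a leading system message gets folded into the first user turn or rejected outright depends on the template version, so the behaviour described in the comments below is an assumption, not a guarantee.
```python
from transformers import AutoTokenizer

tok = AutoTokenizer.from_pretrained("mistralai/Mixtral-8x7B-Instruct-v0.1")

messages = [
    {"role": "system", "content": "You are a terse assistant."},
    {"role": "user", "content": "What is the capital of France?"},
]

try:
    # Render (without tokenizing) to see exactly what prompt the model would get.
    print(tok.apply_chat_template(messages, tokenize=False, add_generation_prompt=True))
except Exception as err:
    # Some template versions reject a system role outright.
    print(f"Template rejected the system message: {err}")
```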