i use OpenAILike as llm, so im thinking
BanaanBakje
7 months ago
i use OpenAILike as llm, so im thinking thats why, but im not sure what other llm to use since i use vLLM as inference server
6 comments
Logan M
7 months ago
Did you set is_chat_model=True ?
Logan M
7 months ago
nvm you fixed it haha
BanaanBakje
7 months ago
yeah i fixed the special characters by updating vLLM, but i still have an issue with it repeating the context and going back and forth with itself
BanaanBakje
7 months ago
i read somewhere you mentioned it could be the prompt, but i havent been successful in changing that with my chat engine, if that helps
Logan M
7 months ago
From my understanding, if you set is_chat_model=True in the LLM, vLLM will handle the prompt formatting for you
BanaanBakje
7 months ago
oh wow it seems that this did it ❤️ thank you so much once again