I use OpenAILike as LLM, so I'm thinking

I use OpenAILike as the LLM, so I'm thinking that's why, but I'm not sure what other LLM to use, since I use vLLM as the inference server.
Did you set is_chat_model=True?
nvm you fixed it haha
Yeah, I fixed the special characters by updating vLLM, but I still have an issue with it repeating the context and going back and forth with itself.
I read somewhere that you mentioned it could be the prompt. I haven't been successful in changing that with my chat engine, in case that helps.
From my understanding, if you set is_chat_model=True in the LLM, vLLM will handle the prompt formatting for you
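
For reference, here is a minimal sketch of that setup with LlamaIndex's OpenAILike pointed at a vLLM OpenAI-compatible server. The model name, base URL, and API key are placeholders (not from this thread); adjust them to match your deployment:

```python
from llama_index.core.llms import ChatMessage
from llama_index.llms.openai_like import OpenAILike

# Placeholder values, assuming a locally running vLLM server
llm = OpenAILike(
    model="mistralai/Mistral-7B-Instruct-v0.2",  # whatever model vLLM is serving (assumed)
    api_base="http://localhost:8000/v1",  # vLLM's default OpenAI-compatible endpoint
    api_key="not-needed",  # vLLM doesn't validate the key by default
    is_chat_model=True,  # route to /chat/completions so the server applies the chat template
)

response = llm.chat([ChatMessage(role="user", content="Hello!")])
print(response)
```

With is_chat_model=True, OpenAILike calls the chat completions endpoint instead of the raw completions endpoint, so the server formats the prompt with the model's own chat template rather than receiving unformatted text, which is what stops the model from rambling back and forth with itself.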
oh wow it seems that this did it ❀️ thank you so much once again πŸ˜„