
Updated 3 months ago

I’m using OpenAILike class with vLLM.

10 comments
yea, use the messages_to_prompt hook -- this basically gives you universal control over the LLM input
I might be using it wrong with the chat engine, but when I’m using OpenAILike without the is_chat_model flag, even if the system prompt is empty it will send the message as follows:
‘system:
Message body’
Is there a way to get rid of this ‘system:’ at the beginning of every message? I’m thinking it comes from the default LLM metadata when none is passed to the OpenAILike class?
Hmm, I’ll have to check this. That typically happens when is_chat_model is set to False?
Yes! Even when I’m passing a system prompt, it still does
system: {system prompt}
No, I set is_chat_model to False
ah, then that would be why 👀 If you want is_chat_model=False but want more control over the formatting of the LLM prompt, you can set a function hook

Plain Text
# Very naive formatting -- this is essentially what the default does.
def messages_to_prompt(messages):
    return "\n".join([str(x) for x in messages])

llm = OpenAILike(...., messages_to_prompt=messages_to_prompt)
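For the original question (dropping the leading "system:" label), the hook can emit only the message bodies instead of stringifying whole messages. A minimal sketch, assuming LlamaIndex-style ChatMessage objects that expose a `.content` attribute (that attribute name is an assumption, not confirmed in the thread):

```python
# Hypothetical messages_to_prompt variant that joins only message
# contents, so no "system:" role label is prepended. Empty messages
# (e.g. a blank system prompt) are skipped entirely.
def messages_to_prompt(messages):
    parts = []
    for message in messages:
        content = getattr(message, "content", None)
        if content:  # drop empty system prompts instead of emitting "system:"
            parts.append(str(content))
    return "\n".join(parts)
```

Passed the same way as above (`OpenAILike(...., messages_to_prompt=messages_to_prompt)`), an empty system prompt then contributes nothing to the final prompt string.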
Thanks, I’ll try it and let you know!