I'm using `OpenAILike` to talk to a vLLM instance

I'm using `OpenAILike` to talk to a vLLM instance. I need to pass a custom stop token, and currently the only way I can figure out how to do it is like this:

```python
llm.complete(prompt, formatted=True, extra_body={"stop_token_ids": [...]})
```

This doesn't work with `llm.predict` because it interprets all remaining kwargs as prompt-template expansion arguments. Is there any other way to get this key/value pair into the outgoing OpenAI-API request?
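
For context, a minimal sketch of the two calls (the model name, base URL, and stop token IDs below are placeholders for my actual setup):

```python
from llama_index.llms.openai_like import OpenAILike

# Placeholder connection details for a local vLLM server
llm = OpenAILike(
    model="my-model",
    api_base="http://localhost:8000/v1",
    api_key="fake",
)

prompt = "Write a haiku about GPUs."

# Works: complete() forwards extra kwargs into the outgoing request body
resp = llm.complete(prompt, formatted=True, extra_body={"stop_token_ids": [128001]})

# Fails: predict() consumes **kwargs as prompt-template variables,
# so extra_body never reaches the request
# resp = llm.predict(template, extra_body={"stop_token_ids": [128001]})
```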
Setting `additional_kwargs` in the constructor?

```python
OpenAILike(..., additional_kwargs={"extra_body": {"stop_token_ids": [...]}})
```

(I think, if I'm understanding correctly)
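
Roughly like this end to end (a sketch; the model name, endpoint, and token IDs are placeholders):

```python
from llama_index.core import PromptTemplate
from llama_index.llms.openai_like import OpenAILike

# additional_kwargs are merged into every outgoing request,
# so extra_body (and the stop token IDs) ride along with predict() too
llm = OpenAILike(
    model="my-model",                     # placeholder model name
    api_base="http://localhost:8000/v1",  # placeholder vLLM endpoint
    api_key="fake",
    additional_kwargs={"extra_body": {"stop_token_ids": [128001]}},  # placeholder IDs
)

# predict()'s kwargs are now free to serve purely as template variables
template = PromptTemplate("Summarize the following text:\n{text}")
answer = llm.predict(template, text="vLLM supports custom stop token IDs.")
```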
Worked! I did try that very early on, but I didn't wrap it in `extra_body` and got off on a tangent later in the chain. Thanks again.