So might as well use the OpenAI call with a vanilla vllm server then?

At a glance

The community members discuss using the OpenAI-style call with a vanilla vLLM server. One community member points out that VllmServer is meant to connect to an existing server, not to start a new one. Another says they have already figured this out and are using the OpenAILike LLM. A third notes that an earlier issue was caused by a typo. A link to a GitHub issue is also provided, but no explicit answer is given.

So might as well use the OpenAI call with a vanilla vllm server then?
4 comments
VllmServer doesn't start a server, it's meant to connect to a server you already have running
Yeah I kinda already figured it out, we are just using the OpenAILike llm πŸ‘
It wasn't working before because of a typo πŸ˜›
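A minimal sketch of the approach described in the thread: point LlamaIndex's OpenAILike LLM at the OpenAI-compatible endpoint exposed by a vanilla vLLM server. The model name, port, and api_key value below are placeholder assumptions, not taken from the thread.

```python
# Sketch: connect LlamaIndex's OpenAILike LLM to a vanilla vLLM server.
# Assumes vLLM was started with something like:
#   python -m vllm.entrypoints.openai.api_server --model mistralai/Mistral-7B-Instruct-v0.2
# and is serving its OpenAI-compatible API on localhost:8000.

from llama_index.llms.openai_like import OpenAILike

llm = OpenAILike(
    model="mistralai/Mistral-7B-Instruct-v0.2",  # must match the model vLLM is serving (placeholder)
    api_base="http://localhost:8000/v1",         # vLLM's OpenAI-compatible endpoint (placeholder port)
    api_key="fake",                              # vLLM ignores the key, but the client requires one
    is_chat_model=True,                          # send requests to the chat completions endpoint
)

print(llm.complete("Hello, world!"))
```

Since the server speaks the OpenAI wire protocol, no vLLM-specific client class is needed on the LlamaIndex side; OpenAILike is enough.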