The community members are discussing how to use the OpenAI-style client with a vanilla vLLM server. One community member points out that VllmServer is meant to connect to an existing server, not to start a new one. Another replies that they have already figured this out and are now using the OpenAILike LLM instead. A third notes that an earlier issue was caused by a typo. A link to a related GitHub issue is also provided, but no explicit answer is given.