Alright, got another one for y'all.
I'm trying to use vLLM to run a model. vLLM provides an OpenAI-compatible API, BUT I need to use a custom model name.
I can't seem to figure out how.
It seems like it'll always throw the error at line 188 in openai_utils.py.
Is that right? Is there no way to put in custom model names? If not, I'm probably going to try opening a PR for that.
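For context, here's roughly what I mean (a minimal sketch; the model path and the custom alias are placeholders, not my actual setup):

```py
# Server is started with the OpenAI-compatible entrypoint, e.g.:
#   python -m vllm.entrypoints.openai.api_server --model facebook/opt-125m

from openai import OpenAI

# Point the OpenAI client at the local vLLM server.
client = OpenAI(
    base_url="http://localhost:8000/v1",
    api_key="EMPTY",  # vLLM doesn't check the key by default
)

# Requesting with the exact model path the server was launched with works fine:
resp = client.chat.completions.create(
    model="facebook/opt-125m",
    messages=[{"role": "user", "content": "hello"}],
)

# ...but requesting under a custom alias is what trips the openai_utils.py error:
resp = client.chat.completions.create(
    model="my-custom-name",  # hypothetical alias; this is what I want to work
    messages=[{"role": "user", "content": "hello"}],
)
```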