I started an OpenAI-compatible server with vLLM, serving the model "NousResearch/Meta-Llama-3-8B-Instruct". I then constructed the client with `OpenAI(model=model, openai_api_key=openai_api_key, openai_api_base=openai_api_base, request_timeout=60.0)`, but I get an "Unknown model" error. How do I use a local vLLM model?
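For reference, the vLLM server exposes a plain OpenAI-compatible HTTP API, so the request can be built without any client library. Below is a minimal sketch, assuming the default server address `http://localhost:8000/v1` (adjust host and port to your setup); vLLM accepts any placeholder API key by default:

```python
import json
import urllib.request

# Assumed local vLLM endpoint; change if your server runs elsewhere.
BASE_URL = "http://localhost:8000/v1"
MODEL = "NousResearch/Meta-Llama-3-8B-Instruct"

def build_completion_request(prompt: str) -> urllib.request.Request:
    """Build (but do not send) a POST request to the /completions endpoint."""
    payload = json.dumps(
        {"model": MODEL, "prompt": prompt, "max_tokens": 64}
    ).encode("utf-8")
    return urllib.request.Request(
        f"{BASE_URL}/completions",
        data=payload,
        headers={
            "Content-Type": "application/json",
            # vLLM does not check the key unless started with --api-key.
            "Authorization": "Bearer EMPTY",
        },
        method="POST",
    )

req = build_completion_request("Hello")
print(req.full_url)
```

Sending the request with `urllib.request.urlopen(req)` against a running server should return a JSON completion; the "Unknown model" error suggests the client wrapper, not the server, is rejecting the model name.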