
Hello @everyone, I have a question about `Settings.llm`. Setting `Settings.llm = HuggingFaceLLM()` works fine, but I'm using a vLLM OpenAI-compatible server for inference, and `Settings.llm = OpenAI(api_key=openai_api_key, base_url=openai_api_base)` is not working. Why? The endpoint itself works fine; the error only occurs with `Settings.llm = OpenAI(api_key=openai_api_key, base_url=openai_api_base)`.
1 comment
Can you share the error?