
Updated 6 months ago

Hello @everyone, I have a doubt about Settings.llm.

At a glance

The community member is having an issue setting the Settings.llm attribute in their code. They mention that Settings.llm = HuggingFaceLLM() works fine, but when they try Settings.llm = OpenAI(api_key=openai_api_key, base_url=openai_api_base), it does not work, even though the endpoint itself is working. Another community member asks the original poster to share the error they are encountering, but there is no explicitly marked answer in the comments.

Hello @everyone, I have a doubt about Settings.llm. My question: Settings.llm = HuggingFaceLLM() works fine, but I'm using a vLLM OpenAI-compatible server for inference, and Settings.llm = OpenAI(api_key=openai_api_key, base_url=openai_api_base) is not working. Why? The endpoint itself works fine; the error only occurs with Settings.llm = OpenAI(api_key=openai_api_key, base_url=openai_api_base).
1 comment
Can you share the error?