since hf is OpenAI-compatible can we

since hf is OpenAI-compatible, can we swap that with the OpenAI model, or would we have to make a custom model lol. LangChain LLMs are broken again on their side
24 comments
You can use the OpenAILike class
pip install llama-index-llms-openai-like

Plain Text
from llama_index.llms.openai_like import OpenAILike

llm = OpenAILike(model="some model", api_key="fake", api_base="....")
I spend more time reading llama index than my actual school work
lol there's a lot in here 😆
what's the install?
It's up there, but pip install llama-index-llms-openai-like
oh didn't see it, sorry
@Logan M do I just use my Hugging Face host, or do I have to initialize it like this

from openai import OpenAI

# initialize the client but point it to TGI
client = OpenAI(
    base_url="<ENDPOINT_URL>" + "/v1/",  # replace with your endpoint url
    api_key="<HF_API_TOKEN>",  # replace with your token
)
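(Side note: TGI exposes an OpenAI-compatible Messages API under /v1/, so the endpoint can also be hit directly to sanity-check it before involving a client library. A minimal sketch with curl, assuming hypothetical ENDPOINT_URL and HF_API_TOKEN environment variables holding your actual values:)

```shell
# ENDPOINT_URL and HF_API_TOKEN are placeholders for your own values
curl "$ENDPOINT_URL/v1/chat/completions" \
  -H "Authorization: Bearer $HF_API_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{
        "model": "tgi",
        "messages": [{"role": "user", "content": "Hello"}],
        "max_tokens": 32
      }'
```

If this returns a JSON chat completion, the endpoint itself is fine and any remaining error is on the client side.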

I'm getting
Retrying llama_index.llms.openai.base.OpenAI._complete in 0.19009121318146227 seconds as it raised NotFoundError: Not Found.
Retrying llama_index.llms.openai.base.OpenAI._complete in 0.4330981257225379 seconds as it raised NotFoundError: Not Found.
Retrying llama_index.llms.openai.base.OpenAI._complete in 3.315326791265811 seconds as it raised NotFoundError: Not Found.
Plain Text
from llama_index.llms.openai_like import OpenAILike

# Instantiate an OpenAILike model
llm = OpenAILike(
    model="tgi",
    api_key="<HF_API_TOKEN>",
    api_base="<ENDPOINT_URL>" + "/v1/",
    is_chat_model=True,
    is_local=False,
    is_function_calling_model=False,
    context_window=32000,
)
Should be able to do that
(With the proper context window size of course)
gotcha gotcha, I don't think I got the <> right
that always throws me off
Yea you don't need the angle brackets
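(To sidestep the angle-bracket confusion, the values can be pulled from the environment instead of pasted over placeholders. A minimal sketch, assuming hypothetical variable names HF_API_TOKEN and HF_ENDPOINT_URL:)

```python
import os

# Hypothetical env var names; the <...> placeholders in the snippets above
# are meant to be replaced wholesale, angle brackets included.
api_key = os.environ.get("HF_API_TOKEN", "fake")

# Normalize so a trailing slash on the endpoint doesn't produce "...//v1/"
endpoint = os.environ.get("HF_ENDPOINT_URL", "http://localhost:8080")
api_base = endpoint.rstrip("/") + "/v1/"

print(api_base)
```

These strings can then be passed straight into OpenAILike (or the raw OpenAI client) as api_key and api_base.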
Retrying llama_index.llms.openai.base.OpenAI._chat in 8.648145082827636 seconds as it raised UnprocessableEntityError: Error code: 422 - {'error': 'Template error: template not found', 'error_type': 'template_error'}.
better, but not there yet
had to set is_chat_model to False lol
Ah that makes sense