How to use a hosted llama.cpp server in LlamaIndex for a chat engine

Hi, how do I use a hosted llama.cpp server in LlamaIndex for a chat engine? I'm trying it by importing OpenAILike and passing the model name and api_base, but I'm still getting an error.
What is the error?
Some BaseComponent error. I got tired and stopped for today. Is there any alternative apart from OpenAILike? I want to use my own api_base for the LLM calls.
from llama_index.llms.openai_like import OpenAILike

# api_base points the LLM at any OpenAI-compatible endpoint
llm = OpenAILike(model="my model", api_base="https://hostname.com/v1", api_key="fake")

response = llm.complete("Hello World!")
print(str(response))
I replaced hostname.com with my IP and port.
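For a locally hosted llama.cpp server, that would look roughly like the sketch below; the address, port, and model name are placeholders, and is_chat_model and context_window are optional OpenAILike settings worth matching to your server.

from llama_index.llms.openai_like import OpenAILike

llm = OpenAILike(
    model="my-model",                         # whatever model the llama.cpp server is serving
    api_base="http://<server-ip>:<port>/v1",  # llama.cpp's OpenAI-compatible endpoint
    api_key="fake",                           # placeholder if the server has no API key set
    is_chat_model=True,                       # route calls to the chat completions endpoint
    context_window=4096,                      # match your model's context size
)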
I would need the full error to help debug πŸ™‚
OpenAILike works fine on my end
I'm getting a validation error
Validation error for data source
Component _type subclass of BaseComponent expected
That seems like an env issue to me. Do you have the exact traceback?
You probably just need to update a reader
I'm just importing one line of code
from llama_index.llms.openai_like import OpenAILike
Then on the 2nd line: llm = OpenAILike(<required params>)
I know -- this is related to some other package you have installed
If you give me the exact traceback, I can suggest a fix
I'm behind a proxy. I can send a small snippet
Try updating the Elasticsearch reader: pip install -U llama-index-readers-elasticsearch

Or just uninstall it if you aren't using it
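For reference, removing it would just be: pip uninstall llama-index-readers-elasticsearch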
It was not installed. I installed and upgraded it now, and it's still the same issue
Removed the URL for obvious reasons
Not sure. I can't replicate it. Looks like you are in a notebook though? Try restarting the runtime/notebook
I refreshed the kernel and created a fresh virtual environment. I'm using only those 2 lines of code, but it still gives the error.
Don't know what to tell ya πŸ€·β€β™‚οΈ works fine on a fresh Google colab
Maybe try downgrading llama-index-core? Don't use a notebook? It's something cooked in your venv
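For reference, that suggestion would look something like this (the version pin is a placeholder, not a known-good release):
pip install "llama-index-core==<older-version>"
Or reinstall the relevant packages cleanly:
pip install -U llama-index-core llama-index-llms-openai-like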
Well, if you ever get tired of dealing with an issue, I guess I'll just fix it myself πŸ˜‰
https://github.com/run-llama/llama_index/pull/12882
Can you post your Colab code? And are you using the llama.cpp server API?
Not using llama.cpp, but that's not related
It's working now. I set up a new environment and installed everything fresh. In my previous environment I had also installed the llama.cpp server; I don't know if that caused the issue. Now that both environments are isolated, it's working. Thanks Logan, sorry for the trouble.
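For reference, one way to set up an isolated environment like that (names are illustrative):
python -m venv llamaindex-env
source llamaindex-env/bin/activate
pip install llama-index-core llama-index-llms-openai-like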
Can we set the LLM's api_base and use it in a chat engine directly?
Not sure what you mean. You can give the api_base to the OpenAILike LLM, and then use it wherever
I want my hosted OpenAILike LLM to be used for the chat engine as well. I will try it.
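A minimal sketch of wiring that together, assuming documents in a local ./data folder (the path is illustrative) and the OpenAILike llm configured earlier:

from llama_index.core import Settings, SimpleDirectoryReader, VectorStoreIndex

Settings.llm = llm  # make the hosted OpenAILike model the default LLM

# note: building the index also needs an embedding model (Settings.embed_model),
# which is configured separately from the LLM
documents = SimpleDirectoryReader("./data").load_data()
index = VectorStoreIndex.from_documents(documents)

chat_engine = index.as_chat_engine()
response = chat_engine.chat("Hello!")
print(str(response))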