The community members are discussing how to use a hosted llama.cpp server with the LlamaIndex library for a chat engine. They try the OpenAILike class but encounter an error. The community members provide code examples and attempt various troubleshooting steps, such as updating readers, creating a new environment, and downgrading packages. Eventually, one community member resolves the issue by setting up a fresh environment and isolating the llama.cpp server installation from it. The community members also discuss the possibility of using the hosted OpenAILike LLM directly in the chat engine.
Hi, how do I use a hosted llama.cpp server in LlamaIndex for a chat engine? I'm trying to do it by importing OpenAILike and passing the model name and api_base, but I'm still getting an error.
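For reference, here is a minimal sketch of what this setup usually looks like, assuming a llama.cpp server exposing its OpenAI-compatible endpoint at http://localhost:8080/v1 (the URL and model name below are placeholders). Note that the keyword argument is api_base, not base_api:

```python
# A minimal sketch, assuming the llama-index-llms-openai-like package is
# installed and a llama.cpp server is running locally.
from llama_index.llms.openai_like import OpenAILike

llm = OpenAILike(
    model="my-local-model",               # placeholder: whatever model the server hosts
    api_base="http://localhost:8080/v1",  # placeholder URL; the parameter is api_base
    api_key="fake",                       # llama.cpp doesn't check the key, but one is required
    is_chat_model=True,                   # route requests to the chat completions endpoint
)

print(llm.complete("Hello"))
```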
It's working now. I set up a new environment and installed everything fresh. In my previous environment I had also installed the llama.cpp server; I don't know if that caused the issue. Now that I've isolated the two environments, it works. Thanks Logan, and sorry for the trouble.
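On the question of using the hosted OpenAILike LLM directly in a chat engine, a sketch like the following should work, assuming the `llm` object from the snippet above. SimpleChatEngine is used here because it talks to the LLM directly without needing an index or embedding model:

```python
# A minimal sketch of plugging the hosted LLM into a chat engine.
from llama_index.core.chat_engine import SimpleChatEngine

chat_engine = SimpleChatEngine.from_defaults(llm=llm)
response = chat_engine.chat("Hello, who are you?")
print(response)
```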