Hi, how do I use a hosted llama.cpp server in LlamaIndex for a chat engine? I'm trying by importing OpenAILike and passing the model name and api_base, but I'm still getting an error.
It's working now. I set up a new environment and reinstalled everything fresh. In my previous environment I had also installed the llama.cpp server; I don't know if that caused the issue. Now that the two environments are isolated, it works. Thanks Logan, sorry for the trouble.