Llama.cpp

I'm looking for an example of LlamaIndex calling a locally hosted llama.cpp API. Does anyone have an example?
You need to use the CustomLLM class to interact with a hosted LLM server.


Refer to this thread:
https://discord.com/channels/1059199217496772688/1163395083475898410/1163727902169366559
Is that for connecting to a running llama.cpp server API?
Since this is your own hosted LLM, you'll need to use the CustomLLM abstraction from LlamaIndex and define the interaction yourself.