Llamacpp
Updated last year
M00nshine
last year
I'm looking for an example of llama_index calling a locally hosted llama.cpp API. Does anyone have one?
WhiteFang_Jr
last year
You need to use the CustomLLM class to interact with a hosted LLM server.
Refer to this thread:
https://discord.com/channels/1059199217496772688/1163395083475898410/1163727902169366559
M00nshine
last year
Is that for connecting to a running llama.cpp server API?
WhiteFang_Jr
last year
Yes
WhiteFang_Jr
last year
Since this is your own hosted LLM, you'll need to use the CustomLLM abstraction from LlamaIndex and define the interaction yourself.