hansson0728
9 months ago
Anyone have any information on how to run/create OpenAI agents against a local LLM (llama.cpp)?
6 comments
WhiteFang_Jr
9 months ago
You want to use a local LLM?
I guess defining a service_context with the local LLM and embed model should work.
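Something along these lines should do it — a minimal sketch assuming the legacy llama_index ServiceContext API (pre-0.10) with the LlamaCPP wrapper; the model path and data directory are placeholders:

```python
# Minimal sketch: local llama.cpp model + local embeddings via ServiceContext.
# Assumes legacy llama_index (ServiceContext era); paths are placeholders.
from llama_index import ServiceContext, SimpleDirectoryReader, VectorStoreIndex
from llama_index.llms import LlamaCPP

# Load a local GGUF model through llama.cpp instead of calling OpenAI.
llm = LlamaCPP(
    model_path="./models/llama-2-7b-chat.Q4_K_M.gguf",  # placeholder path
    temperature=0.1,
    max_new_tokens=256,
    context_window=3900,
)

# Use local embeddings too, so nothing is sent to the OpenAI API.
service_context = ServiceContext.from_defaults(
    llm=llm,
    embed_model="local",  # downloads a small HuggingFace embedding model
)

documents = SimpleDirectoryReader("./data").load_data()
index = VectorStoreIndex.from_documents(documents, service_context=service_context)

response = index.as_query_engine().query("What do these documents cover?")
print(response)
```

For the agent part of the question: as far as I know the OpenAIAgent relies on OpenAI function calling, so with a local model you'd typically build a ReActAgent from your tools and pass this same llm instead.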
hansson0728
9 months ago
maybe
LORKA
9 months ago
It may not work as intended due to the LLM's limitations. Somewhere in the LlamaIndex docs there is a table of popular LLMs and their capabilities.
LORKA
9 months ago
Can't find it now
WhiteFang_Jr
9 months ago
Yeah, it won't work as well as OpenAI. Compatibility report on open-source LLMs:
https://docs.llamaindex.ai/en/stable/module_guides/models/llms.html#open-source-llms
LORKA
9 months ago
Thx, I was still looking for it