hansson0728
10 months ago
Anyone have any information on how to run/create OpenAI agents running against a local LLM (llama.cpp)?
6 comments
WhiteFang_Jr
10 months ago
Do you want to use a local LLM?
I guess defining a service_context with the local LLM and embed model should work.
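A minimal sketch of that suggestion, assuming the legacy llama_index ServiceContext API and its LlamaCPP integration (the model path and data directory are placeholders):

```python
from llama_index import ServiceContext, VectorStoreIndex, SimpleDirectoryReader
from llama_index.llms import LlamaCPP

# Load a GGUF model through llama.cpp (the path is a placeholder).
llm = LlamaCPP(
    model_path="./models/llama-2-7b-chat.Q4_K_M.gguf",
    temperature=0.1,
    max_new_tokens=256,
    context_window=3900,
    verbose=False,
)

# embed_model="local" resolves to a default HuggingFace embedding model,
# so no OpenAI key is needed for embeddings either.
service_context = ServiceContext.from_defaults(
    llm=llm,
    embed_model="local",
)

documents = SimpleDirectoryReader("./data").load_data()
index = VectorStoreIndex.from_documents(documents, service_context=service_context)
query_engine = index.as_query_engine()
print(query_engine.query("What is this document about?"))
```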
hansson0728
10 months ago
maybe
LORKA
10 months ago
It may not work as intended due to the LLM's limitations. Somewhere in the LlamaIndex docs there is a table of different popular LLM models and their capabilities.
LORKA
10 months ago
Can't find it now.
WhiteFang_Jr
10 months ago
Yeah, it won't work as well as OpenAI. Compatibility report on open-source LLMs:
https://docs.llamaindex.ai/en/stable/module_guides/models/llms.html#open-source-llms
LORKA
10 months ago
Thanks, I was still looking for it.
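Since the original question was about agents specifically, here is a rough sketch using LlamaIndex's ReActAgent (not OpenAIAgent) with the same local llama.cpp LLM; the tool and model path are illustrative, and as the compatibility report above notes, smaller open-source models may not follow the ReAct format reliably:

```python
from llama_index.agent import ReActAgent
from llama_index.llms import LlamaCPP
from llama_index.tools import FunctionTool

def multiply(a: int, b: int) -> int:
    """Multiply two integers and return the result."""
    return a * b

# Local llama.cpp model; the path is a placeholder.
llm = LlamaCPP(model_path="./models/llama-2-7b-chat.Q4_K_M.gguf")

# ReActAgent drives tools through prompting, so it does not require
# OpenAI-style function-calling support from the model.
agent = ReActAgent.from_tools(
    [FunctionTool.from_defaults(fn=multiply)],
    llm=llm,
    verbose=True,
)
print(agent.chat("What is 12 times 7?"))
```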