
Updated 10 months ago


Does anyone have any information on how to run/create OpenAI agents against a local LLM (llama.cpp)?
6 comments
You want to use a local LLM?
I guess defining a service_context with a local LLM and embed model should work.
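A minimal sketch of that suggestion, assuming the legacy LlamaIndex API (pre-v0.10, where ServiceContext was the standard pattern) and a GGUF model file already downloaded locally; the model path and data directory here are placeholders:

```python
# Sketch: pointing LlamaIndex at a local llama.cpp model instead of OpenAI.
# Assumes llama-index < 0.10 and llama-cpp-python installed, plus a GGUF
# model on disk (path below is an assumption).
from llama_index import ServiceContext, VectorStoreIndex, SimpleDirectoryReader
from llama_index.llms import LlamaCPP

llm = LlamaCPP(
    model_path="./models/llama-2-7b-chat.Q4_K_M.gguf",  # placeholder path
    temperature=0.1,
    max_new_tokens=256,
    context_window=3900,
)

# embed_model="local" uses a local HuggingFace embedding model, so no
# OpenAI calls are made for embeddings either.
service_context = ServiceContext.from_defaults(
    llm=llm,
    embed_model="local",
)

documents = SimpleDirectoryReader("./data").load_data()
index = VectorStoreIndex.from_documents(documents, service_context=service_context)
print(index.as_query_engine().query("What is this about?"))
```

Note that agents built specifically on OpenAI function calling (such as OpenAIAgent) generally won't work with llama.cpp models; a ReAct-style agent is the usual fallback for open-source LLMs.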
It may not work as intended due to the LLM's limitations. Somewhere in the LlamaIndex docs there is a table of different popular LLM models and their capabilities.
Can't find it now
Yeah, it won't work as well as OpenAI. Compatibility report on open-source LLMs: https://docs.llamaindex.ai/en/stable/module_guides/models/llms.html#open-source-llms
Thx, I was still looking for it