phoenix baby · 7 months ago
hey guys, I'm new to LlamaIndex. I have a question: can we use a local LLM instead of the default OpenAI model as the generator?
1 comment
PwnosaurusRex · 7 months ago
Try this out. It uses Ollama, which runs locally:
https://docs.llamaindex.ai/en/stable/examples/llm/ollama/?h=ollama
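For example, a minimal sketch of swapping the default OpenAI LLM for a local Ollama model (the model name and timeout here are assumptions; use any model you've pulled):

```python
# Minimal sketch: use a local Ollama model as the LlamaIndex LLM
# instead of the default OpenAI. Assumes the Ollama server is running
# locally (default http://localhost:11434) and the model has been
# pulled, e.g. `ollama pull llama3`.
from llama_index.core import Settings
from llama_index.llms.ollama import Ollama

# Raise the timeout since local generation can be slow on modest hardware.
Settings.llm = Ollama(model="llama3", request_timeout=120.0)

# Any index/query engine built after this will use the local model.
response = Settings.llm.complete("What is LlamaIndex?")
print(response)
```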
You could also change the endpoint to localhost and point it at any other self-hosted server:
https://docs.llamaindex.ai/en/stable/examples/llm/localai/
You could use LM Studio, etc. with the second approach as well.
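And a minimal sketch of that second approach, pointing LlamaIndex at any OpenAI-compatible local server via OpenAILike (the base URL, model name, and API key below are placeholder assumptions; LM Studio serves on http://localhost:1234/v1 by default, LocalAI on http://localhost:8080/v1):

```python
# Minimal sketch: point LlamaIndex at an OpenAI-compatible local server
# (LocalAI, LM Studio, etc.) by overriding the endpoint.
from llama_index.core import Settings
from llama_index.llms.openai_like import OpenAILike

Settings.llm = OpenAILike(
    model="local-model",                   # whatever name your server exposes
    api_base="http://localhost:1234/v1",   # e.g. LM Studio's default endpoint
    api_key="not-needed",                  # local servers usually ignore this
    is_chat_model=True,                    # set to match your server's API
)

print(Settings.llm.complete("Hello from a local LLM"))
```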