
Updated 8 months ago

Hey guys, I'm new to LlamaIndex. I have a question: can we use a local LLM instead of the default OpenAI model as the generator?
Try this out: use Ollama, which runs locally: https://docs.llamaindex.ai/en/stable/examples/llm/ollama/?h=ollama

You could also change the endpoint to localhost and point it at any other self-hosted, OpenAI-compatible server:
https://docs.llamaindex.ai/en/stable/examples/llm/localai/

You could also use LM Studio, etc. with that second approach.