Updated last year

Non-OpenAI

Is there a way to create a simple LlamaIndex app that does not involve OpenAI at all? I'd like to avoid any risk of communicating with external LLMs or embedding models. I have my own classes for the LLM and embed_model, but even just importing basic things from llama_index seems to trigger errors, like this one: "You tried to access openai.Completion, but this is no longer supported in openai>=1.0.0..."
1 comment
Yes, you can totally use open-source models with LlamaIndex.

LlamaIndex has prepared a compatibility report for various open-source LLMs, along with Google Colab notebooks to get you started with them:

https://docs.llamaindex.ai/en/stable/module_guides/models/llms.html#open-source-llms

And the same goes for embedding models! You can use local embedding models, models from Hugging Face, etc.:
https://docs.llamaindex.ai/en/stable/module_guides/models/embeddings.html#embeddings
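To make this concrete, here is a minimal sketch of wiring your own local models into LlamaIndex so nothing ever calls OpenAI. It assumes a recent llama_index version that exposes the `Settings` singleton (older versions used `ServiceContext` instead), and `my_local_llm` is a placeholder for your own LLM instance:

```python
# Sketch: an OpenAI-free LlamaIndex setup. Assumes a recent llama_index
# with the global `Settings` object; `my_local_llm` stands in for your
# own LLM class instance (any object implementing llama_index's LLM
# interface works).
from llama_index.core import Settings, VectorStoreIndex, SimpleDirectoryReader
from llama_index.embeddings.huggingface import HuggingFaceEmbedding

Settings.llm = my_local_llm  # hypothetical: your custom local LLM
Settings.embed_model = HuggingFaceEmbedding(
    model_name="BAAI/bge-small-en-v1.5"  # downloaded once, then runs locally
)

# With both defaults overridden, indexing and querying stay fully local.
documents = SimpleDirectoryReader("data").load_data()
index = VectorStoreIndex.from_documents(documents)  # embeds with the local model
response = index.as_query_engine().query("What is in my data?")  # answers with your LLM
```

Because the OpenAI defaults are only instantiated lazily, setting both `Settings.llm` and `Settings.embed_model` up front should avoid the `openai.Completion` error entirely.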