Updated 3 months ago

Hello, I've been trying to find an answer in the docs, but I'm not very well versed in this stuff yet. Would I be able to skip OpenAI altogether and use Google's gemini-pro LLM and Gemini embedding model for everything?
2 comments
Yes, you can -- we have many integrations for LLMs and embedding models.

Mostly you just need to set them in your service context:

Plain Text
from llama_index import ServiceContext, set_global_service_context
set_global_service_context(ServiceContext.from_defaults(llm=llm, embed_model=embed_model))


That covers 90% of cases, but there are a few stray components that take an llm as a kwarg directly, depending on what you are doing.

Embedding models:
https://docs.llamaindex.ai/en/stable/module_guides/models/embeddings.html#list-of-supported-embeddings

LLMs:
https://docs.llamaindex.ai/en/stable/module_guides/models/llms/modules.html
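
Putting the above together for Gemini, here is a minimal sketch. It assumes the legacy llama_index 0.9-style API shown in the earlier snippet, that the Gemini integrations are installed, and that a GOOGLE_API_KEY is set in your environment; the exact model name strings are assumptions and may differ in your version.

Plain Text
# Sketch: replace OpenAI with Gemini for both the LLM and embeddings.
# Assumes llama_index 0.9.x-style imports and GOOGLE_API_KEY in the env.
from llama_index import ServiceContext, set_global_service_context
from llama_index.llms import Gemini
from llama_index.embeddings import GeminiEmbedding

# Assumed model identifiers -- check the docs linked above for your version.
llm = Gemini(model="models/gemini-pro")
embed_model = GeminiEmbedding(model_name="models/embedding-001")

# Register globally so every index / query engine picks these up
# instead of the OpenAI defaults.
set_global_service_context(
    ServiceContext.from_defaults(llm=llm, embed_model=embed_model)
)

After this, components that build on the global service context (indexes, query engines, etc.) will use Gemini; only the stray components mentioned above that take an llm kwarg directly still need it passed explicitly.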
Perfect, thank you so much!