Updated last year

Knowledge Graph using Gemini

At a glance

The community member asks whether it is possible to create a Knowledge Graph with llama-index using the Gemini() language model instead of OpenAI models. Another community member responds that they think the llm just needs to be replaced with the Gemini llm, providing sample code that sets the global service context. However, the asker mentions encountering errors when using the from_defaults method, and says they also tried importing google_service_context as an alternative. The final comment indicates that the issue was resolved, but no explicit answer is provided.

Can we create a Knowledge Graph with llama-index using the Gemini() LLM instead of OpenAI models?
4 comments
Yeah, I think you just need to replace the llm with the Gemini llm.

Plain Text
from llama_index import ServiceContext, set_global_service_context
from llama_index.llms import Gemini

llm = Gemini()
service_context = ServiceContext.from_defaults(llm=llm, embed_model="local")

# set the service context as the global default
set_global_service_context(service_context)

# proceed further with the graph
I was getting errors when using from_defaults.
I also tried importing google_service_context and replacing it with that.
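For reference, a minimal end-to-end sketch of the full flow the thread describes might look like the following. This is an unverified sketch, not a confirmed fix: it assumes a llama_index 0.9-era API (ServiceContext and set_global_service_context), a GOOGLE_API_KEY set in the environment for Gemini, and a hypothetical ./data directory of documents.

Plain Text
from llama_index import (
    KnowledgeGraphIndex,
    ServiceContext,
    SimpleDirectoryReader,
    set_global_service_context,
)
from llama_index.llms import Gemini

# Gemini() reads GOOGLE_API_KEY from the environment by default
llm = Gemini()
service_context = ServiceContext.from_defaults(llm=llm, embed_model="local")
set_global_service_context(service_context)

# load documents from a hypothetical ./data directory
documents = SimpleDirectoryReader("./data").load_data()

# extract triplets with Gemini and build the knowledge graph
index = KnowledgeGraphIndex.from_documents(
    documents,
    max_triplets_per_chunk=2,
    service_context=service_context,
)

query_engine = index.as_query_engine()

Note that newer llama-index versions (0.10+) replace ServiceContext with the global Settings object, so the exact imports depend on the installed version.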