A community member asks whether it is possible to create a Knowledge Graph with llama-index using the Gemini() language model instead of OpenAI models. Another community member responds that the LLM just needs to be replaced with the Gemini LLM, providing sample code that sets a global service context. However, the asker mentions encountering errors when using the from_defaults method and trying to import google_service_context as an alternative. The final comment indicates the issue was resolved, but no explicit answer is provided.
Yeah, I think you just need to replace the LLM with the Gemini LLM.
Plain Text
from llama_index import ServiceContext, set_global_service_context
from llama_index.llms import Gemini

# Use Gemini as the LLM and a local embedding model
llm = Gemini()
service_context = ServiceContext.from_defaults(llm=llm, embed_model="local")

# Set this service context as the global default
set_global_service_context(service_context)

# Proceed further with the graph
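For completeness, here is a minimal sketch of the "proceed further" step, assuming the legacy (pre-0.10) llama_index API where KnowledgeGraphIndex and SimpleDirectoryReader are top-level imports. The ./data directory and the query string are hypothetical, and a GOOGLE_API_KEY environment variable is assumed to be set for Gemini:
Plain Text
from llama_index import KnowledgeGraphIndex, SimpleDirectoryReader

# Assumes GOOGLE_API_KEY is set in the environment for Gemini

# Load documents from a hypothetical ./data directory
documents = SimpleDirectoryReader("./data").load_data()

# Build the knowledge graph; because the service context above was set
# globally, Gemini is used for triplet extraction without passing it here
index = KnowledgeGraphIndex.from_documents(
    documents,
    max_triplets_per_chunk=2,
)

# Query the graph with the same Gemini-backed context
query_engine = index.as_query_engine()
response = query_engine.query("What does the graph say about X?")
print(response)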