Hi, I'm using the Ollama LLM with LlamaIndex. Here is my setup:
llm = Ollama(model="mistral", request_timeout=30.0)
service_context = ServiceContext.from_defaults(
    llm=llm,
    callback_manager=callback_manager,
    chunk_size=256,
    embed_model="local",
)
retriever = VectorIndexAutoRetriever(
    index,
    vector_store_info=vector_store_info,
    service_context=service_context,
    max_top_k=10000,
)
query_engine = RetrieverQueryEngine.from_args(retriever, llm=llm)
The problem is that the LLM resolver is still looking for OpenAI keys. When I checked the resolver function (resolve_llm), it checks for LangChain and OpenAI, but never for Ollama directly.
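To illustrate what I mean, here is a simplified, self-contained sketch of how I read the fallback logic in llms/utils.py (my paraphrase, not the actual library source; MockOpenAI stands in for the real OpenAI client):

```python
class MockOpenAI:
    """Stand-in for the OpenAI client; raises like it does without a key."""
    def __init__(self):
        raise ValueError("No API key found for OpenAI.")

def resolve_llm(llm=None):
    """Paraphrase of the resolver: an explicit LLM instance passes through,
    but None (or the "default" sentinel) falls back to constructing OpenAI,
    which is where the missing-key error comes from."""
    if llm is None or llm == "default":
        try:
            return MockOpenAI()
        except ValueError as e:
            raise ValueError(
                "Could not load OpenAI model. If you intended to use OpenAI, "
                "please check your OPENAI_API_KEY. Original error: " + str(e)
            )
    # any explicit LLM object is returned unchanged
    return llm

# An explicitly passed LLM object never touches the OpenAI path:
my_llm = object()
assert resolve_llm(my_llm) is my_llm
```

So my guess is that somewhere in my pipeline a component is being built without receiving my llm/service_context and hits this fallback, but I can't see where.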
File ~/anaconda3/lib/python3.11/site-packages/llama_index/llms/utils.py:31 in resolve_llm
    raise ValueError(
ValueError:
Could not load OpenAI model. If you intended to use OpenAI, please check your OPENAI_API_KEY.
Original error:
No API key found for OpenAI.
Please set either the OPENAI_API_KEY environment variable or openai.api_key prior to initialization.
API keys can be found or created at https://platform.openai.com/account/api-keys

To disable the LLM entirely, set llm=None.
Could you please help me figure out whether the issue is in my code? Is there documentation available for using Ollama with LlamaIndex? Thanks.