thegallier
I am trying the multi-index setup, but with a different LLM:
query_engine = summary_index.as_query_engine(
    service_context=ServiceContext.from_defaults(
        # llm=OpenAI(model="gpt-3.5-turbo")
        # llm=Ollama(model="mistral")
        llm=None
    )
)
I still get the error below (with both llm=None and Ollama):
-----
During handling of the above exception, another exception occurred:

ValueError                                Traceback (most recent call last)
/usr/local/lib/python3.10/dist-packages/llama_index/llms/utils.py in resolve_llm(llm)
     29         validate_openai_api_key(llm.api_key)
     30     except ValueError as e:
---> 31         raise ValueError(
     32             "\n**\n"
     33             "Could not load OpenAI model. "
ValueError:
**
Could not load OpenAI model. If you intended to use OpenAI, please check your OPENAI_API_KEY.
Original error:
No API key found for OpenAI.
Please set either the OPENAI_API_KEY environment variable or openai.api_key prior to initialization.
API keys can be found or created at https://platform.openai.com/account/api-keys

To disable the LLM entirely, set llm=None.
**

Any suggestions?
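One detail the traceback hints at: resolve_llm falls back to OpenAI whenever no LLM is attached at the point the call happens, and embeddings default to OpenAI independently of the LLM, so the missing-key error can fire even with llm=None. Below is a minimal sketch of pinning everything to a local Ollama model, assuming a newer llama-index version where the global Settings object replaced ServiceContext and the Ollama integration package is installed; summary_index stands for the index already built earlier, and request_timeout is an illustrative parameter:

```python
# Hedged sketch, not verified against this exact llama-index version.
# Assumes: pip install llama-index llama-index-llms-ollama
from llama_index.core import Settings
from llama_index.llms.ollama import Ollama

# Route all LLM calls globally to a local Ollama model instead of OpenAI.
Settings.llm = Ollama(model="mistral", request_timeout=120.0)

# To disable the LLM entirely (retrieval/summary nodes only), use instead:
# Settings.llm = None

# summary_index is assumed to be the index built earlier in your script;
# with Settings configured, no service_context argument is needed here.
query_engine = summary_index.as_query_engine()
```

If the embedding model is also resolving to OpenAI, it needs to be pointed at a local model as well (e.g. via Settings.embed_model), otherwise the same missing-key error can surface from the embedding side when the index is built or queried.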