Updated 10 months ago

At a glance

The community member is trying to use the NLSQLTableQueryEngine with Ollama, but is encountering issues with configuring llama_index.core.Settings. Another community member suggests setting Settings.llm = Ollama(model="<model>", request_timeout=3000), but the original poster then encounters a ValueError related to loading the OpenAI embedding model. The community members discuss this error, noting that it seems to be an issue with the embeddings, and suggest setting up the embed model as well, either in the settings or as a keyword argument.

Useful resources
how to use NLSQLTableQueryEngine with Ollama? I got stuck trying to configure llama_index.core.Settings
6 comments
You just set it like Settings.llm = Ollama(model="<model>", request_timeout=3000)
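A minimal sketch of that configuration, assuming a locally running Ollama server and the `llama-index-llms-ollama` integration package; the model name is a placeholder:

```python
from llama_index.core import Settings
from llama_index.llms.ollama import Ollama

# Route all LLM calls through a locally running Ollama server.
# "llama3" is a placeholder -- substitute any model you have pulled.
Settings.llm = Ollama(model="llama3", request_timeout=3000)
```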
Thanks @Logan M. Then when I run query_engine = NLSQLTableQueryEngine(sql_database=sql_db, synthesize_response=False) it raises the error below

ValueError:
**
Could not load OpenAI embedding model. If you intended to use OpenAI, please check your OPENAI_API_KEY.
Original error:
No API key found for OpenAI.
Please set either the OPENAI_API_KEY environment variable or openai.api_key prior to initialization.
API keys can be found or created at https://platform.openai.com/account/api-keys

Consider using embed_model='local'.
Visit our documentation for more embedding options: https://docs.llamaindex.ai/en/stable/module_guides/models/embeddings.html#modules
**
This is an error for embeddings. Do you have the full traceback? Doubting that that line of code is causing this
It looks like it's still trying to use OpenAI. I also tried specifying the llm inside NLSQLTableQueryEngine, but I still got the same error
Huh, I guess it also requires an embed model. You'll need to set that up as well
Either in settings or as a kwarg
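A sketch of both options mentioned above, assuming a SQLDatabase instance `sql_db` from earlier in the thread and the `llama-index-embeddings-huggingface` package; the embedding model name is an illustrative choice, not prescribed by the thread:

```python
from llama_index.core import Settings
from llama_index.core.query_engine import NLSQLTableQueryEngine
from llama_index.embeddings.huggingface import HuggingFaceEmbedding

# Option 1: set a local embedding model globally so nothing
# falls back to the OpenAI default.
Settings.embed_model = HuggingFaceEmbedding(
    model_name="BAAI/bge-small-en-v1.5"  # placeholder model choice
)

# Option 2: pass it per-engine as a keyword argument instead.
query_engine = NLSQLTableQueryEngine(
    sql_database=sql_db,  # assumed from earlier in the thread
    embed_model=Settings.embed_model,
    synthesize_response=False,
)
```

The error message also suggests `embed_model="local"` as a shorthand for a default local embedding model.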