Updated 9 months ago

how to use NLSQLTableQueryEngine with Ollama? I got stuck trying to configure llama_index.core.Settings
You just set it like Settings.llm = Ollama(model="<model>", request_timeout=3000)
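Spelled out, that suggestion looks something like the sketch below. This assumes the llama-index-llms-ollama integration package is installed and an Ollama server is running locally; "llama3" is a placeholder model name.

```python
# Set Ollama as the global LLM via llama_index Settings.
# Assumes `pip install llama-index-llms-ollama` and a running Ollama
# server with the named model pulled; "llama3" is a placeholder.
from llama_index.core import Settings
from llama_index.llms.ollama import Ollama

Settings.llm = Ollama(model="llama3", request_timeout=3000)
```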
Thanks @Logan M. But when I then run query_engine = NLSQLTableQueryEngine(sql_database=sql_db, synthesize_response=False), it raises the error below

ValueError:
**
Could not load OpenAI embedding model. If you intended to use OpenAI, please check your OPENAI_API_KEY.
Original error:
No API key found for OpenAI.
Please set either the OPENAI_API_KEY environment variable or openai.api_key prior to initialization.
API keys can be found or created at https://platform.openai.com/account/api-keys

Consider using embed_model='local'.
Visit our documentation for more embedding options: https://docs.llamaindex.ai/en/stable/module_guides/models/embeddings.html#modules
**
This is an error about embeddings. Do you have the full traceback? I doubt that line of code is what's causing this
It looks like it's still trying to use OpenAI. I also tried specifying the llm inside NLSQLTableQueryEngine, but I still got the same error
Huh, I guess it also requires an embed model. You'll need to set that up as well
Either in settings or as a kwarg