
It is impossible to use any other LLM than OpenAI for querying SQL

Even though the documentation says you can use Ollama to load other LLMs such as Llama 2 with the following commands:
llm = Ollama(model="llama2")
service_context = ServiceContext.from_defaults(llm=llm)

As soon as it finds an openai.api_key it just uses OpenAI and disregards the above. Also, it downloads the BAAI/bge-small-en model for no reason, since it cannot use it if we remove the OpenAI key; instead it throws the message:
Exception ignored in: <function _LlamaContext.__del__ at 0x7f4d507ddaf0>
Traceback (most recent call last):
  File "/root/anaconda3/lib/python3.9/site-packages/llama_cpp/llama.py", line 422, in __del__
TypeError: 'NoneType' object is not callable
Exception ignored in: <function _LlamaModel.__del__ at 0x7f4d507dbb80>
Traceback (most recent call last):
  File "/root/anaconda3/lib/python3.9/site-packages/llama_cpp/llama.py", line 240, in __del__
TypeError: 'NoneType' object is not callable
Exception ignored in: <function _LlamaBatch.__del__ at 0x7f4d507dfe50>
Traceback (most recent call last):
  File "/root/anaconda3/lib/python3.9/site-packages/llama_cpp/llama.py", line 670, in __del__
TypeError: 'NoneType' object is not callable
ServiceContext requires two models: one for response generation and the other for creating embeddings.

You have only provided the llm part, so it falls back to OpenAI for the embedding model.

Since it is downloading the embed model on its own, I suspect you are using an old version.

Anyway, you can fix it by providing the embed model:

Plain Text
service_context = ServiceContext.from_defaults(llm=llm, embed_model='local')
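For completeness, here is a minimal sketch of a fully OpenAI-free setup. It assumes a pre-0.10 llama_index where ServiceContext and set_global_service_context are still available, and that "local" resolves to a HuggingFace embedding model (BAAI/bge-small-en by default):

Plain Text
from llama_index import ServiceContext, set_global_service_context
from llama_index.llms import Ollama

# Ollama handles response generation; "local" pulls in a HuggingFace
# embedding model, so no OpenAI key is needed for either role.
llm = Ollama(model="llama2")
service_context = ServiceContext.from_defaults(llm=llm, embed_model="local")

# Optional: register it globally so components you forget to pass it to
# do not silently fall back to the OpenAI defaults.
set_global_service_context(service_context)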
I upgraded LlamaIndex and now it just tells me to put llm=None and uses a MockLLM, and it still throws the same error.
You'll need to pass an llm if you want to generate a response for your query.
I have:
llm = Ollama(model="llama2")
service_context = ServiceContext.from_defaults(llm=llm, embed_model='local')
Is this giving you an error?
Are you passing the service context into the things you are using?
from llama_index.llms import Ollama
from sqlalchemy import select, create_engine, MetaData, Table
from llama_index import SQLDatabase
from llama_index import StorageContext, load_index_from_storage
from llama_index import VectorStoreIndex, SimpleDirectoryReader
from llama_index import ServiceContext
from llama_index.indices.struct_store.sql_query import NLSQLTableQueryEngine
from IPython.display import Markdown, display

engine = create_engine('mariadb+pymysql://bunchofstuff')
metadata = MetaData()
table = Table("users", metadata, autoload_with=engine)
stmt = select(table.columns)
with engine.connect() as connection:
    results = connection.execute(stmt).fetchone()

llm = Ollama(model="llama2")
service_context = ServiceContext.from_defaults(llm=llm, embed_model='local')
sql_database = SQLDatabase(engine)
prompt = "show how many users are there in the database"
query_engine = NLSQLTableQueryEngine(sql_database=sql_database, tables=["users"])

try:
    response = query_engine.query(prompt)
    print(response)
    response_md = str(response)
    sql_query = response.metadata["sql_query"]
    print("sql_query=", sql_query)
    print("response_md=", response_md)
except Exception as ex:
    response_md = "Error"
    display(Markdown(f"ERROR: {str(ex)}"))
It is supposed to read a MariaDB table, load Llama 2 using Ollama, and show how many records there are in the users table.
Yeah, you need to pass the service context into the query engine:

query_engine = NLSQLTableQueryEngine(sql_database=sql_database, tables=["users"], service_context=service_context)
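Putting the pieces together, the tail of the script would look roughly like this (a sketch under the same pre-0.10 llama_index assumptions as above):

Plain Text
llm = Ollama(model="llama2")
service_context = ServiceContext.from_defaults(llm=llm, embed_model="local")
sql_database = SQLDatabase(engine)

# The service context must reach the query engine, otherwise it builds its
# own defaults and goes looking for an OpenAI key.
query_engine = NLSQLTableQueryEngine(
    sql_database=sql_database,
    tables=["users"],
    service_context=service_context,
)

response = query_engine.query("show how many users are there in the database")
print(response)
print(response.metadata["sql_query"])  # the SQL statement the LLM generated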
Thanks, now it just shows "<IPython.core.display.Markdown object>".
It is not loading any model: I put llm = Ollama(model="mistralwwwwww") and got the same message.
Because you have a try/except (it's probably trying to print your error display object).
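As an aside, outside a Jupyter notebook display(Markdown(...)) only prints the object's repr, which is why the real error was hidden. Printing the exception text directly makes it visible, for example:

Plain Text
try:
    response = query_engine.query(prompt)
    print(response)
except Exception as ex:
    # Print the exception itself instead of wrapping it in a Markdown object,
    # so the underlying error (here a refused connection) actually shows up.
    print(f"ERROR: {ex}")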
Now it shows ConnectionRefusedError: [Errno 111] Connection refused. It cannot be the database connection, as I am able to fetch one record, and it worked when using OpenAI.
"Sure! Here's a possible response based on the information provided:
According to the query results, there are 20915 users in the database." πŸ™‚
I forgot to start the Ollama server.
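For anyone hitting the same ConnectionRefusedError: the Ollama wrapper talks to a locally running Ollama server, so the server has to be up and the model pulled before querying. A minimal connectivity check, assuming the wrapper's base_url parameter and its default local address:

Plain Text
from llama_index.llms import Ollama

# Assumes the Ollama server is already running (e.g. started with `ollama serve`)
# and the model has been pulled via `ollama pull llama2`.
llm = Ollama(model="llama2", base_url="http://localhost:11434")
print(llm.complete("ping"))  # quick sanity check before wiring up the SQL engine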
Thanks to everyone who helped, God bless