Upgrade

Hello mates,

I was trying some stuff with llama-index, and it was working well, but then I upgraded the libraries to the latest versions.
My Python version is 3.10.
But llama-index stopped working and started throwing a bunch of errors in the console.
Then I downgraded to the following:

langchain==0.0.174
llama-index==0.6.9

Now it is working well again.

Maybe this solution can help a few people facing the same issue.

But I want to understand why this happened.
Oooo, 0.6.9 is quite old. The library has changed quite a bit since then πŸ˜…
There will be a few key changes
I can help migrate the code if it's not too big
Just a few lines, very basic.
A couple of lines.
Yea paste it in here πŸ’ͺ
import os
import openai
from flask import Flask, jsonify, request, make_response
from llama_index import StorageContext, load_index_from_storage
from flask_cors import CORS, cross_origin

os.environ["OPENAI_API_KEY"] = 'sk-**'

# Load the previously persisted index from ./storage and build a query engine
storage_context = StorageContext.from_defaults(persist_dir='./storage')
indexMASTER = load_index_from_storage(storage_context)
query_engine = indexMASTER.as_query_engine()


app = Flask(__name__)
CORS(app)

@app.route('/', methods=['GET', 'POST'])
@cross_origin()
def test():
    if request.method == 'POST':
        data = request.json
        statement = data['statement']
        context = data['context']
        print(statement)
        # Query the index with the supplied context and statement
        index_results = query_engine.query(f"{context}\n\n{statement}")
        print(index_results)
        index_results = str(index_results)
        return jsonify({'response': {'a': index_results}})
    else:
        return jsonify({'response': 'Nothing here'})

if __name__ == '__main__':
    app.run()
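For reference, once the app is running, the endpoint can be exercised with a small client-side sketch like the one below (it assumes the default local Flask address http://127.0.0.1:5000, that the requests package is installed, and the payload values are made up):

import requests

# Hypothetical payload matching the 'statement' and 'context' keys the route reads
payload = {
    "statement": "What does the document say about pricing?",
    "context": "Answer using only the indexed document.",
}

resp = requests.post("http://127.0.0.1:5000/", json=payload)
print(resp.status_code)
print(resp.json())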
I had a similar issue. Are you using a virtual environment? I had to rebuild the venv from scratch, and I'm also using Python 3.10. Hope you get it sorted.
Ah tbh this looks good. But you can't use the same saved index I think? Or I guess what kind of error are you getting?
So I have uploaded this Flask app to pythonanywhere.com.

When I updated llama-index to the latest version, I started getting a 500 internal server error.

I also have a GoDaddy server, a Namecheap server, and my own reseller hosting server. I have tried installing llama-index on all of these servers, but it shows the error I have attached in this image.

These servers support Python 3.9.
[Attachment: image.png]
That seems like an issue with the server support for numpy libraries?
Are you deploying with a docker image?
What do you mean by the same index, bro?
If you are referring to the saved vector store files, I am using them like that and it works well.
So I just vectorized my document, saved it in the storage folder, and then I just use it.
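If the old persisted index refuses to load after an upgrade, a minimal sketch for re-vectorizing and re-persisting it on a newer llama-index (assuming a 0.8.x release and a hypothetical ./docs folder holding the source document) would be:

from llama_index import SimpleDirectoryReader, VectorStoreIndex

# Hypothetical source folder; re-build the index with the new library version
documents = SimpleDirectoryReader("./docs").load_data()
index = VectorStoreIndex.from_documents(documents)

# Persist again so the existing load_index_from_storage call keeps working
index.storage_context.persist(persist_dir="./storage")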
No, I am just creating a standard Python app using cPanel. That also gives me a virtual environment, so I activated it and then tried installing the libraries from the terminal.
Not sure what to tell you, just seems like a general incompatibility with numpy. Probably best to follow the link the error message is giving πŸ™‚
On pythonanywhere.com, when I upgrade llama-index to the latest version,
I see something like below:

Traceback (most recent call last):
  File "/home/Ahtasham/mysite/app.py", line 4, in <module>
    from llama_index import StorageContext, load_index_from_storage
  File "/home/Ahtasham/.local/lib/python3.10/site-packages/llama_index/__init__.py", line 12, in <module>
    from llama_index.data_structs.struct_type import IndexStructType
  File "/home/Ahtasham/.local/lib/python3.10/site-packages/llama_index/data_structs/__init__.py", line 3, in <module>
    from llama_index.data_structs.data_structs import (
  File "/home/Ahtasham/.local/lib/python3.10/site-packages/llama_index/data_structs/data_structs.py", line 14, in <module>
    from llama_index.schema import BaseNode, TextNode
  File "/home/Ahtasham/.local/lib/python3.10/site-packages/llama_index/schema.py", line 15, in <module>
    from llama_index.bridge.langchain import Document as LCDocument
  File "/home/Ahtasham/.local/lib/python3.10/site-packages/llama_index/bridge/langchain.py", line 1, in <module>
    import langchain
  File "/home/Ahtasham/.local/lib/python3.10/site-packages/langchain/__init__.py", line 6, in <module>
    from langchain.agents import MRKLChain, ReActChain, SelfAskWithSearchChain
  File "/home/Ahtasham/.local/lib/python3.10/site-packages/langchain/agents/__init__.py", line 40, in <module>
    from langchain.agents.agent_toolkits import (
  File "/home/Ahtasham/.local/lib/python3.10/site-packages/langchain/agents/agent_toolkits/__init__.py", line 13, in <module>
    from langchain.agents.agent_toolkits.csv.base import create_csv_agent
  File "/home/Ahtasham/.local/lib/python3.10/site-packages/langchain/agents/agent_toolkits/csv/base.py", line 4, in <module>
    from langchain.agents.agent_toolkits.pandas.base import create_pandas_dataframe_agent
  File "/home/Ahtasham/.local/lib/python3.10/site-packages/langchain/agents/agent_toolkits/pandas/base.py", line 18, in <module>
    from langchain.agents.types import AgentType
  File "/home/Ahtasham/.local/lib/python3.10/site-packages/langchain/agents/types.py", line 5, in <module>
    from langchain.agents.chat.base import ChatAgent
  File "/home/Ahtasham/.local/lib/python3.10/site-packages/langchain/agents/chat/base.py", line 4, in <module>
    from langchain.agents.chat.output_parser import ChatOutputParser
  File "/home/Ahtasham/.local/lib/python3.10/site-packages/langchain/agents/chat/output_parser.py", line 12, in <module>
    class ChatOutputParser(AgentOutputParser):
  File "pydantic/main.py", line 228, in pydantic.main.ModelMetaclass.__new__
  File "pydantic/fields.py", line 488, in pydantic.fields.ModelField.infer
  File "pydantic/fields.py", line 419, in pydantic.fields.ModelField.__init__
  File "pydantic/fields.py", line 539, in pydantic.fields.ModelField.prepare
  File "pydantic/fields.py", line 801, in pydantic.fields.ModelField.populate_validators
  File "pydantic/validators.py", line 723, in find_validators
RuntimeError: no validator found for <class 're.Pattern'>, see arbitrary_types_allowed in Config
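For context, that final RuntimeError is pydantic saying it has no built-in validator for a field of type re.Pattern. A minimal illustration of the same failure, assuming pydantic 1.x and unrelated to langchain's actual models:

import re
from pydantic import BaseModel

try:
    class PatternHolder(BaseModel):
        # On the pydantic 1.9.0 shown later in this thread, defining this class
        # raises the same "no validator found" RuntimeError; the 1.10.x versions
        # reported as working accept it (or you can set arbitrary_types_allowed
        # in Config, as the error message suggests).
        pattern: re.Pattern = re.compile(r"\d+")
    print("pydantic accepted the re.Pattern field")
except RuntimeError as err:
    print("same failure as the traceback:", err)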
I understand. The above error is from Python 3.10; the server is pythonanywhere.com,
and it happens when I update to the latest llama-index.
But the numpy incompatibility error occurs on my GoDaddy, Namecheap, or personal hosting servers.
This error seems maybe related to your langchain version or pydantic version? For reference, locally I have:
llama-index==0.8.13
pydantic==1.10.12
langchain==0.0.277
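If it helps narrow this down, a quick way to print exactly which versions a given environment has installed (a minimal sketch using the standard-library importlib.metadata; the names are the PyPI distribution names) is:

from importlib.metadata import PackageNotFoundError, version

# Compare this output on the working machine and on the failing server
for pkg in ("llama-index", "langchain", "pydantic", "numpy"):
    try:
        print(pkg, version(pkg))
    except PackageNotFoundError:
        print(pkg, "is not installed")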
My local machine has

llama-index 0.8.12
pydantic 1.10.7
langchain 0.0.275
and it is working well
This is on my pythonanywhere server:

langchain 0.0.174
llama-index 0.6.9
pydantic 1.9.0
The code on pythonanywhere only works with this combination of langchain and llama-index.
If I update these two libraries, it breaks.
well, idk man, I'm lost lol

I would recommend deploying somewhere that supports a Docker container; fewer issues with this kind of stuff, and it's easier to debug.
No worries, I will try your recommended solution.
Can you suggest a service I should use that supports Docker containers?
I think the most general choice is AWS? But it's also a little complex. I'm sure some other providers out there may have easier solutions (likely at the cost of flexibility).
Well, great help. I sorted out the issues. It was all because of the compatibility of the different libraries. But yes, I finally got it working.