
Hi, guys! I recently upgraded llama-index from v0.9 to v0.10. When I run "from llama_index.vector_stores.postgres import PGVectorStore", it returns the error "ImportError: cannot import name 'DEFAULT_PERSIST_FNAME' from partially initialized module 'llama_index.core.vector_stores.simple' (most likely due to a circular import)"
I'm not able to reproduce 👀

python -c "from llama_index.vector_stores.postgres import PGVectorStore"

Maybe start with a fresh venv? (It's generally recommended when updating to v0.10.x)
Thank you, this works now. However, I got another error when running "from llama_index.core.query_engine import RetrieverQueryEngine". The error message is "ModuleNotFoundError: No module named 'llama_index.core.image_retriever'"
Did you create a fresh venv? For example, in a fresh terminal

Plain Text
# remove any global install
pip uninstall llama-index llama-index-vector-stores-postgres

# install in a specific venv
python -m venv venv
source venv/bin/activate
pip install llama-index llama-index-vector-stores-postgres
Well, I created a new conda env to install llama-index. However, when I run "pip install llama-index", it fails due to some dependency conflicts:
Plain Text
 ERROR: Cannot install llama-index-cli because these package versions have conflicting dependencies.

The conflict is caused by:
    llama-index-vector-stores-chroma 0.1.5 depends on onnxruntime<2.0.0 and >=1.17.0
    llama-index-vector-stores-chroma 0.1.4 depends on onnxruntime<2.0.0 and >=1.17.0
    llama-index-vector-stores-chroma 0.1.3 depends on onnxruntime<2.0.0 and >=1.17.0
    llama-index-vector-stores-chroma 0.1.2 depends on onnxruntime<2.0.0 and >=1.17.0

To fix this you could try to:
1. loosen the range of package versions you've specified
2. remove package versions to allow pip attempt to solve the dependency conflict
some people have run into this, and I don't really get how/why 😅

I wouldn't use conda
No worries. I just fixed this error by running "conda install onnxruntime". I'm not sure where the other people posted this error, but I hope this will be helpful for them
What version of weaviate is supported in v0.10.x? I get this error: ModuleNotFoundError: No module named 'weaviate.classes'
v4.0 of weaviate
You can install llama-index-vector-stores-weaviate<1.0 if you want 3.0
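The `weaviate.classes` module only exists in the v4 Python client, so the error above usually means the v3 client is still installed. A quick way to check which major version you have, as a hedged sketch (the distribution name `weaviate-client` is the common one on PyPI, but verify it for your setup):

```python
from importlib import metadata


def installed_major(dist: str = "weaviate-client"):
    """Return the installed major version of a distribution, or None if absent."""
    try:
        return int(metadata.version(dist).split(".")[0])
    except metadata.PackageNotFoundError:
        return None


# If this prints a number below 4 (or None), `import weaviate.classes` will fail.
print(installed_major())
```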
Thanks @Logan M. I was able to upgrade weaviate and it works now.
Which API takes in a token counter now? It used to be wrapped in the service context, but I guess now I will have to pass it in individually
@Logan M is it possible to inject callback managers? I looked at the code and it does not look like that is supported right now. I need a way to count tokens in each thread of a Flask application. Is there a code snippet I can take a look at?
Shared/global objects have the drawback of not being thread safe unless locks are implemented, and I don't think the Settings object supports that use case at this point
llm = OpenAI(..., callback_manager=callback_manager)
I was playing around with passing a callback manager to the chat engine, but it looked like it was picking it up from either the service context or Settings
Will try the one you suggested
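For the per-thread counting concern above, one common pattern is to keep a counter in `threading.local` and create it per request, rather than sharing one global object. This is a minimal stdlib-only sketch; the `TokenCounter` class is hypothetical and merely stands in for a token-counting callback handler, it is not llama-index's actual API:

```python
import threading


class TokenCounter:
    """Accumulates token counts for a single request/thread."""

    def __init__(self):
        self.total_tokens = 0

    def on_llm_response(self, prompt_tokens: int, completion_tokens: int):
        self.total_tokens += prompt_tokens + completion_tokens


# One counter per thread, so concurrent Flask requests never share state.
_local = threading.local()


def get_counter() -> TokenCounter:
    if not hasattr(_local, "counter"):
        _local.counter = TokenCounter()
    return _local.counter


def handle_request(prompt_tokens: int, completion_tokens: int) -> int:
    # In a real app this would wrap the LLM call; here we just record counts
    # and return the running total for the current thread.
    counter = get_counter()
    counter.on_llm_response(prompt_tokens, completion_tokens)
    return counter.total_tokens


if __name__ == "__main__":
    results = {}

    def worker(name: str, n: int):
        for _ in range(n):
            handle_request(10, 5)
        results[name] = get_counter().total_tokens

    threads = [
        threading.Thread(target=worker, args=("a", 2)),
        threading.Thread(target=worker, args=("b", 3)),
    ]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    print(results)  # each thread accumulated only its own counts
```

Because each thread lazily creates its own `TokenCounter`, no locking is needed for the totals; you only need synchronization if you aggregate them into one shared report at the end.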