Find answers from the community

Yamil Abraham
Joined September 25, 2024
@Logan M bad news again 🥲

I'm trying to run my Python script with these dependencies:
Plain Text
llama-index-core = "^0.11.0.post1"
llama-index-readers-file = "^0.2.0"
llama-index-readers-s3 = "^0.2.0"
llama-index-embeddings-voyageai = "^0.2.0"
llama-index-vector-stores-milvus = "^0.2.0"
llama-index-multi-modal-llms-anthropic = "^0.2.0"
llama-index-llms-anthropic = "^0.2.0"
pydantic-settings = "^2.4.0"

and at the end of the terminal I get this result:
Plain Text
AttributeError: 'Anthropic' object has no attribute '__pydantic_private__'. Did you mean: '__pydantic_complete__'?

Which version of pydantic-settings should I use?
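For what it's worth, this error usually points at the pydantic version itself rather than pydantic-settings: llama-index 0.11 expects pydantic v2, and a v1 install in the same environment produces exactly this kind of missing-attribute failure. A quick diagnostic sketch (standard PyPI package names, nothing project-specific) to see what the failing interpreter actually has:

```python
from importlib.metadata import version, PackageNotFoundError

def pydantic_stack_versions():
    """Return the versions of the pydantic stack visible to this interpreter."""
    found = {}
    for pkg in ("pydantic", "pydantic-core", "pydantic-settings"):
        try:
            found[pkg] = version(pkg)
        except PackageNotFoundError:
            found[pkg] = None  # not installed in this environment
    return found

# A pydantic 1.x here, next to llama-index 0.11 code that expects v2,
# is a common cause of missing __pydantic_private__ attributes.
print(pydantic_stack_versions())
```

If this reports a 1.x pydantic, upgrading it (rather than changing pydantic-settings) is the first thing to try.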
hey guys! I'm trying to use the Milvus vector store with hybrid retrieval, and I'm having issues with the BGE-M3 embedding model:
Plain Text
pip install FlagEmbedding

I create my vector store:
Plain Text
vector_store = MilvusVectorStore(
    collection_name=app_settings.milvus_collection,
    dim=app_settings.vector_dim,
    overwrite=app_settings.overwrite,
    token=app_settings.milvus_token,
    uri=app_settings.milvus_uri,
    enable_sparse=app_settings.enable_sparse,
    hybrid_ranker=app_settings.hybrid_ranker,
    hybrid_ranker_params=app_settings.hybrid_ranker_params,
)

and then I see this in my terminal:
Plain Text
DEBUG:pymilvus.milvus_client.milvus_client:Created new connection using: d28ca022b4884f4ab606408820a88944
WARNING:llama_index.vector_stores.milvus.base:Sparse embedding function is not provided, using default.
INFO:datasets:PyTorch version 2.4.0 available.
CRITICAL:llama_index.vector_stores.milvus.utils:Cannot import BGEM3FlagModel from FlagEmbedding. It seems it is not installed. Please install it using:
pip install FlagEmbedding


but in fact it is already installed.
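One thing worth ruling out (a sketch, not a definitive diagnosis): pip may have installed FlagEmbedding into a different environment than the one running the script, so the import inside llama-index fails even though `pip install` reported success. This checks what the running interpreter can actually see:

```python
import importlib.util
import sys

# Which interpreter is running, and can it see FlagEmbedding?
# If "importable" prints False, the `pip install FlagEmbedding` that
# succeeded targeted a different environment (e.g. system pip vs. the
# virtualenv the script runs in).
print("interpreter:", sys.executable)

spec = importlib.util.find_spec("FlagEmbedding")
print("FlagEmbedding importable:", spec is not None)
```

Running this with the same interpreter that launches the script (e.g. `python -m` from the project venv) tells you whether the CRITICAL log is an environment mismatch or a genuine import failure.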
hey guys! has anyone hit this same error trying to use VoyageAI Rerank?

it seems that there is a dependency conflict between llama-index-embeddings-voyageai and llama-index-postprocessor-voyageai-rerank. The former requires a version of voyageai less than 0.2.0, while the latter requires a version greater than or equal to 0.2.1.


Plain Text
 ~  pip install llama-index-postprocessor-voyageai-rerank

Installing collected packages: voyageai, llama-index-postprocessor-voyageai-rerank
Successfully installed llama-index-postprocessor-voyageai-rerank-0.1.2 voyageai-0.2.3
Plain Text
 ~  pip install llama-index-embeddings-voyageai

Installing collected packages: voyageai, llama-index-embeddings-voyageai
Attempting uninstall: voyageai
Found existing installation: voyageai 0.2.3
Uninstalling voyageai-0.2.3:
Successfully uninstalled voyageai-0.2.3
ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts.
llama-index-postprocessor-voyageai-rerank 0.1.2 requires voyageai<0.3.0,>=0.2.1, but you have voyageai 0.1.7 which is incompatible.
Successfully installed llama-index-embeddings-voyageai-0.1.4 voyageai-0.1.7
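The two version ranges quoted in the pip output genuinely cannot be satisfied at once, so this isn't an environment problem. A small sketch of the arithmetic, using the constraints exactly as pip reported them:

```python
def parse(v: str) -> tuple:
    """Parse a simple dotted version string into a comparable tuple."""
    return tuple(int(part) for part in v.split("."))

# Constraints from the pip logs above:
#   llama-index-embeddings-voyageai 0.1.4           -> voyageai < 0.2.0
#   llama-index-postprocessor-voyageai-rerank 0.1.2 -> voyageai >= 0.2.1, < 0.3.0
def embeddings_ok(v: str) -> bool:
    return parse(v) < (0, 2, 0)

def rerank_ok(v: str) -> bool:
    return (0, 2, 1) <= parse(v) < (0, 3, 0)

candidates = ["0.1.7", "0.2.0", "0.2.1", "0.2.3"]
satisfies_both = [v for v in candidates if embeddings_ok(v) and rerank_ok(v)]
print(satisfies_both)  # -> [] : no voyageai version satisfies both ranges
```

Since the ranges are disjoint, the only fixes are upgrading one of the two llama-index packages to a release whose voyageai constraint overlaps the other's, or dropping one of them.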