Ollama

Hello, I'm having issues setting up OllamaEmbedding:
Plain Text
from llama_index.core.indices.vector_store.base import VectorStoreIndex
from llama_index.legacy.vector_stores.qdrant import QdrantVectorStore
import qdrant_client
from llama_index.core import Settings
from llama_index.legacy.embeddings.ollama_embedding import OllamaEmbedding
from llama_index.legacy.embeddings import LangchainEmbedding
from langchain.embeddings import OllamaEmbeddings

# Initialize Ollama embedding model
embed_model = OllamaEmbedding(model_name="nomic-embed-text", base_url="http://localhost:11434")

# Set the global embedding model
Settings.embed_model = embed_model


Plain Text
---------------------------------------------------------------------------
AssertionError                            Traceback (most recent call last)
Cell In[48], line 13
     10 embed_model = OllamaEmbedding(model_name="nomic-embed-text", base_url="http://localhost:11434")
     12 # Set the global embedding model
---> 13 Settings.embed_model = embed_model

File ~/Library/Caches/pypoetry/virtualenvs/ollama-env-Rz8XYqBf-py3.12/lib/python3.12/site-packages/llama_index/core/settings.py:74, in _Settings.embed_model(self, embed_model)
     71 @embed_model.setter
     72 def embed_model(self, embed_model: EmbedType) -> None:
     73     """Set the embedding model."""
---> 74     self._embed_model = resolve_embed_model(embed_model)

File ~/Library/Caches/pypoetry/virtualenvs/ollama-env-Rz8XYqBf-py3.12/lib/python3.12/site-packages/llama_index/core/embeddings/utils.py:136, in resolve_embed_model(embed_model, callback_manager)
    133     print("Embeddings have been explicitly disabled. Using MockEmbedding.")
    134     embed_model = MockEmbedding(embed_dim=1)
--> 136 assert isinstance(embed_model, BaseEmbedding)
    138 embed_model.callback_manager = callback_manager or Settings.callback_manager
    140 return embed_model

AssertionError: 
1 comment
Don't mix legacy and non-legacy imports.
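The AssertionError comes from the fact that `llama_index.core` and `llama_index.legacy` are parallel class hierarchies: the legacy `OllamaEmbedding` subclasses the *legacy* `BaseEmbedding`, not the core one, so `assert isinstance(embed_model, BaseEmbedding)` in `resolve_embed_model` fails. A minimal stand-in sketch (plain Python, class names invented to mimic the two packages, not the real llama_index code):

```python
# Stand-in classes mimicking the two parallel package hierarchies.

class CoreBaseEmbedding:        # plays the role of llama_index.core's BaseEmbedding
    pass

class LegacyBaseEmbedding:      # plays the role of llama_index.legacy's BaseEmbedding
    pass

class LegacyOllamaEmbedding(LegacyBaseEmbedding):  # what the legacy import gives you
    pass

class CoreOllamaEmbedding(CoreBaseEmbedding):      # what the non-legacy import gives you
    pass

# The Settings.embed_model setter effectively does:
#     assert isinstance(embed_model, CoreBaseEmbedding)
print(isinstance(LegacyOllamaEmbedding(), CoreBaseEmbedding))  # False -> AssertionError
print(isinstance(CoreOllamaEmbedding(), CoreBaseEmbedding))    # True  -> accepted
```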

Ideally, you'd do

pip install llama-index-vector-stores-qdrant llama-index-embeddings-ollama


Plain Text
from llama_index.embeddings.ollama import OllamaEmbedding
from llama_index.vector_stores.qdrant import QdrantVectorStore 
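Putting it together, a full non-legacy setup might look like this (a sketch: it assumes the two packages above are installed, an Ollama server running at the default port, and an in-memory Qdrant client with an illustrative collection name `"demo"`):

```python
import qdrant_client
from llama_index.core import Settings, VectorStoreIndex
from llama_index.embeddings.ollama import OllamaEmbedding
from llama_index.vector_stores.qdrant import QdrantVectorStore

# Non-legacy embedding model; this subclasses the core BaseEmbedding,
# so the isinstance check in resolve_embed_model passes
embed_model = OllamaEmbedding(
    model_name="nomic-embed-text",
    base_url="http://localhost:11434",
)
Settings.embed_model = embed_model

# ":memory:" keeps Qdrant local; "demo" is an illustrative collection name
client = qdrant_client.QdrantClient(location=":memory:")
vector_store = QdrantVectorStore(client=client, collection_name="demo")
```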