Having some trouble using the VectorIndexAutoRetriever: it figures out the metadata filters correctly, but it doesn't seem to generate an embedding for the query (note the empty query str in the log output below), and I'm struggling to figure out why.
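
For context, here is a minimal sketch of the setup (the documents and metadata descriptions below are simplified placeholders; the actual data in ex1.py is different, but the structure is the same):

from llama_index.core import VectorStoreIndex
from llama_index.core.retrievers import VectorIndexAutoRetriever
from llama_index.core.schema import TextNode
from llama_index.core.vector_stores.types import MetadataInfo, VectorStoreInfo

# Placeholder nodes -- the real documents differ, but each node carries
# a "topic" metadata field like this.
nodes = [
    TextNode(text="Some text about the first topic.", metadata={"topic": "TOPIC0"}),
    TextNode(text="Some text about the second topic.", metadata={"topic": "TOPIC1"}),
]
index = VectorStoreIndex(nodes)

# Describe the metadata so the auto-retriever can infer filters from the query.
vector_store_info = VectorStoreInfo(
    content_info="Short documents, each tagged with a topic label.",
    metadata_info=[
        MetadataInfo(
            name="topic",
            type="str",
            description="Topic label, e.g. TOPIC0",
        ),
    ],
)

retr = VectorIndexAutoRetriever(
    index,
    vector_store_info=vector_store_info,
    similarity_top_k=10,  # matches the "Using top_k: 10" line in the log
)

print(retr.retrieve("What is topic TOPIC0?"))

The log output and traceback:
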
INFO:llama_index.core.indices.vector_store.retrievers.auto_retriever.auto_retriever:Using query str:
INFO:llama_index.core.indices.vector_store.retrievers.auto_retriever.auto_retriever:Using filters: [('topic', '==', 'TOPIC0')]
INFO:llama_index.core.indices.vector_store.retrievers.auto_retriever.auto_retriever:Using top_k: 10
Traceback (most recent call last):
File "ex1.py", line 79, in <module>
print(retr.retrieve("What is topic TOPIC0?"))
File "site-packages/llama_index/core/base/base_retriever.py", line 229, in retrieve
nodes = self._retrieve(query_bundle)
File "site-packages/llama_index/core/base/base_auto_retriever.py", line 37, in _retrieve
return retriever.retrieve(new_query_bundle)
File "site-packages/llama_index/core/base/base_retriever.py", line 229, in retrieve
nodes = self._retrieve(query_bundle)
File "site-packages/llama_index/core/indices/vector_store/retrievers/retriever.py", line 94, in _retrieve
return self._get_nodes_with_embeddings(query_bundle)
File "site-packages/llama_index/core/indices/vector_store/retrievers/retriever.py", line 170, in _get_nodes_with_embeddings
query_result = self._vector_store.query(query, **self._kwargs)
File "site-packages/llama_index/core/vector_stores/simple.py", line 273, in query
top_similarities, top_ids = get_top_k_embeddings(
File "site-packages/llama_index/core/indices/query/embedding_utils.py", line 30, in get_top_k_embeddings
similarity = similarity_fn(query_embedding_np, emb)
File "site-packages/llama_index/core/base/embeddings/base.py", line 47, in similarity
product = np.dot(embedding1, embedding2)
TypeError: unsupported operand type(s) for *: 'NoneType' and 'float'