I am using a custom retriever with a keyword table retriever and a vector index retriever

I am using a custom retriever with a keyword table retriever and a vector index retriever. Previously, I was building the indexes for these retrievers as I used them, but I have switched to indices that are created elsewhere and persisted to storage. I store and load them according to the persisting docs (I assign index IDs and load the indexes by ID), but when I query my retriever query engine I now receive this error when comparing embeddings:
Plain Text
ValueError: shapes (3072,) and (1536,) not aligned: 3072 (dim 0) != 1536 (dim 0). 
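For context, this is the error NumPy raises when it is asked to take a dot product of two vectors with different lengths, which is what happens when the query embedding and the stored embeddings come from different models. A minimal repro (the dimensions are illustrative):
Plain Text
import numpy as np

query_embedding = np.zeros(3072)   # e.g. a text-embedding-3-large vector
stored_embedding = np.zeros(1536)  # e.g. a text-embedding-ada-002 vector

# ValueError: shapes (3072,) and (1536,) not aligned: 3072 (dim 0) != 1536 (dim 0)
np.dot(query_embedding, stored_embedding)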
As 3072 is double 1536, I'm pretty sure it's duplicating something somewhere, but I'm not sure. If anyone knows why this happens, please let me know. Code used to persist and load:
Plain Text
# Build both indexes over the same nodes and persist them to one directory
vector_index = VectorStoreIndex(nodes, storage_context=storage_context)
keyword_index = SimpleKeywordTableIndex(nodes, storage_context=storage_context)
vector_index.set_index_id("slide_vector_index")
keyword_index.set_index_id("slide_keyword_index")
os.makedirs(f"/data/{file_name}/DocStore", exist_ok=True)
vector_index.storage_context.persist(persist_dir=f"/data/{file_name}/DocStore")
keyword_index.storage_context.persist(persist_dir=f"/data/{file_name}/DocStore")
Plain Text
# Rebuild the storage context from the persisted directory, then load by index ID
storage_context = StorageContext.from_defaults(
    docstore=SimpleDocumentStore.from_persist_dir(persist_dir=f"/data/{file_name}/DocStore"),
    vector_store=SimpleVectorStore.from_persist_dir(
        persist_dir=f"/data/{file_name}/DocStore", namespace="default"),
    index_store=SimpleIndexStore.from_persist_dir(persist_dir=f"/data/{file_name}/DocStore"),
)

vector_index = load_index_from_storage(storage_context, index_id="slide_vector_index")
keyword_index = load_index_from_storage(storage_context, index_id="slide_keyword_index")
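One way to confirm what's happening is to compare the dimension of the embedding model used at query time against the dimension of the vectors that were persisted. A minimal sketch, assuming the llama_index OpenAI embedding integration and the default SimpleVectorStore on-disk layout (a default__vector_store.json holding an embedding_dict of node_id -> embedding):
Plain Text
import json
from llama_index.embeddings.openai import OpenAIEmbedding

# Dimension of the model used at query time
embed_model = OpenAIEmbedding(model="text-embedding-3-large")
query_dim = len(embed_model.get_text_embedding("dimension check"))

# Dimension of the vectors that were persisted (assumed file layout)
with open(f"/data/{file_name}/DocStore/default__vector_store.json") as f:
    embedding_dict = json.load(f)["embedding_dict"]
stored_dim = len(next(iter(embedding_dict.values())))

print(query_dim, stored_dim)  # 3072 vs 1536 would confirm a model mismatch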
5 comments
Hmm, have you used different OpenAI models for these? Those look like the embedding dimensions for text-embedding-ada-002 (1536) and text-embedding-3-large (3072). They aren't compatible.
Hmm, let me check. I'm relatively sure I'm only using 3-large, but maybe I messed up somewhere.
@Teemu thanks, I'm pretty sure that's it.
Yeah, if the old ones were embedded with ada-002 it won't work.
@Teemu thanks that fixed it!
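For anyone who hits this later: the fix is to pin the same embedding model both when the indexes are built and when they are loaded. A minimal sketch, assuming the llama_index.core Settings API and the OpenAI embedding integration:
Plain Text
from llama_index.core import Settings, load_index_from_storage
from llama_index.embeddings.openai import OpenAIEmbedding

# Set the embedding model globally BEFORE building or loading any index,
# so persisted vectors and query-time embeddings share a dimension
# (text-embedding-3-large -> 3072 dims, text-embedding-ada-002 -> 1536 dims).
Settings.embed_model = OpenAIEmbedding(model="text-embedding-3-large")

vector_index = load_index_from_storage(storage_context, index_id="slide_vector_index")

If the old indexes really were built with ada-002, re-embedding the nodes (rebuilding the vector index) with the new model is the only way to make them queryable with 3-large.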