The community member is trying to store a summary index in a vector store using ChromaDB, but it is not working. The comments explain that a summary index does not use embeddings, so there are no vectors to store. Instead, the summary index sends every node in the index to the language model. The community members suggest saving the summary index to disk or using a remote document store and index store like Redis or MongoDB. One community member also suggests storing the summaries as metadata in the nodes and then storing those in a vector database. The community members provide guidance on how to implement this approach.
Hello everyone, I am trying to store my summary index in a vector store using the storage context, but it does not seem to work, while using a VectorIndex works perfectly well. I am using ChromaDB for my vector store.
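A minimal sketch of the setup being described, assuming the standard LlamaIndex + ChromaDB integration; the data directory, persist path, and collection name are placeholders. The vector index writes embeddings into Chroma, while the summary index has no embeddings to write:

```python
import chromadb
from llama_index.core import (
    SimpleDirectoryReader,
    StorageContext,
    SummaryIndex,
    VectorStoreIndex,
)
from llama_index.vector_stores.chroma import ChromaVectorStore

documents = SimpleDirectoryReader("./data").load_data()

# Set up ChromaDB as the vector store behind a storage context.
chroma_client = chromadb.PersistentClient(path="./chroma_db")
collection = chroma_client.get_or_create_collection("my_collection")
vector_store = ChromaVectorStore(chroma_collection=collection)
storage_context = StorageContext.from_defaults(vector_store=vector_store)

# This works: embeddings are computed and written into the Chroma collection.
vector_index = VectorStoreIndex.from_documents(documents, storage_context=storage_context)

# This builds, but nothing ends up in Chroma: a SummaryIndex produces no
# embeddings, so there are no vectors for the vector store to hold.
summary_index = SummaryIndex.from_documents(documents, storage_context=storage_context)
```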
That is new information for me. If it does not use embeddings, what does it use? And do you have any suggestion on how to save the summary index? I am making an agent that makes use of both a VectorIndex and a SummaryIndex.
Ahhh, I see, so I need an index store rather than a vector store for a SummaryIndex, is that right? And also, does the summary index use neither an LLM nor embeddings in order to build the index?
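A minimal sketch of persisting a SummaryIndex without a vector store, assuming the documents live in ./data; the persist directory is a placeholder. Building the index only splits documents into nodes (no LLM or embedding calls); the LLM is used later, at query time, when every node is sent to it:

```python
from llama_index.core import (
    SimpleDirectoryReader,
    StorageContext,
    SummaryIndex,
    load_index_from_storage,
)

documents = SimpleDirectoryReader("./data").load_data()

# Construction just chunks documents into nodes and records the index structure.
summary_index = SummaryIndex.from_documents(documents)

# Persist the docstore and index store (node contents + index structure) to disk.
summary_index.storage_context.persist(persist_dir="./summary_storage")

# Later: reload the index without rebuilding it.
storage_context = StorageContext.from_defaults(persist_dir="./summary_storage")
summary_index = load_index_from_storage(storage_context)
```

For a remote setup, the same StorageContext can instead be built from a remote document store and index store (e.g. the Redis or MongoDB integrations mentioned above) rather than a local persist_dir.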
You would insert the parent node's summary into the metadata of each child node, e.g. child_node.metadata["context_summary"] = str(parent_summary).lstrip().lower()
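A hedged sketch expanding that one-liner: generate one summary per source document and copy it into each child node's metadata before building the vector index. The summarization prompt, chunk size, and the "context_summary" key are illustrative choices from this thread, not a fixed LlamaIndex convention:

```python
from llama_index.core import Settings, SimpleDirectoryReader, VectorStoreIndex
from llama_index.core.node_parser import SentenceSplitter

documents = SimpleDirectoryReader("./data").load_data()
splitter = SentenceSplitter(chunk_size=512)

all_nodes = []
for doc in documents:
    # Summarize the parent document with whatever LLM is configured in Settings.
    parent_summary = Settings.llm.complete(
        f"Summarize the following document in a few sentences:\n\n{doc.text}"
    )
    child_nodes = splitter.get_nodes_from_documents([doc])
    for child_node in child_nodes:
        # Attach the parent summary to every child node as metadata.
        child_node.metadata["context_summary"] = str(parent_summary).lstrip().lower()
    all_nodes.extend(child_nodes)

# The nodes, summaries included, then go into the vector store (e.g. ChromaDB) as usual.
vector_index = VectorStoreIndex(all_nodes)
```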