graven raidill
Joined September 25, 2024
I'm curious if llama-index offers tools for creating a summarization app that doesn't necessarily depend on LLMs. I'm specifically interested in vector-based solutions or smaller summarization models.

Handling large texts often involves chunking and overlaps, which can be challenging to implement independently outside of llama-index or LangChain. Additionally, I'm exploring how llama-index implements both extractive and abstractive summarization. Thanks in advance!
2 comments
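The chunk-and-overlap handling mentioned above can be done without either framework. A minimal sliding-window sketch in plain Python (the chunk and overlap sizes are arbitrary illustrative defaults):

```python
def chunk_text(text: str, chunk_size: int = 200, overlap: int = 50) -> list[str]:
    """Split text into fixed-size chunks whose boundaries overlap.

    Each chunk shares `overlap` characters with the previous one, so
    sentences cut at a boundary still appear whole in one chunk.
    """
    if overlap >= chunk_size:
        raise ValueError("overlap must be smaller than chunk_size")
    step = chunk_size - overlap
    # Slide a window across the text; the final chunk may be shorter.
    return [text[i:i + chunk_size] for i in range(0, max(len(text) - overlap, 1), step)]
```

Token-based splitting (counting tokens instead of characters) works the same way, just over a token list instead of a string.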
Hi! I'm working on a RAG app, and my current stack is a Hugging Face LLM, Chroma, and LlamaIndex. What structure would be most practical? I'm thinking of having separate backend logic for the Chroma endpoints, and then a separate IndexManager for all index endpoints.
2 comments
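One way to sketch the split described above; note that `IndexManager` and its methods are hypothetical names for this app's own backend layer, not a LlamaIndex API:

```python
class IndexManager:
    """Hypothetical wrapper owning one index per Chroma collection.

    Keeps the Chroma client in one place, so index endpoints talk
    only to this class and never to the vector store directly.
    """

    def __init__(self, client):
        self.client = client    # e.g. a chromadb.PersistentClient
        self._indexes = {}      # collection name -> cached index object

    def get_index(self, collection_name: str):
        # Cache indexes so repeated endpoint calls reuse them.
        if collection_name not in self._indexes:
            collection = self.client.get_or_create_collection(collection_name)
            self._indexes[collection_name] = self._build_index(collection)
        return self._indexes[collection_name]

    def _build_index(self, collection):
        # In the real app this would wrap the collection in a
        # ChromaVectorStore and return a VectorStoreIndex; stubbed here.
        return collection
```

The Chroma-specific endpoints can then depend on `client` alone, while query endpoints depend on `IndexManager`, which keeps the two concerns separable.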
I stored my index using Chroma:
Plain Text
import chromadb
from llama_index import VectorStoreIndex, StorageContext
from llama_index.vector_stores import ChromaVectorStore

chroma_client = chromadb.PersistentClient(path="chroma")
chroma_collection = chroma_client.get_or_create_collection("cardcom_collection")
vector_store = ChromaVectorStore(chroma_collection=chroma_collection)
storage_context = StorageContext.from_defaults(vector_store=vector_store)
# service_context is defined elsewhere in my code
index = VectorStoreIndex.from_documents(documents, storage_context=storage_context, service_context=service_context)

How do I load it from another module?
2 comments
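For reference, the usual pattern for reloading a Chroma-backed index in another module is `VectorStoreIndex.from_vector_store`, which reconstructs the index object from the already-persisted embeddings instead of re-ingesting documents. An untested sketch against the same ~0.8.x-era llama-index API used in the post above; the path and collection name must match what was used when the index was built:

```python
import chromadb
from llama_index import VectorStoreIndex
from llama_index.vector_stores import ChromaVectorStore

# Reopen the same persisted store that from_documents wrote into.
chroma_client = chromadb.PersistentClient(path="chroma")
chroma_collection = chroma_client.get_or_create_collection("cardcom_collection")
vector_store = ChromaVectorStore(chroma_collection=chroma_collection)

# Rebuild the index from the stored vectors; no documents needed.
index = VectorStoreIndex.from_vector_store(vector_store)
```

Any non-default service/embedding configuration used at build time should be passed again here, since it is not stored in Chroma.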
Hi, suddenly I can no longer use VectorStoreIndex. It's already imported, and my whole code was working before. My llama-index version is 0.8.45.post1; can someone help? This is most likely a package issue, since I had my whole code working yesterday.
7 comments
Hi, do I have to rebuild my index every time I make a change to the collection, like adding a new document? What is the fastest way of doing this?
4 comments
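A full rebuild is generally not needed: `VectorStoreIndex` supports incremental insertion, so only the new document is embedded and written to the store. An untested sketch, assuming `index` is an already-built VectorStoreIndex as in the earlier Chroma snippet:

```python
from llama_index import Document

# Embeds and stores just this document; existing vectors are untouched.
new_doc = Document(text="Newly added content.")
index.insert(new_doc)

# Removal by source-document id is likewise incremental:
# index.delete_ref_doc(new_doc.doc_id)
```

Rebuilding from scratch is only really required when something that affects every vector changes, such as the embedding model or the chunking parameters.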
I notice that using VectorStoreIndex as a retriever gives faster LLM generation compared to using a vector database like Chroma. This is especially noticeable if you are running an open-source LLM like Llama 2. Based on this observation I am inclined to drop Chroma from the pipeline. Can someone change my mind?
2 comments