Hi, when I use the `db.save_local("faiss_index")` function to save my vectors, it deletes my old vectors and stores only the newly created ones. I want to save the new vectors in append mode so that I can use both my old and my new vectors. Can someone assist me with this?

Code (using LangChain):

```python
def StoreNewData():
    all_text_data = extract_text_from_documents_in_directory(UPLOAD_DIR)
    text_splitter = RecursiveCharacterTextSplitter(
        chunk_size=1000,
        chunk_overlap=200,
        length_function=len
    )
    chunks = text_splitter.split_text(text=all_text_data)
    embeddings = OpenAIEmbeddings()
    db = FAISS.from_texts(chunks, embedding=embeddings)
    db.save_local("faiss_index")
```
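To make clear what I mean by "append mode", here is a rough, untested sketch of what I think should work based on my reading of the LangChain docs: load the previously saved index with `FAISS.load_local`, combine it with the new vectors via `merge_from`, and save the result back. `append_chunks_to_index` is my own name, and `extract_text_from_documents_in_directory` / `UPLOAD_DIR` are the helpers from the snippet above.

```python
import os

from langchain.embeddings.openai import OpenAIEmbeddings
from langchain.vectorstores import FAISS

INDEX_DIR = "faiss_index"

def append_chunks_to_index(chunks):
    embeddings = OpenAIEmbeddings()
    new_db = FAISS.from_texts(chunks, embedding=embeddings)
    if os.path.isdir(INDEX_DIR):
        # Load the previously saved index and merge the new vectors into it.
        # (Newer LangChain versions may also require allow_dangerous_deserialization=True here.)
        old_db = FAISS.load_local(INDEX_DIR, embeddings)
        old_db.merge_from(new_db)
        old_db.save_local(INDEX_DIR)  # overwrites the folder, but now with old + new vectors
    else:
        # First run: nothing to merge yet, just save the new index.
        new_db.save_local(INDEX_DIR)
```

Is `merge_from` the intended way to do this, or is there a proper append API I am missing?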
I ran across a similar problem with the following code:

```python
from llama_index import StorageContext, load_index_from_storage

# Recreate the storage context
storage_context = StorageContext.from_defaults(persist_dir='./storage')

# Load the index
index = load_index_from_storage(storage_context)
```
When I load a new document and generate a vector index, it creates a new vector index file, but I want to build the vector index in append mode, so that it keeps the vectors from the previous files and adds the vectors for the new file.
```python
from llama_index import SimpleDirectoryReader, GPTVectorStoreIndex, LLMPredictor, PromptHelper, ServiceContext, StorageContext, load_index_from_storage
from langchain import OpenAI
import os
import openai
import gradio as gr
```
```python
vectorIndex.storage_context.persist(persist_dir='Store')  # issue in this line: it overrides the pre-existing vector data
return vectorIndex
```

I don't want to re-create vectors for all documents when a single new document arrives in my Data directory.
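For context, this is the incremental flow I am after. My understanding is that `load_index_from_storage` plus `index.insert` should append the new document's vectors to the persisted index instead of rebuilding everything, but I am not sure this is correct. A sketch only; `add_document_to_index` is my own name and the `persist_dir='Store'` path matches my code above:

```python
from llama_index import SimpleDirectoryReader, StorageContext, load_index_from_storage

def add_document_to_index(new_file_path, persist_dir='Store'):
    # Rebuild the storage context from the already-persisted index
    storage_context = StorageContext.from_defaults(persist_dir=persist_dir)
    index = load_index_from_storage(storage_context)

    # Read only the newly arrived document, not the whole Data directory
    new_docs = SimpleDirectoryReader(input_files=[new_file_path]).load_data()
    for doc in new_docs:
        index.insert(doc)  # append vectors for the new document to the existing index

    # Persist the updated index back to the same directory
    index.storage_context.persist(persist_dir=persist_dir)
    return index
```

Does `index.insert` work this way, or do I still need to rebuild the whole index when a new file shows up?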