
Updated 2 months ago

Langchain

Hi, when I use the db.save_local("faiss_index") function to save my vectors, it overwrites the old index with the newly created vectors. I want to save the new vectors in append mode so that I can use both the old and the new ones. Can someone assist me with this?
Code (using LangChain):
```python
import sys

from langchain.text_splitter import RecursiveCharacterTextSplitter
from langchain.embeddings import OpenAIEmbeddings
from langchain.vectorstores import FAISS

def StoreNewData():
    # Read all text from the uploaded documents
    all_text_data = extract_text_from_documents_in_directory(UPLOAD_DIR)

    # Split the text into overlapping chunks
    text_splitter = RecursiveCharacterTextSplitter(
        chunk_size=1000,
        chunk_overlap=200,
        length_function=len,
    )
    chunks = text_splitter.split_text(text=all_text_data)

    # Embed the chunks and build a FAISS index from them
    embeddings = OpenAIEmbeddings()
    db = FAISS.from_texts(chunks, embedding=embeddings)

    # NOTE: this overwrites any existing index saved at "faiss_index"
    db.save_local("faiss_index")

StoreNewData()
sys.exit()
```
5 comments
Hmm, yea, not 100% sure how it works for langchain stuff (this is a llama index discord lol)

Maybe there's some kind of insert function? Might have to read some docs or source code
Actually, I am a beginner in Python, and I am confused by my current project, which integrates ChatGPT with my own data and needs features like memory to store the current conversation.
Is it possible to do that with LlamaIndex on a large amount of data?
Yup, this is possible with llama index! I recommend reading some of our docs
https://gpt-index.readthedocs.io/en/latest/getting_started/concepts.html

And then also the agents will help with the conversation stuff
https://gpt-index.readthedocs.io/en/latest/core_modules/agent_modules/agents/root.html