This is probably a stupid question, but I don't know how to solve it. I am using LlamaParse and then trying to upload the nodes into Pinecone DB, but LlamaParse produces too much metadata, so I am getting this error:

vector_store.add(nodes)

"Metadata size is 41191 bytes, which exceeds the limit of 40960 bytes per vector"

What can I do?
3 comments
You can exclude the metadata from the LLM and embed model operations with this:

Plain Text
# Iterate over the documents before inserting them into VectorStoreIndex

for doc in documents:
  # Listing every metadata key here hides all of it from the embed model and LLM
  doc.excluded_embed_metadata_keys = list(doc.metadata.keys())
  doc.excluded_llm_metadata_keys = list(doc.metadata.keys())
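
Note that, as far as I know, the exclusion lists only control what the embed model and LLM see; the full metadata dict still gets stored on the vector. So if Pinecone keeps rejecting the upsert, the more direct fix is to drop the oversized keys from the metadata itself. A rough sketch (which keys are safe to delete depends on your pipeline):

Plain Text
import json

PINECONE_LIMIT = 40960  # bytes of metadata per vector

for doc in documents:
    # Drop the largest metadata values until the serialized dict fits.
    # default=str keeps json.dumps from choking on non-string values.
    while len(json.dumps(doc.metadata, default=str).encode("utf-8")) > PINECONE_LIMIT:
        biggest = max(doc.metadata, key=lambda k: len(str(doc.metadata[k])))
        del doc.metadata[biggest]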


But tbh it is very strange that you have so much metadata that Pinecone is throwing an error πŸ˜…
0.04 MB limit -- tbh I'm surprised Pinecone has that low of a limit lol
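
If you want to see which nodes are over before calling vector_store.add, measuring the serialized metadata is a quick check (a sketch, assuming the nodes carry a plain metadata dict):

Plain Text
import json

LIMIT = 40960  # Pinecone's per-vector metadata cap, in bytes

for node in nodes:
    # LlamaIndex also packs the node content into the stored metadata,
    # so treat this as a lower bound on what Pinecone actually sees.
    size = len(json.dumps(node.metadata, default=str).encode("utf-8"))
    if size > LIMIT:
        print(f"{node.node_id}: {size} bytes of metadata")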