This is probably a stupid question, but I don't know how to solve it. I am using LlamaParse and then trying to upload the nodes into a Pinecone DB, but LlamaParse attaches so much metadata that I get this error:
vector_store.add(nodes)
"Metadata size is 41191 bytes, which exceeds the limit of 40960 bytes per vector"
You can exclude metadata keys from the LLM and embedding model operations like this:
```python
# Iterate over the documents before inserting them into the VectorStoreIndex
for doc in documents:
    # List every metadata key that the embedding model / LLM should not see;
    # leaving these lists empty excludes nothing
    doc.excluded_embed_metadata_keys = list(doc.metadata.keys())
    doc.excluded_llm_metadata_keys = list(doc.metadata.keys())
```
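As far as I know, those excluded-keys lists only change what the embedding model and LLM see, not what gets stored in the vector store, so if the metadata sent to Pinecone is still over the 40960-byte cap, another option is to drop the bulky keys from `node.metadata` before calling `vector_store.add`. A minimal sketch, with placeholder key names you would swap for whatever is actually oversized in your nodes:

```python
# Hypothetical key names; replace them with whatever large metadata
# LlamaParse attached that you do not need at query time
KEYS_TO_DROP = ["table_summary", "parsing_instruction"]

for node in nodes:
    for key in KEYS_TO_DROP:
        node.metadata.pop(key, None)  # ignore nodes that lack the key

vector_store.add(nodes)
```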
But tbh it is very strange that you have this much metadata.
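If you want to see where the bytes are going, here is a quick sketch (assuming `nodes` is the list you are passing to `vector_store.add`) that prints the serialized size of each metadata key on the first node, largest first:

```python
import json

# Rough per-key size check; json.dumps is only a stand-in for however
# Pinecone measures metadata size, so treat the numbers as approximate
sizes = {
    key: len(json.dumps(value, default=str))
    for key, value in nodes[0].metadata.items()
}
for key, size in sorted(sizes.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{key}: ~{size} bytes")
```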