Following this development with interest (though I'm still short on some foundational understanding): if a user has already chunked their data into <4K-token chunks, generated embeddings with the OpenAI embeddings API, and stored them in something like Pinecone for semantic-search matching, what functionality does GPT_Index add on top of this?
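For context, here is a rough sketch of the baseline pipeline I'm describing. The helper names are hypothetical, the chunker uses a crude words-to-tokens heuristic rather than a real tokenizer (actual code would use something like tiktoken), and the embed/upsert steps are shown only as comments since they need live OpenAI and Pinecone clients:

```python
def chunk_text(text, max_tokens=4000, words_per_token=0.75):
    """Split text into chunks under an approximate token budget.

    Uses a rough ~0.75-words-per-token heuristic instead of a real
    tokenizer, so the budget is only approximate.
    """
    max_words = int(max_tokens * words_per_token)  # ~3000 words per chunk
    words = text.split()
    return [" ".join(words[i:i + max_words])
            for i in range(0, len(words), max_words)]

# The remaining steps would then be, roughly (pseudocode, not real client calls):
#   vectors = [openai_embed(chunk) for chunk in chunks]   # OpenAI embeddings API
#   pinecone_index.upsert(zip(ids, vectors))              # store for semantic search
#   matches = pinecone_index.query(openai_embed(query))   # retrieve nearest chunks

chunks = chunk_text("word " * 10000)
print(len(chunks))
```

The question, then, is what GPT_Index layers on top of this retrieve-nearest-chunks loop.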