kind of! llamaindex is a central interface for external data. this could mean we're also a plugin that chatgpt (or any agent) can call, but at the same time we build out integrations with external data stores as well (which is why we have data loaders + integrations with weaviate/pinecone)
Hey @oguntadej , thanks for flagging this. We're working on making the UX better - in the meantime, try defining a pinecone vector store object: https://github.com/jerryjliu/gpt_index/blob/main/gpt_index/vector_stores/pinecone.py, and then passing it in as the vector_store argument when you initialize GPTPineconeIndex, e.g. index = GPTPineconeIndex(documents, ..., vector_store=vector_store)
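in case a rough sketch helps - treat this as pseudocode, since the exact class name and constructor arguments are assumptions on my part (double-check them against the linked pinecone.py file):

```
# pseudocode sketch -- PineconeVectorStore and its pinecone_index kwarg
# are assumptions here; verify against gpt_index/vector_stores/pinecone.py
pinecone_index = pinecone.Index("my-index")  # your existing pinecone index
vector_store = PineconeVectorStore(pinecone_index=pinecone_index)

# pass the vector store in when building the index, then query as usual
index = GPTPineconeIndex(documents, vector_store=vector_store)
response = index.query("your question here")
```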
Hey everyone! If you've 1) contributed a PR to llamaindex/gpt-index or llamahub, 2) helped out in the channels, or 3) built an app, and you don't already have an assigned role in the Discord, let me know and I can assign you a Discord role!
We manually put together an initial list, but we know there are more of you out there!
yes they do! i just realized it's not reflected in the api docs: https://gpt-index.readthedocs.io/en/latest/reference/indices/vector_store_query.html - i will put out a fix. we have a default refine prompt under the hood; if you want to customize it in the meantime, use GPTSimpleVectorIndexQuery as a reference (e.g. do index.query(..., refine_template=custom_refine_template))
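rough pseudocode sketch of the pattern - the prompt class name and template variables here are assumptions (check the prompts module and the default refine prompt for the real ones):

```
# pseudocode sketch -- RefinePrompt and the template variable names
# ({query_str}, {existing_answer}, {context_msg}) are assumptions;
# verify against the default refine prompt in gpt_index
custom_refine_template = RefinePrompt(
    "The original question is: {query_str}\n"
    "We have an existing answer: {existing_answer}\n"
    "Refine the existing answer using this new context: {context_msg}\n"
)

# pass it per-query, as in the message above
response = index.query("your question", refine_template=custom_refine_template)
```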