Find answers from the community

jerryjliu0
Joined September 25, 2024
Kind of! LlamaIndex is a central interface for external data, but this could also mean we're a plugin that ChatGPT (or any agent) can call. At the same time, we build out integrations with external data stores as well (which is why we have data loaders + integrations with Weaviate/Pinecone)
2 comments

jerryjliu0 · flowllama

This is cool! Could be interesting to do something native for llamaindex if anyone is interested 🙂
3 comments

jerryjliu0 · Recording

Hey everyone! Thanks to those who joined the fireside chat - it was a good discussion!

I accidentally messed up the recording 😅, but I'll type up a summary of notes to share with the group
4 comments
just a heads up, we're taking a look at this soon! πŸ™‚
3 comments
Hey @oguntadej , thanks for flagging this. We're working on making the UX better - in the meantime, try defining a Pinecone vector store object (https://github.com/jerryjliu/gpt_index/blob/main/gpt_index/vector_stores/pinecone.py) and then passing it in as the vector_store argument when you initialize GPTPineconeIndex, e.g. index = GPTPineconeIndex(documents, ..., vector_store=vector_store)
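Putting that advice together, a rough sketch (the PineconeVectorStore import path is an assumption based on the file linked above, and running this requires a live Pinecone index plus API keys):

```python
import pinecone
from gpt_index import GPTPineconeIndex, SimpleDirectoryReader
# Assumed import path, based on the source file linked above
from gpt_index.vector_stores.pinecone import PineconeVectorStore

# Connect to an existing Pinecone index (key/environment are placeholders)
pinecone.init(api_key="YOUR_API_KEY", environment="us-west1-gcp")
pinecone_index = pinecone.Index("my-index")

# Wrap it in a vector store object and pass it to the index constructor
vector_store = PineconeVectorStore(pinecone_index=pinecone_index)
documents = SimpleDirectoryReader("data").load_data()
index = GPTPineconeIndex(documents, vector_store=vector_store)
```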
7 comments
Oh weird. @Krrish@LiteLLM.ai I thought this was working, but this may be a bug on our end. Let me take a look asap
2 comments
Hey everyone! If you've 1) contributed a PR to llamaindex/gpt-index or llamahub, 2) helped out in the channels, or 3) built an app, and you don't already have an assigned role in the Discord, let me know and I can add you to a Discord role! 👑

Currently we manually put together an initial list but we know there's more of you out there!
3 comments
we chatted offline, but FYI to anyone else facing this issue: try pip install python-magic-bin==0.4.14
6 comments
Define a custom PromptHelper and set max_chunk_overlap=0 (you can see an example here: https://gpt-index.readthedocs.io/en/latest/how_to/custom_llms.html)
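A minimal sketch of that suggestion, following the linked custom-LLM docs (the positional arguments are max_input_size, num_output, max_chunk_overlap; the first two values here are illustrative):

```python
from gpt_index import GPTSimpleVectorIndex, PromptHelper, SimpleDirectoryReader

# max_input_size=4096, num_output=256, max_chunk_overlap=0
prompt_helper = PromptHelper(4096, 256, 0)

documents = SimpleDirectoryReader("data").load_data()
index = GPTSimpleVectorIndex(documents, prompt_helper=prompt_helper)
```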
4 comments
Yes they do! I just realized it's not reflected in the API docs: https://gpt-index.readthedocs.io/en/latest/reference/indices/vector_store_query.html - I will put out a fix. We have a default refine prompt under the hood; if you want to customize it in the meantime, use GPTSimpleVectorIndexQuery as a reference (e.g. do index.query(..., refine_template=custom_refine_template))
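For example, a sketch of a custom refine template (the RefinePrompt import path and the template variables {query_str}, {existing_answer}, {context_msg} are assumptions mirroring the default refine prompt; check GPTSimpleVectorIndexQuery for the exact contract):

```python
from gpt_index.prompts.prompts import RefinePrompt  # assumed import path

# Template variables assumed to match the default refine prompt
CUSTOM_REFINE_TMPL = (
    "The original question is: {query_str}\n"
    "We have an existing answer: {existing_answer}\n"
    "Refine the existing answer (only if needed) with the context below.\n"
    "------------\n"
    "{context_msg}\n"
    "------------\n"
)
custom_refine_template = RefinePrompt(CUSTOM_REFINE_TMPL)

# `index` is a previously built vector index, e.g. GPTSimpleVectorIndex
response = index.query(
    "What did the author work on?",
    refine_template=custom_refine_template,
)
```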
5 comments
Try upping similarity_top_k during the query call (by default it’s 1)
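A quick sketch of that, assuming a simple vector index built over local documents (requires an OpenAI API key to actually run):

```python
from gpt_index import GPTSimpleVectorIndex, SimpleDirectoryReader

documents = SimpleDirectoryReader("data").load_data()
index = GPTSimpleVectorIndex(documents)

# Default is similarity_top_k=1; raising it retrieves more similar chunks
# per query, giving the LLM more context to synthesize an answer from.
response = index.query(
    "What does the document say about X?",
    similarity_top_k=3,
)
print(response)
```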
5 comments