
@jerryjliu0 have you thought at all before about caching queries?

I have a cool PoC for semantic query caching via Pinecone (it could use the vector index instead) right now, and I feel like there might be a place in gpt_index to slot this in, as opposed to shipping an external library.
i've been thinking about it! what did you have in mind more specifically? like be able to re-use the exact same query? or also make use of related queries?
Well, right now if the query has a cosine similarity of >0.9 it uses a cached answer for a previously answered question
Basically for a Q/A app, no need to answer similar questions twice via LLM calls
Just cache and hit the cache by comparing incoming embeddings with a vector store
So, related queries, not exact same
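The lookup described above can be sketched roughly as follows. This is a minimal, hypothetical illustration of the idea (plain Python with an in-memory list, not gpt_index's or Pinecone's actual API): incoming query embeddings are compared against stored ones by cosine similarity, and anything above the threshold returns the cached answer instead of triggering a new LLM call.

```python
import math

class SemanticCache:
    """Toy semantic cache: nearest stored embedding wins if similar enough."""

    def __init__(self, threshold=0.9):
        self.threshold = threshold
        self.entries = []  # list of (embedding, answer) pairs

    @staticmethod
    def _cosine(a, b):
        dot = sum(x * y for x, y in zip(a, b))
        norm_a = math.sqrt(sum(x * x for x in a))
        norm_b = math.sqrt(sum(x * x for x in b))
        return dot / (norm_a * norm_b)

    def store(self, embedding, answer):
        self.entries.append((embedding, answer))

    def lookup(self, embedding):
        # Return the best-matching cached answer above the threshold, else None.
        best_answer, best_sim = None, 0.0
        for emb, answer in self.entries:
            sim = self._cosine(emb, embedding)
            if sim > best_sim:
                best_answer, best_sim = answer, sim
        return best_answer if best_sim > self.threshold else None
```

In practice the linear scan would be replaced by a vector store (Pinecone, or a SimpleVectorIndex), but the hit/miss logic is the same.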
And I am going to seed the cache with high-quality answers via the hypothetical Q/A mechanism I mentioned previously, so that the majority of questions are answered well, because vanilla gpt_index queries are lackluster on the dataset I am using (mostly because I haven't optimized them).
There's also like 3-4 different queries I want to cache (only caching 1 right now), so was going to implement as a function decorator.
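The decorator idea could look something like the sketch below (hypothetical names throughout: `lookup`, `store`, and `embed_fn` are whatever cache and embedding hooks you supply; nothing here is an existing gpt_index API). Any wrapped query function first checks the cache and only pays for the LLM call on a miss.

```python
import functools

def semantically_cached(lookup, store, embed_fn):
    """Decorator factory: lookup(emb) returns a cached answer or None;
    store(emb, answer) records a new one; embed_fn(query) embeds the query."""
    def decorator(fn):
        @functools.wraps(fn)
        def wrapper(query):
            emb = embed_fn(query)
            cached = lookup(emb)
            if cached is not None:
                return cached  # cache hit: skip the LLM call entirely
            answer = fn(query)  # cache miss: run the expensive query
            store(emb, answer)
            return answer
        return wrapper
    return decorator
```

Since the cache hooks are passed in, the same decorator can front several different query types, each with its own cache.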
thanks. it seems like this cache is more about approximate similarity than exact similarity. i was thinking about something similar w.r.t a query cache of previous questions/answers. i'm assuming you'd want the ability to manually "seed" the cache though?
Yeah, exact similarity is nice (and easy, redis will do that), but humans don't always ask the exact same questions
And if the answer will be the same anyways, the idea here is to not answer it again
And yes, useful feature to be able to seed, though I guess "seeding" here could just be creating a new SimpleVectorIndex at runtime with documents (or loading it from disk)
I also incorporated a human feedback element, such that negatively-scored cached answers get busted and result in fresh answers.
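The feedback-busting element might be sketched like this (again a hypothetical, minimal illustration, not the actual implementation): each cached entry can receive a score, and a negative score evicts it so the next similar query goes back to the LLM.

```python
class FeedbackCache:
    """Toy cache whose entries can be busted by negative human feedback."""

    def __init__(self):
        self.entries = {}  # key -> cached answer

    def store(self, key, answer):
        self.entries[key] = answer

    def lookup(self, key):
        return self.entries.get(key)

    def feedback(self, key, score):
        # A negative score busts the entry; the next matching
        # query misses the cache and gets a freshly generated answer.
        if score < 0:
            self.entries.pop(key, None)
```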