


Hi everyone, where is a good tutorial on how to use a docstore + vector DB and an LLM to generate reports instead of a chatbot? Each week, we gather documents from various sources and insert them into our ChromaDB. Now I want to create a report that describes/summarizes/elevates the posts that came in over the week. So I'd like to have the LLM go over the posts for the week and generate a summary, and then I want to output the top 5 or 10 posts for the week. Is this where something like a Pydantic data model is used? Do I query Chroma for the posts of the week and then pass those document IDs to the LLM, or does the LLM do that? Any wisdom or recommendations are greatly appreciated! Thanks!
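Roughly, here is what I imagine the flow looking like — a minimal, dependency-free sketch (all names are hypothetical, stdlib only). The report shape is a dataclass; a Pydantic model would look the same with `BaseModel`. The summary and the per-post scores are stubbed where the LLM would actually run:

```python
from dataclasses import dataclass, field
from datetime import date, timedelta

# Hypothetical post record as it might come back from a Chroma query.
@dataclass
class Post:
    doc_id: str
    text: str
    published: date
    score: float = 0.0  # relevance/importance, assigned later by the LLM

# Structured report shape. With Pydantic this would be a BaseModel,
# which also lets you ask the LLM for JSON matching the schema.
@dataclass
class WeeklyReport:
    week_start: date
    summary: str
    top_posts: list = field(default_factory=list)

def posts_for_week(posts, week_start):
    """Filter client-side by date; with Chroma you would instead pass a
    `where` filter on a timestamp metadata field to the collection query."""
    week_end = week_start + timedelta(days=7)
    return [p for p in posts if week_start <= p.published < week_end]

def build_report(posts, week_start, n_top=5):
    weekly = posts_for_week(posts, week_start)
    # In the real pipeline the summary and the scores come from the LLM;
    # here they are stubbed so the flow is runnable.
    summary = f"{len(weekly)} posts this week."
    top = sorted(weekly, key=lambda p: p.score, reverse=True)[:n_top]
    return WeeklyReport(week_start=week_start, summary=summary, top_posts=top)
```

That is, filter to the week first, then hand only that slice to the LLM for summarizing and ranking — is that the right division of labor?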
3 comments
Sounds like a general approach could be:

  • Generate a summary of each document and store the summaries in their own index.
  • Create a Chroma document for each post in some pipeline, and create an index just for posts.
  • Use these two indexes to build a parent index over both of them.
Use the QueryToolMetadata class with an agent to figure out which tool corresponds to the right index. Some prompt engineering here can help refine the results returned by each one.
Leverage response.source_nodes to get the actual TextNode representation of the posts (whatever you stored in ChromaDB), and you can filter on the returned node fields.
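Schematically, the routing step looks like this. This is a library-free sketch with stand-in names — in the real system the pieces above (QueryToolMetadata, the agent, response.source_nodes) do this work, and the word-overlap selector below is just a stub for the LLM's tool choice:

```python
from dataclasses import dataclass

@dataclass
class SourceNode:
    text: str
    metadata: dict

@dataclass
class Response:
    answer: str
    source_nodes: list

class Tool:
    def __init__(self, name, description, nodes):
        self.name = name
        self.description = description  # the prompt engineering lives here
        self.nodes = nodes

    def query(self, question):
        # Stand-in for a real query-engine call against one index.
        return Response(f"[{self.name}] {question}", self.nodes)

def select_tool(tools, question):
    """Crude selector: pick the tool whose description shares the most
    words with the question. An LLM agent does this step for real."""
    def overlap(tool):
        return len(set(tool.description.lower().split())
                   & set(question.lower().split()))
    return max(tools, key=overlap)

summary_tool = Tool("summaries", "weekly summary of all documents",
                    [SourceNode("This week in review...", {"kind": "summary"})])
posts_tool = Tool("posts", "individual posts ranked by week",
                  [SourceNode("Post A", {"kind": "post"}),
                   SourceNode("Post B", {"kind": "post"})])

# Filter the returned nodes on their metadata fields:
top = [n for n in posts_tool.query("top posts").source_nodes
       if n.metadata.get("kind") == "post"]
```

The point of the two descriptions is exactly the prompt engineering mentioned above: the agent can only route well if each tool says precisely what its index is for.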
Thank you @cmagorian. I've never heard of creating an index of just summaries, and I haven't learned about the concept of creating a 'parent' index from other indices. Do you have any reading material that explains these concepts more thoroughly? I don't yet understand how this scheme solves the problem. Looking forward to any details! Cheers,