Building a RAG App with Milvus DB and Ollama's LLMs
At a glance
The community member is building a RAG (Retrieval-Augmented Generation) app using Milvus DB and Ollama's LLMs. They have APIs for uploading PDFs and indexing them in Milvus, and for querying the database, and they want to add chat history functionality. In the comments, another community member suggests the Chat Engine functionality described in the LlamaIndex documentation, which can retain chat memory. Since the original poster indexes with Milvus, they are advised to create an index using the Milvus vector store and then build a chat engine on top of it, following the instructions in the shared link.
I am making a RAG app using Milvus DB and Ollama's LLMs. I have APIs for these functions: upload (takes a PDF and indexes it in Milvus) and /query (search from a particular DB). I want to include chat history functionality as well. How can I achieve this?
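One simple way to picture the chat-history piece, independent of any framework: keep a per-session list of (role, message) turns and fold it into the prompt alongside the retrieved chunks on every /query call. The sketch below is purely illustrative; `generate` is a hypothetical stand-in for the real Ollama LLM call, not part of the poster's code.

```python
def generate(prompt: str) -> str:
    # Hypothetical placeholder for the real LLM call (e.g. Ollama's API).
    return f"(model reply to {len(prompt)} chars of prompt)"

def build_prompt(history, context: str, question: str) -> str:
    """Fold prior turns and Milvus-retrieved chunks into one prompt."""
    turns = "\n".join(f"{role}: {text}" for role, text in history)
    return (
        f"Conversation so far:\n{turns}\n\n"
        f"Context from the database:\n{context}\n\n"
        f"User: {question}\nAssistant:"
    )

def chat(history, context: str, question: str) -> str:
    prompt = build_prompt(history, context, question)
    answer = generate(prompt)
    # Record both sides so the next query sees this exchange.
    history.append(("User", question))
    history.append(("Assistant", answer))
    return answer
```

A chat-engine abstraction like the one suggested in the comments does essentially this bookkeeping for you, with memory management and question condensing built in.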
I am using different indexing, like I said: in my case I store my document indexes in the Milvus database, and I search by making an embedding of the question and querying Milvus, which gives me back chunks of text.
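The retrieval step described here can be sketched in a few lines: embed the question, rank stored chunk vectors by cosine similarity, return the best matches. The toy `embed` below (a bag-of-characters count, purely illustrative) stands in for the real embedding model, and the in-memory ranking stands in for Milvus's server-side vector search.

```python
import math

def embed(text: str) -> list:
    # Toy bag-of-characters embedding; a stand-in for a real model.
    vec = [0.0] * 26
    for ch in text.lower():
        if ch.isalpha():
            vec[ord(ch) - ord("a")] += 1.0
    return vec

def cosine(a, b) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

def search(chunks, question: str, top_k: int = 2):
    """Return the top_k chunks most similar to the question's embedding,
    mimicking what a vector DB does server-side."""
    q = embed(question)
    ranked = sorted(chunks, key=lambda c: cosine(embed(c), q), reverse=True)
    return ranked[:top_k]
```

In the real app, `embed` would be the same model used at upload time, and `search` would be a Milvus similarity query against the stored index rather than an in-memory sort.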