Building a RAG App with Milvus DB and Ollama's LLMs

I am building a RAG app using Milvus DB and Ollama's LLMs. I have APIs for these functions: upload (takes a PDF and indexes it in Milvus) and /query (searches a particular DB). I also want to include chat history functionality. How can I achieve this?
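
Roughly, the setup looks like this (a minimal sketch assuming FastAPI and the LlamaIndex Milvus/Ollama integrations; the model names, URI, and dim are placeholders):

```python
from fastapi import FastAPI, UploadFile
from llama_index.core import Settings, SimpleDirectoryReader, StorageContext, VectorStoreIndex
from llama_index.embeddings.ollama import OllamaEmbedding
from llama_index.llms.ollama import Ollama
from llama_index.vector_stores.milvus import MilvusVectorStore

app = FastAPI()

# Local Ollama models for generation and embeddings (names are placeholders)
Settings.llm = Ollama(model="llama3")
Settings.embed_model = OllamaEmbedding(model_name="nomic-embed-text")

# Milvus as the vector store; dim must match the embedding model
vector_store = MilvusVectorStore(uri="http://localhost:19530", dim=768)

@app.post("/upload")
async def upload(file: UploadFile):
    # Save the PDF, parse it, and index its chunks into Milvus
    path = f"/tmp/{file.filename}"
    with open(path, "wb") as f:
        f.write(await file.read())
    docs = SimpleDirectoryReader(input_files=[path]).load_data()
    storage_context = StorageContext.from_defaults(vector_store=vector_store)
    VectorStoreIndex.from_documents(docs, storage_context=storage_context)
    return {"status": "indexed"}

@app.post("/query")
async def query(question: str):
    # Embed the question and return the matching chunks from Milvus
    index = VectorStoreIndex.from_vector_store(vector_store)
    nodes = index.as_retriever(similarity_top_k=3).retrieve(question)
    return {"chunks": [n.get_content() for n in nodes]}
```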
You need to check out the Chat Engine functionality documented here: https://docs.llamaindex.ai/en/stable/module_guides/deploying/chat_engines/modules/

You can pick any of these modules; all of them help you retain chat memory.
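
For example, the condense_question module keeps a memory buffer of the conversation and rewrites each follow-up into a standalone query before hitting the index. A minimal sketch (the Ollama model names and token limit are just placeholder choices):

```python
from llama_index.core import Document, Settings, VectorStoreIndex
from llama_index.core.memory import ChatMemoryBuffer
from llama_index.embeddings.ollama import OllamaEmbedding
from llama_index.llms.ollama import Ollama

# Local Ollama models for generation and embeddings (names are placeholders)
Settings.llm = Ollama(model="llama3")
Settings.embed_model = OllamaEmbedding(model_name="nomic-embed-text")

# Any index works here; a tiny in-memory one keeps the demo self-contained
index = VectorStoreIndex.from_documents([Document(text="The basic plan costs $10/month.")])

# The memory buffer retains the chat history between turns
memory = ChatMemoryBuffer.from_defaults(token_limit=3000)
chat_engine = index.as_chat_engine(chat_mode="condense_question", memory=memory)

print(chat_engine.chat("How much does the basic plan cost?"))
print(chat_engine.chat("Is that billed yearly?"))  # resolved using the first turn
```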
I am using different indexing, like I said: in my case I am storing my document indexes in a Milvus database.
I search it by embedding the question and querying Milvus, which gives me chunks of text.
But in the examples you linked, they use llama_index's own index as the document store.
You can customize it. You can create the index using the Milvus vector store: https://docs.llamaindex.ai/en/stable/examples/vector_stores/MilvusIndexDemo/

and then create a chat engine following the first link I shared.
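
Concretely, something along these lines should work: point VectorStoreIndex.from_vector_store at the collection your upload endpoint already populates, then wrap it in a chat engine. A sketch, assuming the Ollama integrations; the URI, dim, and model names are placeholders to replace with your own:

```python
from llama_index.core import Settings, VectorStoreIndex
from llama_index.core.memory import ChatMemoryBuffer
from llama_index.embeddings.ollama import OllamaEmbedding
from llama_index.llms.ollama import Ollama
from llama_index.vector_stores.milvus import MilvusVectorStore

Settings.llm = Ollama(model="llama3")
Settings.embed_model = OllamaEmbedding(model_name="nomic-embed-text")

# Wrap the existing Milvus collection -- no re-indexing needed
# (dim must match the embedding model you indexed with)
vector_store = MilvusVectorStore(uri="http://localhost:19530", dim=768)
index = VectorStoreIndex.from_vector_store(vector_store)

# Retrieval still hits Milvus; the chat engine layers history on top
chat_engine = index.as_chat_engine(
    chat_mode="condense_plus_context",
    memory=ChatMemoryBuffer.from_defaults(token_limit=3000),
)

print(chat_engine.chat("What are the key points of the uploaded document?"))
print(chat_engine.chat("Can you expand on the second one?"))  # uses chat history
```

For a multi-user API, you would keep one ChatMemoryBuffer (or chat engine) per session or user rather than a single global one.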