Hello everyone,

I have a question. I am building a chatbot over my own documents and I am using chat_engine. I am getting answers, but each answer is generated from only 2 source nodes. Is it possible to fetch more sources than 2?
You can adjust the number of retrieved source nodes with similarity_top_k:

Plain Text
chat_engine = index.as_chat_engine(similarity_top_k=5, service_context=service_context)
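A minimal sketch of how you might check the change, assuming `index` and `service_context` already exist (built with the legacy llama_index API that uses ServiceContext); the question text is just a placeholder:

Plain Text
chat_engine = index.as_chat_engine(similarity_top_k=5, service_context=service_context)
response = chat_engine.chat("What does the document say about X?")  # placeholder question
# The response should now carry up to 5 retrieved source nodes
for source in response.source_nodes:
    print(source.node.get_text()[:200], source.score)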
Thank you so much 😊 I am just a new user of LlamaIndex. I have a general question as well. I am using the chat engine now, but if I use a query engine instead of a chat engine, will it have memory like the chat engine does? Or is it possible to add memory to a query engine?
It won't have memory by default. You can add it in a few ways, such as sending each previous message along with the new one, but the chat engine is an easy way to get the chat functionality: https://gpt-index.readthedocs.io/en/latest/core_modules/query_modules/chat_engines/root.html
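A rough sketch of both options, assuming the legacy llama_index API from the link above; ChatMemoryBuffer, chat_mode="context", and the token limit shown are just one possible configuration, not the only way to do it:

Plain Text
from llama_index.memory import ChatMemoryBuffer

# Option 1: a query engine is stateless, so you keep the history yourself
# and prepend it to each new question.
query_engine = index.as_query_engine(similarity_top_k=5, service_context=service_context)
history = []  # list of (question, answer) pairs you maintain
question = "And what about chapter 2?"  # placeholder follow-up question
context = "\n".join(f"User: {q}\nAssistant: {a}" for q, a in history)
response = query_engine.query(f"{context}\nUser: {question}")
history.append((question, str(response)))

# Option 2: let a chat engine manage a memory buffer for you.
memory = ChatMemoryBuffer.from_defaults(token_limit=1500)
chat_engine = index.as_chat_engine(
    chat_mode="context",
    memory=memory,
    similarity_top_k=5,
    service_context=service_context,
)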
Okay, thank you