
Hello, is there any advantage to Condense Plus Context versus Condense Question mode for standalone question generation?

https://docs.llamaindex.ai/en/stable/examples/chat_engine/chat_engine_condense_question/
https://docs.llamaindex.ai/en/stable/examples/chat_engine/chat_engine_condense_plus_context/

I'm using this chat engine with a knowledge base.
3 comments
Condense Plus Context means it will be able to answer questions that fall outside the knowledge base you have provided; it will use the LLM's own knowledge to answer as well.

Condense Question, on the other hand, will try to restrict itself to answering from the knowledge base.
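For illustration, here is a minimal sketch of wiring up both modes over the same index (assuming the `llama_index.core` imports of recent versions; the `"data"` directory and the sample question are placeholders):

```python
from llama_index.core import SimpleDirectoryReader, VectorStoreIndex

# Build an index over your knowledge base (placeholder directory).
documents = SimpleDirectoryReader("data").load_data()
index = VectorStoreIndex.from_documents(documents)

# Condense Question: condenses the chat history into a standalone question,
# then answers strictly through the query engine over the index.
condense_engine = index.as_chat_engine(
    chat_mode="condense_question",
    verbose=True,  # prints the generated standalone question
)

# Condense Plus Context: also condenses the history, but then retrieves
# context and answers in a chat setting, so the LLM may draw on its own
# knowledge in addition to the retrieved text.
condense_plus_context_engine = index.as_chat_engine(
    chat_mode="condense_plus_context",
    verbose=True,
)

print(condense_engine.chat("What does the knowledge base say about X?"))
```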
So if I want the standalone question generation to use only the knowledge base, should I use Condense Question? What isn't clear to me is whether the standalone question generation is the same in both modes or different. I don't want answers from outside the knowledge base, but I've noticed that the two chat modes generate different questions.
Yeah, for Condense Question the first LLM call reads back the previous messages and generates a more suitable standalone query, which is then run against your dataset.
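If you want to see or shape that first call, the condense prompt can be customized, along the lines of the linked condense_question docs. A sketch, reusing the `index` built above; the prompt wording here is just an example:

```python
from llama_index.core import PromptTemplate
from llama_index.core.chat_engine import CondenseQuestionChatEngine

# Example condense prompt: rewrite the follow-up into a standalone question.
custom_prompt = PromptTemplate(
    """Given the following conversation and a follow-up message, rephrase the
follow-up into a standalone question that can be answered from the knowledge base.

Chat History:
{chat_history}

Follow Up Message:
{question}

Standalone question:"""
)

chat_engine = CondenseQuestionChatEngine.from_defaults(
    query_engine=index.as_query_engine(),  # index from the sketch above
    condense_question_prompt=custom_prompt,
    verbose=True,  # logs the condensed standalone question
)
```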