Hi, I have a curious bug. I have set up LlamaIndex with Qdrant to read documents. My code is pretty simple, but the LLM's answer is always just the context, and it won't answer any other question. I don't have the problem without the Qdrant vector store.
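For reference, my setup looks roughly like this (a minimal sketch, assuming llama-index 0.10+ with the llama-index-vector-stores-qdrant package; the data path, Qdrant URL, and collection name are placeholders, not my exact values):
Python
import qdrant_client
from llama_index.core import VectorStoreIndex, SimpleDirectoryReader, StorageContext
from llama_index.vector_stores.qdrant import QdrantVectorStore

# Load documents from a local folder (placeholder path)
documents = SimpleDirectoryReader("./data").load_data()

# Connect to a local Qdrant instance and point the vector store at a collection
client = qdrant_client.QdrantClient(url="http://localhost:6333")
vector_store = QdrantVectorStore(client=client, collection_name="documents")

# Build the index on top of the Qdrant-backed storage context
storage_context = StorageContext.from_defaults(vector_store=vector_store)
index = VectorStoreIndex.from_documents(documents, storage_context=storage_context)

# Query the index
query_engine = index.as_query_engine()
response = query_engine.query("What is this document about?")
print(response)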
Are you using OpenAI? That behaviour is typically caused by the underlying prompt template:
Plain Text
qa_prompt_str = (
"Context information is below.\n"
"---------------------\n"
"{context_str}\n"
"---------------------\n"
"Given the context information and not prior knowledge, "
"answer the question: {query_str}\n"
)
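The template tells the model to answer only from the retrieved context, so if retrieval pulls in loosely related chunks you tend to get the context echoed back. If you want to change those instructions, you can override the template on the query engine. A sketch, assuming a recent llama-index version where PromptTemplate and the text_qa_template argument are available (the custom wording below is just an illustration):
Python
from llama_index.core import PromptTemplate

# A more permissive template: use the context when relevant, otherwise answer normally
custom_qa_prompt_str = (
    "Context information is below.\n"
    "---------------------\n"
    "{context_str}\n"
    "---------------------\n"
    "Using the context when it is relevant, and your own knowledge otherwise, "
    "answer the question: {query_str}\n"
)
qa_prompt = PromptTemplate(custom_qa_prompt_str)

# Pass the custom template when building the query engine
query_engine = index.as_query_engine(text_qa_template=qa_prompt)
response = query_engine.query("your question here")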