Find answers from the community

Updated 3 months ago

Hi, I have a curious bug with my setup

Hi, I have a curious bug. I have set up LlamaIndex with Qdrant to read documents. My code is pretty simple, but the LLM's answer is always just the context, and it won't answer any other question. I don't have the problem without the Qdrant vector store.
6 comments
Are you using OpenAI? That is typically caused by the underlying prompt template:

Plain Text
qa_prompt_str = (
    "Context information is below.\n"
    "---------------------\n"
    "{context_str}\n"
    "---------------------\n"
    "Given the context information and not prior knowledge, "
    "answer the question: {query_str}\n"
)


https://docs.llamaindex.ai/en/stable/examples/customization/prompts/chat_prompts/?h=chat+prompts
Thanks for your answer, is there a way to disable that?
You can follow the link above to customize the prompt
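As a sketch of what the linked docs describe: you can define your own QA prompt string (the wording below is a hypothetical variant that lets the model fall back on its own knowledge) and pass it to the query engine via `text_qa_template`. Exact import paths vary across llama-index versions; the commented lines assume a recent `llama_index.core` layout.

```python
# Hypothetical replacement for the default QA prompt: instead of
# "and not prior knowledge", it allows the model to use its own
# knowledge when the retrieved context is not relevant.
custom_qa_prompt_str = (
    "Context information is below.\n"
    "---------------------\n"
    "{context_str}\n"
    "---------------------\n"
    "Using the context if it is relevant, and your own knowledge "
    "otherwise, answer the question: {query_str}\n"
)

# With llama-index installed, wrap the string and hand it to the
# query engine (assumed API, check the docs link above):
# from llama_index.core import PromptTemplate
# query_engine = index.as_query_engine(
#     text_qa_template=PromptTemplate(custom_qa_prompt_str)
# )

# The {context_str} and {query_str} placeholders are filled in at
# query time; plain str.format shows the shape of the final prompt:
print(custom_qa_prompt_str.format(
    context_str="(retrieved document chunks)",
    query_str="What is Qdrant?",
))
```

Keeping the same `{context_str}` / `{query_str}` placeholder names is what makes the custom string a drop-in replacement for the default template.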