Updated 2 years ago

I am getting this answer when I am using

I am getting this answer when I am using llm=ChatOpenAI. Even though I indexed my entire data set, it seems it is not added to the context. Any ideas how I can get it to answer more accurately?

Answer: The context provided is about ... Therefore, the original answer remains the same.
6 comments
I am using this prompt:

QA_PROMPT_TMPL = (
    "We have provided context information below. \n"
    "---------------------\n"
    "{context_str}"
Lots of issues like this reported recently... I think gpt-3.5 got updated.

I would also try creating a custom refine prompt (which is the prompt that is causing the output you see). The default refine prompt tells the LLM to repeat back the original answer when the new context isn't useful, which is the "the original answer remains the same" response you're getting.
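As a sketch of what a custom refine template can look like (the wording here is illustrative, not the library's default): the refine step fills in the `{query_str}`, `{existing_answer}`, and `{context_msg}` variables, and you can rephrase the instructions so the model stops echoing "the original answer remains the same."

```python
# A custom refine prompt template. The three placeholders are the variables
# the refine step fills in; the instruction wording is an illustrative
# rewrite that discourages echoing the original answer verbatim.
REFINE_PROMPT_TMPL = (
    "The original question is: {query_str}\n"
    "We have provided an existing answer: {existing_answer}\n"
    "We have the opportunity to refine that answer with more context below.\n"
    "---------------------\n"
    "{context_msg}\n"
    "---------------------\n"
    "Given the new context, improve the answer. Reply with the full improved "
    "answer only; do not state that the original answer remains the same.\n"
)

# Sanity-check that the template formats cleanly with all three variables.
filled = REFINE_PROMPT_TMPL.format(
    query_str="List condominium names with websites and price ranges.",
    existing_answer="(previous partial answer)",
    context_msg="(retrieved chunk of the indexed data set)",
)
print(filled)
```

In the LlamaIndex releases of that era this string was typically wrapped in a `RefinePrompt` and passed to the query call as `refine_template`; check the prompt docs for your installed version, as the class names have changed across releases.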
This is the full prompt:

QA_PROMPT_TMPL = (
    "We have provided context information below. \n"
    "---------------------\n"
    "{context_str}"
    "\n---------------------\n"
    "Given this information, please answer the question and provide a list of condominium names, its website and price range: {query_str}\n"
)

index = load_index_from_disk(indexJson)
response = query_index(index, query_str, service_context, QA_PROMPT_TMPL)
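For reference, here is the QA template from the snippet above as a self-contained check (reproduced with the closing parenthesis that the pasted version was missing; the filler values are made up for illustration):

```python
# The QA template uses two placeholders: {context_str} is filled with the
# retrieved chunks, and {query_str} with the user's question.
QA_PROMPT_TMPL = (
    "We have provided context information below. \n"
    "---------------------\n"
    "{context_str}"
    "\n---------------------\n"
    "Given this information, please answer the question and provide a list "
    "of condominium names, its website and price range: {query_str}\n"
)

# Illustrative fill-in; at query time the library does this for you.
prompt = QA_PROMPT_TMPL.format(
    context_str="(text of the retrieved condominium listings)",
    query_str="Which condominiums are under $500k?",
)
print(prompt)
```

Printing the formatted prompt like this is a quick way to confirm the template is well-formed before wiring it into the query call.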
How do I create a refine prompt? If you can just send the docs, I can refer to them. πŸ™‚