from llama_index.core import PromptTemplate

REFINE_PROMPT_TMPL = (
    "The original query is as follows: {query_str}\n"
    "We have provided an existing answer: {existing_answer}\n"
    "We have the opportunity to refine the existing answer "
    "(only if needed) with some more context below.\n"
    "------------\n"
    "{context_msg}\n"
    "------------\n"
    "Given the new context, refine the original answer to better answer the query. "
    "If the context isn't useful, return the original answer.\n"
    "Refined answer: "
)
REFINE_PROMPT = PromptTemplate(REFINE_PROMPT_TMPL)

query_engine = index.as_query_engine(
    sub_retrievers=[
        llm_synonym,
        vector_context,
    ],
    llm=CustomLLM(),
    refine_template=REFINE_PROMPT,
)


In the above code, refine_template is not taking effect. How can I fix this? Can someone take a look at it for me? Thanks!
2 comments
If it's not taking effect, you can first check which prompt templates are actually being used, and then update the prompt this way as well:
https://docs.llamaindex.ai/en/latest/module_guides/models/prompts/usage_pattern/?h=modify+prompt#accessing-prompts

See if this works for you!
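Concretely, the pattern from that docs page is: call get_prompts() to see which template keys the engine uses, then pass your template to update_prompts() under the right key. The sketch below mocks the query engine with a plain dict so it runs standalone; get_prompts/update_prompts are the real method names, and the "response_synthesizer:refine_template" key follows the "<module>:<template name>" convention shown in the docs, but the mock itself is not LlamaIndex code:

```python
# Minimal mock of a query engine's prompt registry, to illustrate the
# get_prompts() / update_prompts() call pattern from the linked docs page.
class MockQueryEngine:
    def __init__(self):
        # Default template keys, named after LlamaIndex's
        # "<module>:<template name>" convention.
        self._prompts = {
            "response_synthesizer:text_qa_template": "<default QA template>",
            "response_synthesizer:refine_template": "<default refine template>",
        }

    def get_prompts(self):
        # Step 1: inspect which prompt templates are in use.
        return dict(self._prompts)

    def update_prompts(self, prompts):
        # Step 2: override templates by key.
        self._prompts.update(prompts)


engine = MockQueryEngine()

# First check the keys so you know which template to target:
print(list(engine.get_prompts().keys()))

# Then swap in the custom refine template under the matching key
# (with a real engine this would be your REFINE_PROMPT object):
engine.update_prompts(
    {"response_synthesizer:refine_template": "<your REFINE_PROMPT here>"}
)
print(engine.get_prompts()["response_synthesizer:refine_template"])
```

On a real query engine, printing get_prompts().keys() first is the important step: if the refine template lives under a key you didn't expect, passing it as a constructor argument can silently be ignored, while update_prompts() with the exact key always takes effect.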
It works, thank you 👍