Prepend messages

I think this is deprecated? The QA template does work, I think, but that just uses a regular prompt, right?
@CrisTian @Teemu ah, I guess the old way of doing it got removed then. Big sad.

You can set up the text_qa_template and refine_template to use messages ez pz though, I can make an example if you want
nice .... thanks ....
Hmm yeah that would be helpful, I'm just wondering if the system prompt is used in that?
and where do I define that? (text_qa_template)
hmm, it must be defined at query time...
mmmmm interesting
hahaha I was reading the same thing!!! haha nice
Python
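# Rebuild llama_index's default QA and refine prompts as chat prompts,
# prepending the same system message to each.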
from llama_index.prompts.prompts import QuestionAnswerPrompt, RefinePrompt
from langchain.prompts.chat import (
    AIMessagePromptTemplate,
    ChatPromptTemplate,
    HumanMessagePromptTemplate,
    SystemMessagePromptTemplate
)

SYSTEM_PROMPT = SystemMessagePromptTemplate.from_template("Every response should be written like you are a pirate.")

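# Refine prompt: same wording as the llama_index default, with the system message first.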
CHAT_REFINE_PROMPT_TMPL_MSGS = [
    SYSTEM_PROMPT,
    HumanMessagePromptTemplate.from_template("{query_str}"),
    AIMessagePromptTemplate.from_template("{existing_answer}"),
    HumanMessagePromptTemplate.from_template(
        "We have the opportunity to refine the above answer "
        "(only if needed) with some more context below.\n"
        "------------\n"
        "{context_msg}\n"
        "------------\n"
        "Given the new context, refine the original answer to better "
        "answer the question. "
        "If the context isn't useful, output the original answer again.",
    ),
]


CHAT_REFINE_PROMPT_LC = ChatPromptTemplate.from_messages(CHAT_REFINE_PROMPT_TMPL_MSGS)
CHAT_REFINE_PROMPT = RefinePrompt.from_langchain_prompt(CHAT_REFINE_PROMPT_LC)

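# QA prompt: default QA wording, again with the system message first.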
CHAT_QA_PROMPT_TMPL_MSGS = [
    SYSTEM_PROMPT,
    HumanMessagePromptTemplate.from_template(
         "Context information is below. \n"
        "---------------------\n"
        "{context_str}"
        "\n---------------------\n"
        "Given the context information and not prior knowledge, "
        "answer the question: {query_str}\n"
    )
]
CHAT_QA_PROMPT_LC = ChatPromptTemplate.from_messages(CHAT_QA_PROMPT_TMPL_MSGS)
CHAT_QA_PROMPT = QuestionAnswerPrompt.from_langchain_prompt(CHAT_QA_PROMPT_LC)

...

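# Pass both templates when constructing the query engine.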
query_engine = index.as_query_engine(text_qa_template=CHAT_QA_PROMPT, refine_template=CHAT_REFINE_PROMPT)
There's my example, sorry, it's a little long haha
It uses the default prompts from LlamaIndex, but adds the system prompt to each
Then at the end, it creates the query engine with them
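For completeness, running a query then looks something like this (a minimal sketch; the question string is just a placeholder):
Python
# Minimal usage sketch; the index construction is elided with "..." above.
response = query_engine.query("What does the author say about prompts?")
print(response)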
oops... haha ok... give me a moment to implement and test
You're a legend πŸ’ͺ
haha thanks! I really wouldn't expect anyone to know how to do this without reading the codebase... hoping to simplify the prompts in the future!
Yeah, I try to read it, but I'm very busy atm, so it's hard to keep up with the changes!
You and me both :dotsCATJAM:πŸ˜‚
hahaha same here!!!
Yeah a lot of work in this space, it's a race everyday to keep up!
You miss a few days and everything's already changed
@jma7889 you might be interested in this thread. The text_qa_template and refine_template are the two main ones to change
one quick question...
what is """ in Python?
I can't concat a variable to that πŸ˜›
never mind... haha, I found a solution πŸ™‚
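(For the record, """ delimits a multi-line string in Python; the usual way to get a variable into one is an f-string rather than concatenation. A minimal sketch:)
Python
# """ starts and ends a multi-line (triple-quoted) string.
# Prefix it with f to interpolate variables instead of concatenating:
persona = "pirate"
prompt = f"""Every response should be written
like you are a {persona}."""
print(prompt)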
@Logan M are you there?
The example you made is great... but it actually doesn't work in my case. I use LangChain, with a chain called "create_llama_chat_agent", and it doesn't accept those parameters... so I was looking for alternatives... at one point it worked, but then it crashed...
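(One possible workaround, as a hedged sketch rather than a confirmed fix: instead of passing the templates to create_llama_chat_agent, build the query engine with them as above and wrap it in a plain LangChain Tool; the tool name and description below are made up for illustration.)
Python
# Hypothetical sketch: the custom templates live inside query_engine,
# so any agent that calls this Tool picks them up automatically.
from langchain.agents import Tool

llama_tool = Tool(
    name="llama-index-search",
    func=lambda q: str(query_engine.query(q)),
    description="Answers questions about the indexed documents.",
)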