System context

When using GPT-3.5 as the LLM, how do I set up a system context? Any suggestions for including the previous query and response as short-term memory?
You can insert a system context and chat history by dynamically creating a prompt template.

You need to be careful about the chat history length, though. Another option is a LangChain agent with LlamaIndex as a custom tool.
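For the agent route, here's a minimal sketch of wrapping a LlamaIndex query engine as a LangChain tool with conversation memory. It assumes the older langchain/llama_index APIs used elsewhere in this thread; `my_index`, the tool name, and its description are placeholders.

```python
# Sketch only: wraps an existing llama_index index ("my_index" is a placeholder)
# as a LangChain tool, with buffer memory providing the short-term chat history.
from langchain.agents import Tool, initialize_agent
from langchain.chat_models import ChatOpenAI
from langchain.memory import ConversationBufferMemory

query_engine = my_index.as_query_engine()

tools = [
    Tool(
        name="docs_index",
        func=lambda q: str(query_engine.query(q)),
        description="Useful for answering questions about the indexed documents.",
    )
]

memory = ConversationBufferMemory(memory_key="chat_history", return_messages=True)
llm = ChatOpenAI(model_name="gpt-3.5-turbo", temperature=0)

agent = initialize_agent(
    tools,
    llm,
    agent="chat-conversational-react-description",
    memory=memory,
    verbose=True,
)

print(agent.run("What does the document say about X?"))
```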
Here's a quick example of setting up the refine template. You'll want to copy the default text_qa_template, do something similar, and pass it in as well (both should be set).

```python
from langchain.prompts.chat import (
    AIMessagePromptTemplate,
    ChatPromptTemplate,
    HumanMessagePromptTemplate,
)

from llama_index.prompts.prompts import RefinePrompt

# Refine prompt: the first two messages replay the previous query and the
# existing answer; the last asks the model to update that answer with new context
CHAT_REFINE_PROMPT_TMPL_MSGS = [
    HumanMessagePromptTemplate.from_template("{query_str}"),
    AIMessagePromptTemplate.from_template("{existing_answer}"),
    HumanMessagePromptTemplate.from_template(
        "I have more context below which can be used "
        "(only if needed) to update your previous answer.\n"
        "------------\n"
        "{context_msg}\n"
        "------------\n"
        "Given the new context, update the previous answer to better "
        "answer my previous query."
        "If the previous answer remains the same, repeat it verbatim. "
        "Never reference the new context or my previous query directly.",
    ),
]


CHAT_REFINE_PROMPT_LC = ChatPromptTemplate.from_messages(CHAT_REFINE_PROMPT_TMPL_MSGS)
CHAT_REFINE_PROMPT = RefinePrompt.from_langchain_prompt(CHAT_REFINE_PROMPT_LC)
...
query_engine = index.as_query_engine(..., refine_template=CHAT_REFINE_PROMPT)
```
@Logan M can you show me an example of customising the system context under GPT-3.5 first?
Same thing, but import SystemMessagePromptTemplate from langchain 👍
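For reference, a minimal sketch of what that could look like for the text_qa_template, assuming the same older llama_index/langchain versions as the refine example above; the system message text and the `CHAT_QA_*` names are just placeholders.

```python
from langchain.prompts.chat import (
    ChatPromptTemplate,
    HumanMessagePromptTemplate,
    SystemMessagePromptTemplate,
)

from llama_index.prompts.prompts import QuestionAnswerPrompt

# QA prompt with a custom system message (the system text is only an example)
CHAT_QA_PROMPT_TMPL_MSGS = [
    SystemMessagePromptTemplate.from_template(
        "You are a helpful assistant that answers using only the provided context."
    ),
    HumanMessagePromptTemplate.from_template(
        "Context information is below.\n"
        "---------------------\n"
        "{context_str}\n"
        "---------------------\n"
        "Given the context information and not prior knowledge, "
        "answer the question: {query_str}\n"
    ),
]

CHAT_QA_PROMPT_LC = ChatPromptTemplate.from_messages(CHAT_QA_PROMPT_TMPL_MSGS)
CHAT_QA_PROMPT = QuestionAnswerPrompt.from_langchain_prompt(CHAT_QA_PROMPT_LC)

# Pass both templates in, as noted above
query_engine = index.as_query_engine(
    text_qa_template=CHAT_QA_PROMPT,
    refine_template=CHAT_REFINE_PROMPT,
)
```

A system message can be added to the refine template in the same way.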
Noted. Let me dive into LangChain 😀