
Updated 2 years ago

System context

At a glance

The post asks how to set up system context and include previous queries as short-term memory when using GPT-3.5 as the LLM. The comments suggest two approaches:

1. Dynamically creating a prompt template to insert system context and chat history, while being cautious about the length of the chat history.

2. Using LangChain agents with LlamaIndex as a custom tool to handle system context and chat history.

One community member provides a code example of setting up a refine prompt template in LangChain, which includes handling the previous answer and new context to update the response.

Another community member requests an example of customizing the system context under GPT-3.5, and the response suggests using the SystemMessagePromptTemplate from LangChain.

When we are using GPT-3.5 as the LLM, how do we set up system context? Any suggestions for including the previous query and response as short-term memory?
5 comments
You can insert system context and chat history by dynamically creating a prompt template.

Need to be careful about chat history length, though. Another option is LangChain agents with LlamaIndex as a custom tool.
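The first approach (dynamically building the prompt while capping the chat history) can be sketched in plain Python. The function name and the simple character-budget heuristic below are my own illustration, not from the thread; in practice you would count tokens, not characters:

```python
def build_prompt(system_context, history, query, max_history_chars=2000):
    """Assemble a prompt from system context plus a trimmed chat history.

    history is a list of (user, assistant) turns; only the most recent
    turns that fit within max_history_chars are kept as short-term memory.
    """
    kept = []
    used = 0
    # Walk history newest-first so the most recent turns survive trimming.
    for user, assistant in reversed(history):
        turn = f"User: {user}\nAssistant: {assistant}\n"
        if used + len(turn) > max_history_chars:
            break
        kept.append(turn)
        used += len(turn)
    # Restore chronological order for the final prompt.
    history_text = "".join(reversed(kept))
    return f"{system_context}\n\n{history_text}User: {query}\nAssistant:"
```

The same idea applies with a real prompt-template class: render system context first, then however much history fits, then the new query.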
Here's a quick example of setting up the refine template. You'll want to copy the default text_qa_template, do something similar, and pass it in as well (both should be set)

Python
from langchain.prompts.chat import (
    AIMessagePromptTemplate,
    ChatPromptTemplate,
    HumanMessagePromptTemplate,
)

from llama_index.prompts.prompts import RefinePrompt

# Refine Prompt
CHAT_REFINE_PROMPT_TMPL_MSGS = [
    HumanMessagePromptTemplate.from_template("{query_str}"),
    AIMessagePromptTemplate.from_template("{existing_answer}"),
    HumanMessagePromptTemplate.from_template(
        "I have more context below which can be used "
        "(only if needed) to update your previous answer.\n"
        "------------\n"
        "{context_msg}\n"
        "------------\n"
        "Given the new context, update the previous answer to better "
        "answer my previous query."
        "If the previous answer remains the same, repeat it verbatim. "
        "Never reference the new context or my previous query directly.",
    ),
]


CHAT_REFINE_PROMPT_LC = ChatPromptTemplate.from_messages(CHAT_REFINE_PROMPT_TMPL_MSGS)
CHAT_REFINE_PROMPT = RefinePrompt.from_langchain_prompt(CHAT_REFINE_PROMPT_LC)
...
query_engine = index.as_query_engine(..., refine_template=CHAT_REFINE_PROMPT)
@Logan M can u show me an example to customise system context under gpt3.5 first?
Same thing, but import SystemMessagePromptTemplate from langchain 👍
Noted. let me dive into langChain πŸ˜€