Hi @Logan M, I created the agent as follows:
from langchain.chat_models import ChatOpenAI
from langchain.memory import ConversationBufferMemory
from llama_index.langchain_helpers.agents import create_llama_chat_agent

# system_message and toolkit are defined earlier in my code
memory = ConversationBufferMemory(memory_key="chat_history", ai_prefix=system_message)
llm = ChatOpenAI(temperature=0, model_name="gpt-4")
agent_chain = create_llama_chat_agent(
    toolkit,
    llm,
    memory=memory,
    verbose=True,
    agent_kwargs={"prefix": system_message},
)
Unfortunately, when the agent uses the llama-index tool, it doesn't get the system_message. Should I customize the prompt templates for each index? Thanks in advance for your help.
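For context, here is roughly what I had in mind by "customize the prompt templates": a sketch that prepends the system message to a QA template string, which (assuming the legacy llama_index API with `QuestionAnswerPrompt` and `index.query(..., text_qa_template=...)`) could then be passed per index. The `system_message` value here is a placeholder, not my real prompt:

```python
# Hypothetical sketch: bake the system message into a per-index QA template.
system_message = "You are a helpful support assistant."  # placeholder assumption

qa_template_str = (
    f"{system_message}\n"
    "Context information is below.\n"
    "---------------------\n"
    "{context_str}\n"
    "---------------------\n"
    "Given the context information and not prior knowledge, "
    "answer the question: {query_str}\n"
)

# With llama_index installed (legacy API), this would then be used roughly as:
# from llama_index import QuestionAnswerPrompt
# qa_template = QuestionAnswerPrompt(qa_template_str)
# response = index.query("...", text_qa_template=qa_template)
print(qa_template_str)
```

Is doing this for every index the intended approach, or is there a way to propagate the agent's prefix down into the tool automatically?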