Find answers from the community

Updated 2 years ago

https://gpt-index.readthedocs.io/en/latest/

At a glance

The community member is reading a document about the OpenAIAgent implementation and wants to know how to combine the agent with index context and their own chat history. Another community member suggests a related document, which solves the index context part, but the original poster still wants to supply their own chat history, similar to the OpenAI API. A solution is provided: the chat history can be passed as an optional kwarg to the chat() method, which overrides the memory of the current chat session.

Useful resources
https://gpt-index.readthedocs.io/en/latest/examples/agent/openai_agent.html#our-slightly-better-openaiagent-implementation
Hey guys, in this document I'm reading, how can I combine the agent with index context and my chat history?
5 comments
I think you possibly mean this?

https://gpt-index.readthedocs.io/en/latest/examples/agent/openai_agent_context_retrieval.html

If not, can you explain what you are looking for?
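
For reference, here is a minimal sketch of the context-retrieval pattern that notebook describes, using the legacy llama_index 0.9.x imports; the documents, tool name, and query below are hypothetical placeholders:

Python
# Sketch only: `docs` and `context_docs` are hypothetical document lists.
from llama_index import VectorStoreIndex
from llama_index.agent import ContextRetrieverOpenAIAgent
from llama_index.tools import QueryEngineTool, ToolMetadata

# Index over the main documents, exposed to the agent as a query tool
doc_index = VectorStoreIndex.from_documents(docs)
query_engine_tools = [
    QueryEngineTool(
        query_engine=doc_index.as_query_engine(),
        metadata=ToolMetadata(
            name="docs",
            description="Answers questions about the main documents",
        ),
    )
]

# Separate index whose top match is retrieved and prepended as context on each chat turn
context_index = VectorStoreIndex.from_documents(context_docs)

agent = ContextRetrieverOpenAIAgent.from_tools_and_retriever(
    query_engine_tools,
    context_index.as_retriever(similarity_top_k=1),
    verbose=True,
)

response = agent.chat("Summarize the key points from the docs.")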
Thank you for the quick reply, that does solve the index context part. However, I want to pass in my own chat history too, like the OpenAI API does
Attachment: image.png
You can provide the chat history as an optional kwarg when you call chat():

Python
from llama_index.llms import ChatMessage

# `agent` is your existing OpenAIAgent; this history replaces the session memory for the call
agent.chat("Hello!", chat_history=[ChatMessage(content="Never say hello", role="system")])
This kwarg will override the memory of the current chat session.
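
For example, you can pass a longer multi-turn history the same way, analogous to the OpenAI API's messages list (a sketch, assuming `agent` is the OpenAIAgent from above):

Python
from llama_index.llms import ChatMessage

# A prior conversation, analogous to the OpenAI API `messages` list
history = [
    ChatMessage(role="system", content="You are a terse assistant."),
    ChatMessage(role="user", content="My name is Lee."),
    ChatMessage(role="assistant", content="Nice to meet you, Lee."),
]

# This history replaces the agent's current session memory for this call
response = agent.chat("What is my name?", chat_history=history)
print(response)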
Thanks a lot, I'll try it out!