
gents, does the OpenAIAgent (the one referenced here https://docs.llamaindex.ai/en/stable/examples/agent/openai_agent/) keep conversation history? it doesn't seem to do so. What's the best way to go about it?
agent.memory.get() will get the current buffer
Plain Text
>>> from llama_index.agent.openai import OpenAIAgent
>>> agent = OpenAIAgent.from_tools([])
>>> resp = agent.chat("My name is logan")
>>> print(str(resp))
Nice to meet you, Logan! How can I assist you today?
>>> resp = agent.chat("I forgot my name, what was it?")
>>> print(str(resp))
Your name is Logan. Is there anything else I can help you with?
>>> 
hey Logan, thank you for the answer.
It seems the issue is not caused by OpenAIAgent itself (I initially thought it was).
The real issue has to do with Streamlit's st.chat_input function. It's somewhat stateless and seems to cause the conversation memory to get lost, so I'm trying to work around that now.
also this related thread:
https://discord.com/channels/1059199217496772688/1240358158116061195
You probably need to put the agent into the session_state of the app.

Either that, or manually keep track of and insert the chat history on each .chat() call to override anything in the memory

Plain Text
chat_history = agent.chat_history
agent.chat("Hello!", chat_history=chat_history)
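To see why putting the agent into session_state helps, here is a minimal sketch of the pattern with plain Python standing in for Streamlit: a dict plays the role of st.session_state, and a toy class plays the role of OpenAIAgent. The names and classes here are illustrative stand-ins, not the real Streamlit or LlamaIndex APIs.

```python
class ToyAgent:
    """Stand-in for OpenAIAgent: remembers every chat turn in memory."""
    def __init__(self):
        self.chat_history = []

    def chat(self, message):
        self.chat_history.append(message)
        return f"echo: {message}"

session_state = {}  # plays the role of st.session_state

def rerun(user_message):
    """One Streamlit script rerun: reuse the agent if it already exists,
    instead of rebuilding it (which would wipe its memory)."""
    if "agent" not in session_state:
        session_state["agent"] = ToyAgent()
    return session_state["agent"].chat(user_message)

rerun("My name is logan")
rerun("I forgot my name, what was it?")
# Both turns survive because the agent lives in session_state,
# not in a local variable that dies on every rerun.
print(session_state["agent"].chat_history)
```

If the agent were instead created as a local variable inside the script body, every rerun would start from an empty history, which matches the memory loss described above.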
let me try it out.
didn't work, but I found another workaround. Let me clean up the code and share a snippet of what worked.
Again, more a Streamlit issue than a LlamaIndex one.
ok so what works for me is this
Plain Text
@st.cache_resource
def initialize_agent():
    agent = get_agent()
    return agent

agent = initialize_agent()
in the same file where I am using the Streamlit chat widget.
get_agent simply returns the OpenAIAgent that is defined in a separate Python helper file.
Originally, while looking for a solution, I tracked a thread on the LangChain GitHub,
but I didn't want to use their component, StreamlitChatMessageHistory.
Then I thought that maybe finding a way to initialize the agent only once, so it isn't reset every time the page reruns, could solve it, and I found this thread: https://discuss.streamlit.io/t/is-there-a-way-to-run-an-initialization-function/61154/3
which took me here:
using the @st.cache_resource decorator seems to solve my problem
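The @st.cache_resource fix works because Streamlit memoizes the decorated function and hands back the same object on every script rerun. The same effect can be sketched with the standard library alone, using functools.lru_cache as a stand-in for st.cache_resource (the agent itself is faked with a bare object, since get_agent is project-specific):

```python
import functools

@functools.lru_cache(maxsize=None)
def initialize_agent():
    # In the real app this would call get_agent(); a bare object()
    # stands in here so the sketch is self-contained.
    return object()

# Every "rerun" now receives the identical instance,
# so any memory attached to it is preserved across reruns.
a = initialize_agent()
b = initialize_agent()
print(a is b)  # True
```

The key design point is identity, not just equality: conversation memory lives on the agent instance, so the cache must return the same instance rather than an equivalent new one.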
took me a whole day 😐
thank god we don't use streamlit in production 🀣
quick question though @Logan M: according to this page, the OpenAIAgent should be able to do streaming...

https://docs.llamaindex.ai/en/stable/examples/agent/openai_agent/#agent-with-personality
yea, agent.stream_chat()
however whenever i do agent.stream_chat() it's throwing an error
What's the error?
streaming works fine with other agent types like ReActAgent and the AgentRunner / AgentWorker combo... it only errors on OpenAIAgent
Plain Text
AttributeError: 'StreamingAgentChatResponse' object has no attribute '_is_function_not_none_thread_event'
Plain Text
response = agent.stream_chat(prompt)
llama-index 0.10.37
llama-index-agent-openai 0.2.2
llama-index-llms-openai 0.1.16
but can't figure out the connection πŸ˜„
Hmm, this should be just a versioning thing. Try pip install -U llama-index-agent-openai llama-index-core llama-index-llms-openai just to be sure? (You might have to restart anything you have running too)
Plain Text
agent = OpenAIAgent.from_tools(
    [vector_query_tool, *googletools],
    llm=llm,
    verbose=True,
    system_prompt=get_system_prompt(),
)

That's how agent is defined
oh let me try that
moment of truth πŸ₯
yes it was a versioning issue.
Works perfectly with upgraded versions:
llama-index-agent-openai-0.2.5 llama-index-llms-openai-0.1.19
thanks from the future!
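For anyone landing here later: once the version mismatch is fixed, stream_chat returns a streaming response whose tokens you iterate and accumulate. A rough sketch of that consumption pattern, with a plain generator standing in for the real response object (response_gen is the attribute described in the llama-index docs; treat the rest as illustrative):

```python
def fake_response_gen():
    """Stand-in for response.response_gen: yields tokens one at a time."""
    yield from ["Nice ", "to ", "meet ", "you!"]

tokens = []
for token in fake_response_gen():   # real API: for token in response.response_gen
    tokens.append(token)            # e.g. append to a Streamlit placeholder here

full_text = "".join(tokens)
print(full_text)  # Nice to meet you!
```

In a Streamlit app you would typically update a placeholder inside the loop so the reply renders incrementally instead of all at once.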