Updated 4 months ago

Why am I getting AttributeError: 'str' object has no attribute 'chat'?

At a glance

The community member is encountering an AttributeError: 'str' object has no attribute 'chat' when calling the agent.chat() method. They have verified that the agent object is of type llama_index.agent.react.base.ReActAgent. The full traceback they provide shows the failure comes from the agent's self._llm attribute, which is a plain string rather than an LLM object. After further investigation, the community member resolves the issue by setting up the LLM for the agent as described in the documentation, using from llama_index.llms import OpenAI and llm = OpenAI(model="gpt-3.5-turbo-0613").

Why am I getting AttributeError: 'str' object has no attribute 'chat' after agent.chat("A question")? If I do print(type(agent)) I get 'llama_index.agent.react.base.ReActAgent'. I am following this guide (with my own data): https://gpt-index.readthedocs.io/en/latest/core_modules/agent_modules/agents/usage_pattern.html#query-engine-tools
Len
6 comments
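For context, the query-engine-tools pattern from the linked usage guide looks roughly like the sketch below. This is a minimal reconstruction, not the asker's actual script: it assumes the legacy (pre-0.10) llama_index package layout visible in the traceback further down, and the data path and tool name are placeholders.

from llama_index import SimpleDirectoryReader, VectorStoreIndex
from llama_index.agent import ReActAgent
from llama_index.tools import QueryEngineTool, ToolMetadata

# Build an index over local documents and expose it as a query engine.
documents = SimpleDirectoryReader("./data").load_data()
index = VectorStoreIndex.from_documents(documents)
query_engine = index.as_query_engine()

# Wrap the query engine as a tool the agent can call.
query_engine_tools = [
    QueryEngineTool(
        query_engine=query_engine,
        metadata=ToolMetadata(
            name="my_docs",
            description="Answers questions about my local documents.",
        ),
    )
]

# Build the ReAct agent from the tools.
agent = ReActAgent.from_tools(query_engine_tools, verbose=True)
response = agent.chat("A question")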
What's the full traceback? That will be more helpful to debug
@elmegatan26 hmm wrong thread I think 😉
Hi, hope this helps: Traceback (most recent call last):
File "/Users/nana/llama_mj/queryenginetool.py", line 54, in <module>
response = agent.chat("¿Cuál es el nombre del presidente de los Estados Unidos?")
File "/Users/nana/venv/lib/python3.10/site-packages/llama_index/agent/react/base.py", line 157, in chat
chat_response = self._llm.chat(input_chat)
AttributeError: 'str' object has no attribute 'chat'
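The exception itself is ordinary Python behaviour: a plain str has no chat method, so if the agent's internal _llm attribute ends up holding a model-name string instead of an LLM object, the call fails exactly like this. A two-line illustration (not from the thread):

llm = "gpt-3.5-turbo-0613"  # a model-name string, not an LLM object
llm.chat("hello")           # AttributeError: 'str' object has no attribute 'chat'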
There's the real issue, with the self._llm attribute

How did you set up the LLM for your agent?
Revisiting it after your response, I noticed I hadn't set it up like in the docs. It works after adding from llama_index.llms import OpenAI and llm = OpenAI(model="gpt-3.5-turbo-0613"). Thank you for your help!
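For anyone landing here later, the working setup looks roughly like this. It is a sketch based on the reply above: the import and the OpenAI model name come from the thread, while the from_tools call and the query_engine_tools list are assumptions carried over from the earlier example.

from llama_index.llms import OpenAI
from llama_index.agent import ReActAgent

# Construct an actual LLM object and pass it to the agent explicitly,
# rather than leaving the agent with a bare model-name string.
llm = OpenAI(model="gpt-3.5-turbo-0613")
agent = ReActAgent.from_tools(query_engine_tools, llm=llm, verbose=True)
response = agent.chat("A question")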