
Hey

Hey!

My ReAct chat engine makes the correct observation, but its final response doesn't mention it, so from a chat user's perspective the answer reads strangely.

Is there a way to fix this? (Or maybe I'm doing something wrong?)
Attachment: Screenshot_2023-07-24_at_8.25.03_PM.png
6 comments
Huh, that's super odd. Not totally sure what would cause that.

Are you using OpenAI? You might get better results using an OpenAI agent rather than a ReAct agent.
Making a ticket for this, though.
Thanks for the response!

No, I'm using the gpt-3.5-turbo model for llm_predictor, but I don't have an agent set up. I'll go through the docs to set up the OpenAI agent, thanks for the advice.

Thanks again!
Thanks, appreciate it. LlamaIndex is awesome and learning is fun!
Hi! Jumping in to say that I had the same problem while using the ReAct agent. Per the recommendation, switching to the OpenAI agent worked as expected.
Attachment: image.png
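
For anyone landing here later, the switch discussed above looks roughly like this. This is a minimal sketch, assuming the legacy llama_index 0.x import paths (later releases moved modules under llama_index.core and replaced ServiceContext with Settings); the data directory and question are placeholders, not the posters' actual setup:

```python
# Minimal sketch: swapping the ReAct chat engine for the OpenAI
# function-calling agent in legacy llama_index 0.x.
from llama_index import VectorStoreIndex, SimpleDirectoryReader, ServiceContext
from llama_index.llms import OpenAI

# Build an index over local documents ("./data" is a placeholder folder).
documents = SimpleDirectoryReader("./data").load_data()
service_context = ServiceContext.from_defaults(llm=OpenAI(model="gpt-3.5-turbo"))
index = VectorStoreIndex.from_documents(documents, service_context=service_context)

# chat_mode="react" matches the original setup in the thread;
# chat_mode="openai" swaps in the OpenAI function-calling agent, which the
# thread reports does surface tool observations in its final answer.
chat_engine = index.as_chat_engine(chat_mode="openai")

response = chat_engine.chat("What does the knowledge base say about X?")
print(response)
```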