Does anyone know why instantiating the SAME Bedrock LLM agent works great in Gradio and is able to call the tools, but when using the agent with the FastAPI app generated by the create-llama command-line tool it is not able to call the tools and just hallucinates? It only works with OpenAI; when switching to Claude (for example) it is not able to use the tools. The Bedrock agent is able to find the tools, but it never receives the results when it outputs the Observation step, so the answers are just hallucinations. It's weird, because in the Gradio app it just works.
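For reference, this is roughly the setup that works fine in Gradio. It's a minimal sketch rather than my exact code: the model ID, region, and the toy `multiply` tool are placeholders, and `verbose=True` is only there to print the Thought / Action / Observation steps so you can see the tool calls.

```python
import gradio as gr
from llama_index.core.agent import ReActAgent
from llama_index.core.tools import FunctionTool
from llama_index.llms.bedrock import Bedrock

# Toy tool just to confirm tool calling works end to end.
def multiply(a: int, b: int) -> int:
    """Multiply two integers and return the result."""
    return a * b

llm = Bedrock(
    model="anthropic.claude-3-sonnet-20240229-v1:0",  # placeholder model ID
    region_name="us-east-1",
)

agent = ReActAgent.from_tools(
    [FunctionTool.from_defaults(fn=multiply)],
    llm=llm,
    verbose=True,  # show Thought / Action / Observation steps in the console
)

def chat(message, history):
    return str(agent.chat(message))

gr.ChatInterface(chat).launch()
```

Run this, ask something like "What is 7 times 12?", and the console shows the agent calling `multiply` and feeding the Observation back into the answer.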
Yes, if you choose the FastAPI backend with no data and go with the "just a simple chatbot or agent" option, it creates an agent (an AgentRunner instance) with a choice of tools like Wikipedia and others. I tried bringing identical Bedrock agent code into the create-llama app and it always fails when using the tools; the change basically amounts to the snippet below.
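Concretely, the swap is just pointing the generated app's `Settings.llm` at Bedrock instead of OpenAI, leaving the generated AgentRunner and tools untouched. The file path below follows the create-llama Python scaffold and may differ in your generated project, so treat it as an assumption:

```python
# app/settings.py in the generated backend (path is an assumption; adjust to your scaffold)
from llama_index.core import Settings
from llama_index.llms.bedrock import Bedrock

def init_settings():
    # Replace the default OpenAI LLM with Bedrock; the generated
    # AgentRunner + tools wiring stays exactly as create-llama produced it.
    Settings.llm = Bedrock(
        model="anthropic.claude-3-sonnet-20240229-v1:0",  # placeholder model ID
        region_name="us-east-1",
    )
```

With OpenAI as `Settings.llm` the agent uses the tool results correctly; with this Bedrock LLM the agent still lists the tools but the Observation content never makes it into the final answer.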