All the examples I see of llama-index agents use OpenAI models, but I would like to use models like "llama2" instead. If I put llm="llama2" in the highlighted part of the example code below, it throws an error and doesn't work. Does anyone know how to use other models with agents?
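For context, this is roughly the kind of setup I'm hoping for (a sketch only, assuming the Ollama integration and ReActAgent from the LlamaIndex docs; I haven't gotten this working, and the exact imports may differ by version):

```python
# Sketch: a ReAct agent backed by a local Ollama model instead of OpenAI.
# Assumes an Ollama server is running locally with "llama2" pulled.
from llama_index.llms.ollama import Ollama
from llama_index.core.agent import ReActAgent
from llama_index.core.tools import FunctionTool

def multiply(a: int, b: int) -> int:
    """Multiply two integers and return the product."""
    return a * b

llm = Ollama(model="llama2", request_timeout=120.0)

agent = ReActAgent.from_tools(
    [FunctionTool.from_defaults(fn=multiply)],
    llm=llm,       # local model here instead of an OpenAI model
    verbose=True,
)
```

Is something like this the right direction, or do agents require an OpenAI model under the hood?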
Does LlamaIndex have agents of its own, or does it only support agents from outside LlamaIndex (OpenAI Function agent, ReAct agent, and LLMCompiler agent)?
I was looking to use an agent developed by LlamaIndex, or any other agent that runs locally on my computer, so that I won't be sharing any information with an outside service (like OpenAI). Could someone kindly point me in the right direction?
Does anyone know how to turn a Python function into a tool? Similar to OpenAI Assistants, where you can turn any function into a tool and the assistant intelligently uses it: how can I do that with LlamaIndex (perhaps with data agents) without using OpenAI?
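To be concrete about what I mean by "turning a function into a tool": something that reads a function's name, docstring, and type hints and builds a description the agent can use to decide when to call it. A plain-Python sketch of the idea (no LlamaIndex here, and all names are mine, just to illustrate):

```python
import inspect
import typing

def function_to_tool_spec(fn):
    """Build a minimal tool description from a plain Python function,
    using its name, docstring, and type hints -- the same information
    an agent framework would need to decide when and how to call it."""
    hints = typing.get_type_hints(fn)
    hints.pop("return", None)  # only parameters matter for the call schema
    params = {name: {"type": hint.__name__} for name, hint in hints.items()}
    return {
        "name": fn.__name__,
        "description": inspect.getdoc(fn) or "",
        "parameters": params,
    }

def multiply(a: int, b: int) -> int:
    """Multiply two integers and return the product."""
    return a * b

spec = function_to_tool_spec(multiply)
print(spec["name"])        # multiply
print(spec["parameters"])  # {'a': {'type': 'int'}, 'b': {'type': 'int'}}
```

Is there a LlamaIndex equivalent of this that works with a local model?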
I am using an open-source LLM with an agent, all through LlamaIndex. It claims the agent is using the tools, but it's actually not, and at the end of each response from agent.stream_chat it prints "Agent: None".
Does anyone know what might be going on, why the tool isn't actually being used, and why it prints "Agent: None"?
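In case it helps, this is roughly how I'm consuming the stream (paraphrased from memory, so treat names as approximate):

```python
# Sketch: streaming a chat turn from the agent and printing tokens as they arrive.
response = agent.stream_chat("What is 12 times 7?")
for token in response.response_gen:
    print(token, end="", flush=True)
print()
```

Even with this, the final line printed is "Agent: None" rather than an answer that used the tool.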