Does LlamaIndex have agents of its own, or does it only support agents from outside of LlamaIndex (OpenAI Function agent, ReAct agent, and LLMCompiler Agent)?
I was looking to use an agent that is developed by LlamaIndex, or any other agent that runs locally on my computer so I won't be sharing any information with outside services (like OpenAI). Could someone kindly point me in the right direction?
oh haha, sorry, I thought the ReAct one was from LangChain and the OpenAI one was tied to OpenAI's services lol. I am using local LLMs, so if I also want my agent to be local (as I'm handling private data), then the ReAct agent is the one to go with, based on what you mentioned?
Sorry for another question: would you happen to know a workaround or have any better ideas, given that "most open source LLMs kinda suck at being an agent"? For data privacy reasons I can't use OpenAI models, so any thoughts on a workaround for agents?
Thank you for your response. I had a look at it, but I think that's still on Microsoft servers, which I don't think is very private. Think of it as working with patient data, where privacy really matters. But I'm also worried that local LLMs won't work as well as OpenAI's GPT models.
What do you think? I'd be grateful to hear your thoughts and opinions on that. Thank you.
Local models work well for basic Q&A. Less so for agentic tasks, or tasks that require structured outputs. In my experience anyway, that's the current state.
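If you want to try it anyway, wiring a local model into LlamaIndex's ReAct agent looks roughly like this. This is a minimal sketch, assuming you have `llama-index` plus the `llama-index-llms-ollama` integration installed and an Ollama server running locally with the named model pulled; the `multiply` tool is just a placeholder, and exact imports can differ between versions:

```python
# Minimal sketch: a ReAct agent over a local LLM served by Ollama.
# Everything runs on your machine, so no data is sent to outside services.
from llama_index.core.agent import ReActAgent
from llama_index.core.tools import FunctionTool
from llama_index.llms.ollama import Ollama


def multiply(a: float, b: float) -> float:
    """Multiply two numbers (placeholder tool for the demo)."""
    return a * b


# Local LLM via Ollama; "llama3" is an assumed model name, use whatever you've pulled.
llm = Ollama(model="llama3", request_timeout=120.0)
multiply_tool = FunctionTool.from_defaults(fn=multiply)

# ReAct agent with the local LLM and the placeholder tool.
agent = ReActAgent.from_tools([multiply_tool], llm=llm, verbose=True)
response = agent.chat("What is 12.3 times 4.56?")
print(response)
```

Whether this works well in practice depends heavily on the local model; smaller open-source models often struggle to follow the ReAct tool-calling format reliably.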
Azure has privacy agreements for enterprise customers, etc. It's quite secure, but it would take some wrangling to get people in healthcare on board, I'd think.