Find answers from the community

Lucas
Joined September 25, 2024
Is there a way to use a remote LLM but have everything else run locally?
1 comment
Is there any way I can get the agent's thoughts so I can log them?
3 comments
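One stdlib-only approach, sketched under an assumption: with `verbose=True`, `ReActAgent` prints its Thought/Action/Observation steps to stdout, so those lines can be captured and forwarded to a logger. The `fake_chat` stand-in below is hypothetical, just to show the capture; in practice you would pass `agent.chat`.

```python
import contextlib
import io
import logging

logging.basicConfig(level=logging.INFO)
logger = logging.getLogger("agent_thoughts")

def chat_and_log(chat_fn, message):
    # Capture whatever the chat call prints to stdout (the agent's
    # verbose Thought/Action/Observation trace) and forward each
    # line to a logger; return both the response and the raw trace.
    buffer = io.StringIO()
    with contextlib.redirect_stdout(buffer):
        response = chat_fn(message)
    captured = buffer.getvalue()
    for line in captured.splitlines():
        logger.info(line)
    return response, captured

# Hypothetical stand-in for agent.chat, just to demonstrate the capture:
def fake_chat(message):
    print("Thought: the user wants the date")
    return "2024-09-25"

response, thoughts = chat_and_log(fake_chat, "what is today's date?")
```

LlamaIndex also ships callback/instrumentation hooks for this, but the stdout capture above works with any agent that prints its trace.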
Is there a way to get rid of max iterations? I tried setting it to a large number and setting it to False, but still nothing. I also had a system prompt.
12 comments
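A hedged note on this one: in the v0.10-era API the cap is an integer keyword (`max_iterations` on `ReActAgent.from_tools`, as far as I know), so passing `False` will not disable it. A generic stdlib sketch of how such a cap typically behaves, to show why raising the integer is the fix:

```python
class MaxIterationsError(RuntimeError):
    """Raised when the loop hits its iteration cap (mirrors the agent's error)."""

def react_loop(step, is_done, max_iterations=10):
    # Generic ReAct-style loop: run step() repeatedly, stop when
    # is_done() is satisfied, otherwise fail once the cap is hit.
    state = None
    for _ in range(max_iterations):
        state = step(state)
        if is_done(state):
            return state
    raise MaxIterationsError("Reached max iterations")

# Raising the cap (not passing False) lets a longer task finish:
result = react_loop(lambda s: (s or 0) + 1, lambda s: s >= 5, max_iterations=20)
```

If a large `max_iterations` still loops forever, the model is usually failing to emit a final answer the parser recognizes, which is a different bug than the cap itself.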
Is there a way to give the LLM a description of the tools and how to use them?
1 comment
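For context, `FunctionTool.from_defaults` already builds a description the LLM sees from the wrapped function (and, as far as I know, accepts explicit `name=`/`description=` overrides). A stdlib sketch of that inference, so it is clear where the description comes from:

```python
import inspect

def describe_tool(fn):
    # Sketch: derive a tool spec from a plain Python function, the way
    # FunctionTool-style wrappers typically do it (an assumption about
    # the mechanism, not LlamaIndex's exact code): name from __name__,
    # description from the docstring, parameters from the signature.
    sig = inspect.signature(fn)
    return {
        "name": fn.__name__,
        "description": inspect.getdoc(fn) or "",
        "parameters": list(sig.parameters),
    }

def add_numbers(a: int, b: int) -> int:
    """Adds two numbers and returns the result"""
    return a + b

spec = describe_tool(add_numbers)
```

The practical upshot: write a detailed docstring (arguments, units, when to call it) and the agent's prompt will carry that guidance.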
Is there a way to get this working with the client library instead?
5 comments
from llama_index.llms import Ollama
from llama_index.agent import ReActAgent
from llama_index.tools import FunctionTool
from datetime import date

def add_numbers(a: int, b: int) -> int:
    """Adds two numbers and returns the result"""
    return a + b

def get_current_date() -> date:
    """returns the current date"""
    return date.today()

tools = [
    FunctionTool.from_defaults(fn=add_numbers),
    FunctionTool.from_defaults(fn=get_current_date),
]

llm = Ollama(model="mistral")
agent = ReActAgent.from_tools(tools, llm=llm, verbose=True)
response = agent.chat("what is today's date?")

I found this code online and am trying to run it

I tried pip install llama_index but I'm getting:

from llama_index.agent import ReActAgent
ImportError: cannot import name 'ReActAgent' from 'llama_index.agent' (unknown location)

Anyone know if the imports changed?
11 comments
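A hedged note on that ImportError: in LlamaIndex v0.10 the package was restructured, so the old top-level paths (`llama_index.agent`, `llama_index.tools`) no longer resolve; core classes moved under `llama_index.core`, and each LLM integration became its own pip package. A minimal probe, assuming the v0.10+ layout:

```python
# Assumption: llama-index >= 0.10 layout. The old import paths moved:
try:
    from llama_index.core.agent import ReActAgent    # was llama_index.agent
    from llama_index.core.tools import FunctionTool  # was llama_index.tools
    # The Ollama LLM now lives in its own package
    # (pip install llama-index-llms-ollama):
    from llama_index.llms.ollama import Ollama
    HAVE_LLAMA_INDEX = True
except ImportError:
    # llama-index (or the Ollama integration) is not installed,
    # or a different version's layout is in use.
    HAVE_LLAMA_INDEX = False
```

If the probe fails, check your installed version (`pip show llama-index`) against the migration notes for the release you are on.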
I think the problem may be in this line. Can I delete it?
2 comments
When I run this locally it works fine. I made a custom LLM with a remote API running on my own server, and it loops like this. Has anyone else seen this error?
6 comments