Find answers from the community

tomi
Offline, last seen 3 months ago
Joined September 25, 2024
ValueError: wrapper has not been initialized
29 comments
I'm getting this error on a ReAct agent:

Plain Text
Observation: Error: 'function' object has no attribute 'metadata'
15 comments
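A common cause of the 'metadata' error above is passing a plain Python function to the agent instead of wrapping it in a FunctionTool. A minimal sketch of the wrapping step, assuming llama-index-core and the OpenAI LLM integration are installed; the multiply function and model name are placeholders:

Plain Text
from llama_index.core.agent import ReActAgent
from llama_index.core.tools import FunctionTool
from llama_index.llms.openai import OpenAI

# placeholder tool function for illustration
def multiply(a: int, b: int) -> int:
    """Multiply two integers."""
    return a * b

# wrap the function so the agent can read its metadata (name, description, schema)
multiply_tool = FunctionTool.from_defaults(fn=multiply)

agent = ReActAgent.from_tools(
    [multiply_tool], llm=OpenAI(model="gpt-4o-mini"), verbose=True
)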
How do I debug the ReAct agent? It's not creating the correct action.
7 comments
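For the debugging question above, one low-effort option is to turn on verbose output and the simple global callback handler, so each thought/action/observation step and every raw LLM call is printed. A minimal sketch, assuming a tools list and an LLM are already set up:

Plain Text
from llama_index.core import set_global_handler
from llama_index.core.agent import ReActAgent
from llama_index.llms.openai import OpenAI

# print every LLM prompt and completion to stdout
set_global_handler("simple")

# verbose=True prints each Thought / Action / Observation of the ReAct loop
agent = ReActAgent.from_tools(tools, llm=OpenAI(model="gpt-4o-mini"), verbose=True)
response = agent.chat("your query here")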
How do I give a ReActAgent a system prompt?
12 comments
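For the ReActAgent system-prompt question, one documented approach is to override the agent's system prompt with update_prompts. A minimal sketch, assuming tools, llm, and a react_system_header_str template string already exist; the exact prompt key can differ between versions, so check agent.get_prompts() first:

Plain Text
from llama_index.core import PromptTemplate
from llama_index.core.agent import ReActAgent

agent = ReActAgent.from_tools(tools, llm=llm, verbose=True)

# inspect the available prompt keys first; the key name can vary by version
print(list(agent.get_prompts().keys()))

# react_system_header_str should contain your instructions plus the original
# ReAct formatting rules (Thought / Action / Observation), or parsing may break
agent.update_prompts(
    {"agent_worker:system_prompt": PromptTemplate(react_system_header_str)}
)
agent.reset()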
How do I add a system prompt to an instance of ReActAgentWorker?
2 comments
Is there any class that uses agents as tools?
5 comments
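For the agents-as-tools question, a common pattern is to wrap a sub-agent in a QueryEngineTool (agents expose the query-engine interface) and hand it to a top-level agent. A minimal sketch, assuming sub_agent and llm already exist; the tool name and description are illustrative:

Plain Text
from llama_index.core.agent import ReActAgent
from llama_index.core.tools import QueryEngineTool, ToolMetadata

# wrap an existing sub-agent as a tool for a top-level agent
sub_agent_tool = QueryEngineTool(
    query_engine=sub_agent,
    metadata=ToolMetadata(
        name="research_agent",
        description="Answers detailed questions about the indexed documents.",
    ),
)

top_agent = ReActAgent.from_tools([sub_agent_tool], llm=llm, verbose=True)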
tomi · @Logan M

1 comment
When I create an instance of FunctionCallingAgentWorker with some tools, is there a way to add more tools after creating this instance? Can I manipulate tool_retriever somehow?
11 comments
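For the FunctionCallingAgentWorker question, one option worth trying is to back the worker with an ObjectIndex-based tool_retriever instead of a fixed tool list, so tools can be inserted into the index after the worker is created. A minimal sketch, assuming initial_tools, new_tool, and llm already exist; whether insertions are picked up immediately may depend on the version:

Plain Text
from llama_index.core import VectorStoreIndex
from llama_index.core.agent import AgentRunner, FunctionCallingAgentWorker
from llama_index.core.objects import ObjectIndex

# index the initial tools so they can be retrieved per query
obj_index = ObjectIndex.from_objects(initial_tools, index_cls=VectorStoreIndex)

worker = FunctionCallingAgentWorker.from_tools(
    tool_retriever=obj_index.as_retriever(similarity_top_k=2),
    llm=llm,
    verbose=True,
)
agent = AgentRunner(worker)

# later: make another tool available by inserting it into the same object index
obj_index.insert_object(new_tool)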
How do I create my own LlamaPack or LlamaHub tool? Are they the same thing?
2 comments
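On the custom-tool part of the question above: a LlamaHub tool is typically a BaseToolSpec subclass whose listed methods are exposed to agents as individual tools, while a LlamaPack bundles a larger workflow template. A minimal sketch of a custom tool spec; the class and method names are illustrative only:

Plain Text
from llama_index.core.tools.tool_spec.base import BaseToolSpec

class WeatherToolSpec(BaseToolSpec):
    """Illustrative custom tool spec; every method named in spec_functions
    is exposed to agents as a separate tool."""

    spec_functions = ["get_temperature"]

    def get_temperature(self, city: str) -> str:
        """Return a fake temperature reading for a city."""
        return f"The temperature in {city} is 20 degrees Celsius."

# convert the spec into a list of tools for an agent
tools = WeatherToolSpec().to_tool_list()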
Is anyone else getting this error:

Plain Text
...
_get_async_stream_ai_response
    await chat_stream_response._is_function_false_event.wait()
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
AttributeError: 'StreamingAgentChatResponse' object has no attribute '_is_function_false_event'. Did you mean: 'is_function_false_event'?
INFO:httpx:HTTP Request: POST https://api.openai.com/v1/chat/completions "HTTP/1.1 200 OK"
HTTP Request: POST https://api.openai.com/v1/chat/completions "HTTP/1.1 200 OK"
22 comments
tomi · Update

@Logan M I keep running into this:

Plain Text
TypeError: Can't instantiate abstract class Anthropic with abstract method _prepare_chat_with_tools


Plain Text
TypeError: Can't instantiate abstract class MistralAI with abstract method _prepare_chat_with_tools


It's strange because I'm importing them correctly:

Plain Text
from llama_index.llms.anthropic import Anthropic
from llama_index.llms.mistralai import MistralAI
22 comments
tomi · Trace

@Logan M, is this new:

Plain Text
Traceback (most recent call last):
  File "/app/main.py", line 1, in <module>
    from llama_index.agent.openai import OpenAIAgent
  File "/usr/local/lib/python3.12/site-packages/llama_index/agent/openai/__init__.py", line 1, in <module>
    from llama_index.agent.openai.base import OpenAIAgent
  File "/usr/local/lib/python3.12/site-packages/llama_index/agent/openai/base.py", line 20, in <module>
    from llama_index.agent.openai.step import OpenAIAgentWorker
  ...
  File "/usr/local/lib/python3.12/site-packages/llama_index/core/instrumentation/__init__.py", line 1, in <module>
    from llama_index.core.instrumentation.dispatcher import Dispatcher, Manager
  File "/usr/local/lib/python3.12/site-packages/llama_index/core/instrumentation/dispatcher.py", line 24, in <module>
    class Dispatcher(BaseModel):
  File "/usr/local/lib/python3.12/site-packages/pydantic/v1/main.py", line 286, in __new__
    cls.__try_update_forward_refs__()
  File "/usr/local/lib/python3.12/site-packages/pydantic/v1/main.py", line 807, in __try_update_forward_refs__
    update_model_forward_refs(cls, cls.__fields__.values(), cls.__config__.json_encoders, localns, (NameError,))
  File "/usr/local/lib/python3.12/site-packages/pydantic/v1/typing.py", line 554, in update_model_forward_refs
    update_field_forward_refs(f, globalns=globalns, localns=localns)
  File "/usr/local/lib/python3.12/site-packages/pydantic/v1/typing.py", line 520, in update_field_forward_refs
    field.type_ = evaluate_forwardref(field.type_, globalns, localns or None)
                  ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/pydantic/v1/typing.py", line 66, in evaluate_forwardref
    return cast(Any, type_)._evaluate(globalns, localns, set())
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
TypeError: ForwardRef._evaluate() missing 1 required keyword-only argument: 'recursive_guard'
10 comments
tomi · Llm

Hey @Logan M, I'm getting:
Plain Text
E   ModuleNotFoundError: No module named 'llama_index.llms.anthropic'
2 comments
tomi · Learn

Is it possible to create an agent that learns from its responses?
1 comment
I'm following the new blog post on how to deploy a create-llama app, and there's an issue where the Python FastAPI server just hangs. I've narrowed it down to here:

Plain Text
    chat_engine = index.as_chat_engine()
    print("sending")
    response = chat_engine.stream_chat(lastMessage.content, messages)
    print("received")


"received" never gets printed, only "sending"
16 comments
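For the hanging stream_chat call above, a useful isolation step is to run the same chat engine outside FastAPI and consume the token generator directly; if this also hangs, the problem is in the chat engine or the LLM call rather than in the server's streaming plumbing. A minimal sketch, assuming index, lastMessage, and messages are the same objects as in the snippet above:

Plain Text
# minimal reproduction outside FastAPI
chat_engine = index.as_chat_engine()
response = chat_engine.stream_chat(lastMessage.content, messages)

# consume the streamed tokens directly
for token in response.response_gen:
    print(token, end="", flush=True)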