pinecone.core.openapi.shared.exceptions.NotFoundException: (404)
Reason: Not Found
HTTP response headers: HTTPHeaderDict({'Date': 'Tue, 15 Oct 2024 15:46:11 GMT', 'Content-Type': 'application/json', 'Content-Length': '55', 'Connection': 'keep-alive', 'x-pinecone-request-latency-ms': '41', 'x-pinecone-request-id': '6886694976858348241', 'x-envoy-upstream-service-time': '42', 'server': 'envoy'})
HTTP response body: {"code":5,"message":"Namespace not found","details":[]}
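If the namespace is expected to exist, one way to double-check is to list the index's namespaces with describe_index_stats; a minimal sketch using the current pinecone Python client, with "my-index" as a placeholder index name:

import os

from pinecone import Pinecone

pc = Pinecone(api_key=os.environ["PINECONE_API_KEY"])
index = pc.Index("my-index")  # placeholder index name

# Per-namespace vector counts; a namespace only appears here after
# at least one vector has been upserted into it.
stats = index.describe_index_stats()
print(stats.namespaces)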
import os

# Requires the llama-index-storage-chat-store-upstash package
from llama_index.storage.chat_store.upstash import UpstashChatStore

chat_store = UpstashChatStore(
    redis_url=os.environ.get("UPSTASH_REDIS_URL"),
    redis_token=os.environ.get("UPSTASH_REDIS_TOKEN"),
    ttl=300,  # Optional: Time to live in seconds
)
'UpstashChatStore' object has no attribute '__pydantic_private__'
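For reference, a chat store like the one above is normally wired into a memory object rather than used on its own; a minimal sketch, reusing chat_store from the snippet and a made-up chat_store_key:

from llama_index.core.memory import ChatMemoryBuffer

memory = ChatMemoryBuffer.from_defaults(
    token_limit=3000,
    chat_store=chat_store,
    chat_store_key="user-123",  # placeholder per-user/session key
)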
>>> composable_memory.primary_memory.get("my cpa")
[ChatMessage(role=<MessageRole.SYSTEM: 'system'>, content='You are a helpful marketing assistant', additional_kwargs={}), ChatMessage(role=<MessageRole.USER: 'user'>, content='my cpa is $350', additional_kwargs={}), ChatMessage(role=<MessageRole.ASSISTANT: 'assistant'>, content='Are you looking for ways to reduce your CPA, or do you need help with analyzing or understanding this metric further?', additional_kwargs={})]
>>> composable_memory.secondary_memory_sources[0].get("what is my cpa")
[ChatMessage(role=<MessageRole.USER: 'user'>, content='my cpa is $350', additional_kwargs={}), ChatMessage(role=<MessageRole.ASSISTANT: 'assistant'>, content='Are you looking for ways to reduce your CPA, or do you need help with analyzing or understanding this metric further?', additional_kwargs={})]
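A composable memory that behaves like the session above can be assembled roughly as follows; this sketch assumes OpenAI embeddings and the default in-memory vector store:

from llama_index.core.memory import (
    ChatMemoryBuffer,
    SimpleComposableMemory,
    VectorMemory,
)
from llama_index.embeddings.openai import OpenAIEmbedding

vector_memory = VectorMemory.from_defaults(
    vector_store=None,  # None -> a fresh in-memory vector store
    embed_model=OpenAIEmbedding(),
    retriever_kwargs={"similarity_top_k": 2},
)
primary_memory = ChatMemoryBuffer.from_defaults(token_limit=3000)

composable_memory = SimpleComposableMemory.from_defaults(
    primary_memory=primary_memory,
    secondary_memory_sources=[vector_memory],
)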
llama-index-llms-portkey package

PydanticUserError: `Portkey` is not fully defined; you should define `Modes`, then call `Portkey.model_rebuild()`. For further information visit https://errors.pydantic.dev/2.9/u/class-not-fully-defined
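This is Pydantic v2's class-not-fully-defined error; a generic sketch of what triggers that class of error and how model_rebuild() resolves it (the class and field names here are illustrative, not the real Portkey integration's models):

from pydantic import BaseModel


class Portkey(BaseModel):
    mode: "Modes"  # forward reference Pydantic cannot resolve yet


# Instantiating Portkey at this point raises:
# PydanticUserError: `Portkey` is not fully defined; ...


class Modes(BaseModel):
    name: str


Portkey.model_rebuild()  # resolves the forward reference now that Modes exists
print(Portkey(mode=Modes(name="single")))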
from llama_index.core import VectorStoreIndex
from llama_index.core.objects import ObjectIndex, SimpleToolNodeMapping
from llama_index.core.tools import FunctionTool, QueryEngineTool, ToolMetadata

tools_metadata = [
    {
        "query_engine": self.vector_query_engine,
        "metadata": ToolMetadata(
            name="user_uploaded_documents",
            description=(
                "use this tool when asked about specific documents uploaded by the user. "
                "Do not worry if the file does not mention the name of the file."
            ),
        ),
    },
    {
        "fn": generate_images,
        "description": (
            "generate_images(prompt: str, n: int) -> str\n\n"
            "Only use this tool to create images or pictures upon user request. "
            "Useful for generating images"
        ),
        "fn_schema": DalleSchemaModel,
    },
    {
        "fn": search_with_bing,
        "description": (
            "search_function(query: str) -> str\n\n"
            "Use this tool to retrieve real-time and up-to-date information to best answer a user query. "
            "This includes, but is not limited to, topics such as current events, weather updates, "
            "stock market data, and any other information that is subject to frequent changes"
        ),
        "fn_schema": BingSearchModel,
    },
]

tools = []
for tool in tools_metadata:
    if "query_engine" in tool:
        tools.append(
            QueryEngineTool(query_engine=tool["query_engine"], metadata=tool["metadata"])
        )
    else:
        tools.append(
            FunctionTool.from_defaults(
                fn=tool["fn"],
                description=tool["description"],
                fn_schema=tool["fn_schema"],
            )
        )

tool_mapping = SimpleToolNodeMapping.from_objects(tools)
tool_index = ObjectIndex.from_objects(
    tools,
    tool_mapping,
    VectorStoreIndex,
)
tool_retriever = tool_index.as_retriever(similarity_top_k=1)
picked_tool = tool_retriever.retrieve(query)[0]
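As a variation on manually retrieving picked_tool, the same object index can be handed to an agent so tool selection happens per message; a sketch assuming the OpenAI agent integration package is installed:

from llama_index.agent.openai import OpenAIAgent

agent = OpenAIAgent.from_tools(
    tool_retriever=tool_index.as_retriever(similarity_top_k=2),
    verbose=True,
)
response = agent.chat(query)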
context_str
Can I set agent.chat_store = redis_chat_store?
Is the ValueError: Could not parse output raised by the ReActAgent's reasoning_step?

ReAct Agent: what is the difference between the .query and .chat methods?

In /agent/react/step.py, for this message_content: "Thought: blah blah." The message_content contains "Thought:"?
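On the .query vs .chat question above: agents expose both methods, and the usual distinction is that .chat keeps and reuses conversation history across turns while .query is handled as a standalone, history-free request; a sketch assuming a ReActAgent built from existing tools and llm objects (exact behavior varies by version):

from llama_index.core.agent import ReActAgent

agent = ReActAgent.from_tools(tools, llm=llm, verbose=True)  # `tools` and `llm` assumed to exist

# .chat: the second turn can refer back to the first
print(agent.chat("My CPA is $350."))
print(agent.chat("How can I reduce it?"))

# .query: each call is treated as an independent request
print(agent.query("What does CPA stand for?"))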