I want to write a subclass of `BaseMemory` backed by Redis (`pip install llama-index-storage-chat-store-redis`). The problem is serializing `ChatMessage`: if a tool is called by an agent and the resulting message is added to memory, `.json()` won't work, raising `TypeError: Object of type 'ChatCompletionMessageToolCall' is not JSON serializable`.
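The `TypeError` is easy to reproduce without llama_index at all: `json.dumps` fails on any object it doesn't know how to encode. The class below is just a stand-in for the real openai type, used purely to trigger the same error.

```python
import json


class ChatCompletionMessageToolCall:
    """Stand-in for openai's tool-call type; any non-primitive triggers the error."""

    def __init__(self, id: str) -> None:
        self.id = id


payload = {
    "role": "assistant",
    "content": None,
    "tool_calls": [ChatCompletionMessageToolCall("call_123")],
}

try:
    json.dumps(payload)
except TypeError as exc:
    # json.dumps has no idea how to encode the tool-call object
    print(exc)
```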
When I then call `achat`, I got this:

```
File "/Users/jdc/projects/darma/darma_api/.venv/lib/python3.11/site-packages/openai/_base_client.py", line 1536, in post
    return await self.request(cast_to, opts, stream=stream, stream_cls=stream_cls)
File "/Users/jdc/projects/darma/darma_api/.venv/lib/python3.11/site-packages/openai/_base_client.py", line 1315, in request
    return await self._request(
File "/Users/jdc/projects/darma/darma_api/.venv/lib/python3.11/site-packages/openai/_base_client.py", line 1392, in _request
    raise self._make_status_error_from_response(err.response) from None
openai.BadRequestError: Error code: 400 - {'error': {'message': "Invalid value for 'content': expected a string, got null.", 'type': 'invalid_request_error', 'param': 'messages.[2].content', 'code': None}}
```
This is what the stored history looks like in Redis:

```
127.0.0.1:6379> lrange "chat_history_jdc:7ac6d17c-3203-4de9-8856-9b733a8229e1" 0 100
1) "{\"type\": \"user\", \"content\": \"my name is juan diego\"}"
2) "{\"type\": \"assistant\", \"content\": \"I'm here to assist you, Juan Diego. How can I help you today?\"}"
3) "{\"type\": \"user\", \"content\": \"whats my agenda tomorrow?\"}"
4) "{\"type\": \"assistant\", \"content\": null}"
5) "{\"type\": \"tool\", \"content\": \"None\"}"
```
Replaying that history produces a second error:

```
openai.BadRequestError: Error code: 400 - {'error': {'message': "Invalid parameter: messages with role 'tool' must be a response to a preceeding message with 'tool_calls'.", 'type': 'invalid_request_error', 'param': 'messages.[9].role', 'code': None}}
```
The culprit is in `RedisChatStore`:

```python
# Convert a ChatMessage to a json object for Redis
def _message_to_dict(message: ChatMessage) -> dict:
    return {"type": message.role, "content": message.content}


# Convert the json object in Redis to a ChatMessage
def _dict_to_message(d: dict) -> ChatMessage:
    return ChatMessage(role=d["type"], content=d["content"])
```
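To see the loss concretely, here is a minimal round trip with a plain dict standing in for `ChatMessage` (illustrative only; field names follow the reprs in this post):

```python
# Plain-dict stand-in for the ChatMessage an agent produces after a
# tool call (field names mirror llama_index's repr; illustrative only).
message = {
    "role": "assistant",
    "content": None,
    "additional_kwargs": {"tool_calls": [{"id": "call_123"}]},
}

# _message_to_dict keeps only role and content...
stored = {"type": message["role"], "content": message["content"]}

# ...so what _dict_to_message rebuilds is an assistant message with
# null content and no tool_calls: the entry OpenAI rejects with a 400.
restored = {"role": stored["type"], "content": stored["content"]}
print(restored)  # {'role': 'assistant', 'content': None}
```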
Only `role` and `content` are persisted. But for tool calls, OpenAI expects the assistant message to carry the `tool_calls` field:

```json
{
  "role": "assistant",
  "content": null,
  "tool_calls": [
    {
      "id": "call_jPkXvQqfbZdboBLd08eWdlUK",
      "function": {
        "arguments": "{\"start_date\":\"2024-03-28\",\"end_date\":\"2024-03-28\"}",
        "name": "function_name_here"
      },
      "type": "function"
    }
  ]
}
```
On the LlamaIndex side, that message lives in memory as a `ChatMessage` whose tool calls sit in `additional_kwargs`:

```
ChatMessage(role=<MessageRole.ASSISTANT: 'assistant'>, content=None, additional_kwargs={'tool_calls': [ChatCompletionMessageToolCall(id='call_jPkXvQqfbZdboBLd08eWdlUK', function=Function(arguments='{"start_date":"2024-03-28","end_date":"2024-03-28"}', name='function_name_here'), type='function')]})
```
The follow-up tool result then needs to go back as:

```json
{
  "role": "tool",
  "content": "....",
  "name": "function_name_here",
  "tool_call_id": "call_jPkXvQqfbZdboBLd08eWdlUK"
}
```
That message reaches `add_message` as:

```
ChatMessage(role=<MessageRole.TOOL: 'tool'>, content="...", additional_kwargs={'name': 'function_name_here', 'tool_call_id': 'call_jPkXvQqfbZdboBLd08eWdlUK'})
```

So everything needed is already there in `additional_kwargs`, but it's getting thrown away by `_message_to_dict` and `_dict_to_message`.