What's the recommended way to serialize ChatMessage?

Hi there, what's the recommended way to serialize ChatMessage? If a tool is called by an agent and a message is added to the memory, .json() won't work, triggering TypeError: Object of type 'ChatCompletionMessageToolCall' is not JSON serializable. I want to write a subclass of BaseMemory backed by Redis.
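For context, a minimal sketch that reproduces the failure, assuming llama_index v0.10 import paths and the openai v1.x client types; the id and function name are made up:

Plain Text
from llama_index.core.llms import ChatMessage, MessageRole
from openai.types.chat import ChatCompletionMessageToolCall
from openai.types.chat.chat_completion_message_tool_call import Function

msg = ChatMessage(
    role=MessageRole.ASSISTANT,
    content=None,
    additional_kwargs={
        "tool_calls": [
            ChatCompletionMessageToolCall(
                id="call_123",  # hypothetical id
                type="function",
                function=Function(name="get_agenda", arguments="{}"),
            )
        ]
    },
)
# TypeError: Object of type 'ChatCompletionMessageToolCall' is not JSON serializable
msg.json()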
there is a redis chat store actually
Otherwise, you may have to serialize the tool call and then call json()
tbh the framework should probably be converting those types to json before creating the message object
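Continuing the sketch above, that workaround might look like this; model_dump() assumes the pydantic-v2 objects the openai v1.x client returns:

Plain Text
# Dump the tool-call objects to plain dicts before serializing the message.
msg.additional_kwargs["tool_calls"] = [
    tc.model_dump() for tc in msg.additional_kwargs["tool_calls"]
]
payload = msg.json()  # now succeeds: everything is plain data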
@Logan M I think there's a bug when using it with achat. I got this:

Plain Text
                 β”” <openai.resources.chat.completions.AsyncCompletions object at 0x168fc0690>
  File "/Users/jdc/projects/darma/darma_api/.venv/lib/python3.11/site-packages/openai/_base_client.py", line 1536, in post
    return await self.request(cast_to, opts, stream=stream, stream_cls=stream_cls)
                 β”‚    β”‚       β”‚        β”‚            β”‚                  β”” openai.AsyncStream[openai.types.chat.chat_completion_chunk.ChatCompletionChunk]
                 β”‚    β”‚       β”‚        β”‚            β”” False
                 β”‚    β”‚       β”‚        β”” FinalRequestOptions(method='post', url='/chat/completions', files=None, json_data={'messages': [{'role': 'system', 'content':...
                 β”‚    β”‚       β”” <class 'openai.types.chat.chat_completion.ChatCompletion'>
                 β”‚    β”” <function AsyncAPIClient.request at 0x127ae7880>
                 β”” <openai.AsyncOpenAI object at 0x168fb6550>
  File "/Users/jdc/projects/darma/darma_api/.venv/lib/python3.11/site-packages/openai/_base_client.py", line 1315, in request
    return await self._request(
                 β”‚    β”” <function AsyncAPIClient._request at 0x127ae7920>
                 β”” <openai.AsyncOpenAI object at 0x168fb6550>
  File "/Users/jdc/projects/darma/darma_api/.venv/lib/python3.11/site-packages/openai/_base_client.py", line 1392, in _request
    raise self._make_status_error_from_response(err.response) from None
          β”‚    β”” <function BaseClient._make_status_error_from_response at 0x127ae5300>
          β”” <openai.AsyncOpenAI object at 0x168fb6550>

openai.BadRequestError: Error code: 400 - {'error': {'message': "Invalid value for 'content': expected a string, got null.", 'type': 'invalid_request_error', 'param': 'messages.[2].content', 'code': None}}
A message got sent to openai that had no content
I wonder what the issue was there πŸ€”
@Logan M I guess this helps:


Plain Text
127.0.0.1:6379> lrange "chat_history_jdc:7ac6d17c-3203-4de9-8856-9b733a8229e1" 0 100
1) "{\"type\": \"user\", \"content\": \"my name is juan diego\"}"
2) "{\"type\": \"assistant\", \"content\": \"I'm here to assist you, Juan Diego. How can I help you today?\"}"
3) "{\"type\": \"user\", \"content\": \"whats my agenda tomorrow?\"}"
4) "{\"type\": \"assistant\", \"content\": null}"
5) "{\"type\": \"tool\", \"content\": \"None\"}"
it all goes well until the tool call
I tried changing the null content to something else, but it's a deeper problem:
openai.BadRequestError: Error code: 400 - {'error': {'message': "Invalid parameter: messages with role 'tool' must be a response to a preceeding message with 'tool_calls'.", 'type': 'invalid_request_error', 'param': 'messages.[9].role', 'code': None}}
the format of what's saved and fed back is just wrong
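For reference, a hedged sketch of the sequence OpenAI expects: the 'tool' message must directly follow an assistant message carrying the matching tool_calls entry (the id and function name here are made up):

Plain Text
messages = [
    {"role": "user", "content": "whats my agenda tomorrow?"},
    {
        "role": "assistant",
        "content": None,
        "tool_calls": [{
            "id": "call_123",  # hypothetical id
            "type": "function",
            "function": {"name": "get_agenda", "arguments": "{}"},
        }],
    },
    # The tool result references the assistant's tool call by id.
    {"role": "tool", "tool_call_id": "call_123", "content": "..."},
]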
I will have to take a note to fix this. Unless you want to dive in lol
@Logan M do you have the issue? I couldn't find it. I had some time to look further and might add at least some details to the issue.
So part of the problem is these functions in RedisChatStore:

Plain Text
from llama_index.core.llms import ChatMessage


# Convert a ChatMessage to a JSON-safe dict for Redis
def _message_to_dict(message: ChatMessage) -> dict:
    # NOTE: additional_kwargs (tool_calls, tool_call_id, ...) are dropped here
    return {"type": message.role, "content": message.content}


# Convert the dict stored in Redis back to a ChatMessage
def _dict_to_message(d: dict) -> ChatMessage:
    # NOTE: the rebuilt message likewise has no additional_kwargs
    return ChatMessage(role=d["type"], content=d["content"])
OpenAI's message format for tool calls requires more fields in the request than just role and content:

Plain Text
{
    "role": "assistant",
    "content": "None",
    "tool_calls": [
        {
            "id": "call_jPkXvQqfbZdboBLd08eWdlUK",
            "function": {
                "arguments": "{\"start_date\":\"2024-03-28\",\"end_date\":\"2024-03-28\"}",
                "name": "function_name_here"
            },
            "type": "function"
        }
    ]
}
The proper data does come into add_message in RedisChatStore, i.e.:

Plain Text
ChatMessage(role=<MessageRole.ASSISTANT: 'assistant'>, content=None, additional_kwargs={'tool_calls': [ChatCompletionMessageToolCall(id='call_jPkXvQqfbZdboBLd08eWdlUK', function=Function(arguments='{"start_date":"2024-03-28","end_date":"2024-03-28"}', name='function_name_here'), type='function')]})
Same for function responses; OpenAI requires:

Plain Text
{
    "role": "tool",
    "content": "....",
    "name": "function_name_here",
    "tool_call_id": "call_jPkXvQqfbZdboBLd08eWdlUK"
}
and in add_message it comes in as:

Plain Text
ChatMessage(role=<MessageRole.TOOL: 'tool'>, content="...", additional_kwargs={'name': 'function_name_here', 'tool_call_id': 'call_jPkXvQqfbZdboBLd08eWdlUK'})
So we have everything needed in the kwargs, but it's just getting thrown away by _message_to_dict and _dict_to_message.
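For reference, a hedged sketch of helpers that would preserve those kwargs; this is one possible fix, not necessarily what the PR linked below implements:

Plain Text
from llama_index.core.llms import ChatMessage


def _message_to_dict(message: ChatMessage) -> dict:
    kwargs = dict(message.additional_kwargs)
    if "tool_calls" in kwargs:
        # Dump pydantic tool-call objects to plain, JSON-safe dicts
        # (model_dump() assumes the openai v1.x pydantic-v2 objects).
        kwargs["tool_calls"] = [
            tc if isinstance(tc, dict) else tc.model_dump()
            for tc in kwargs["tool_calls"]
        ]
    return {
        "type": message.role,
        "content": message.content,
        "additional_kwargs": kwargs,
    }


def _dict_to_message(d: dict) -> ChatMessage:
    # Tool calls come back as plain dicts, which the OpenAI client
    # accepts in a messages payload.
    return ChatMessage(
        role=d["type"],
        content=d["content"],
        additional_kwargs=d.get("additional_kwargs", {}),
    )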
@jdcaballerov Actually I think someone just fixed this (I had thought this was you lol but maybe not)
https://github.com/run-llama/llama_index/pull/12309
lol, i was almost done :d