Updated 2 months ago

Is it possible to consume values of

Is it possible to consume values of raw and additional_kwargs when defining a custom LLM complete method? I'd like to get back additional information besides the text from the full response when running query() with my custom LLM class. Thanks!

Plain Text
@llm_completion_callback()
def complete(self, prompt: str, **kwargs: Any) -> CompletionResponse:
    return CompletionResponse(
        text="MY RESPONSE",
        raw=full_response,
        additional_kwargs=full_response,
    )

...

response = query_engine.query(query_str)
# Can't seem to access raw or additional_kwargs in the response...
print(response.raw) # Errors
print(response.additional_kwargs) # Also errors
13 comments
i just found out about response.source_nodes today, which might help.

it's not documented at https://docs.llamaindex.ai/en/stable/api_reference/response.html#llama_index.response.schema.Response, which only lists get_formatted_sources
Yes, that’s interesting, but I don’t need to track information about the sources; I need my LLM’s generation ID in addition to the text, so that users can provide feedback on the generation.
Any ideas @Logan M ?
You can implement a custom callback manager that looks for LLM events
Interesting! @Logan M Is the callback class able to access the values of raw or additional_kwargs that are returned as part of the CompletionResponse in the complete method of the LLM class? I'm not quite sure how the LLM class and the callback class are connected.
I need to capture information from the LLM response in a variable in the context of my query, not print the responses like in the example.
It can access that. It'll make the most sense if you use a debugger to see what the LLM event is logging
And to make it accessible after querying should I add methods to my callback class to store and retrieve the information I need?
Yea exactly, you could write the callback to put the data wherever you need 👍
OK, thank you so much for the guidance!
Here's some sample code for anyone else who needs to use a custom callback handler to store and retrieve additional_kwargs from a custom LLM class.
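The handler below is a minimal, self-contained sketch of that pattern. The names here are illustrative assumptions, not llama_index's actual types: the event-type string `"llm"`, the payload key `"response"`, and the `LLMDataHandler`/`FakeResponse` classes are stand-ins so the storage-and-retrieval logic can be read and run on its own. In a real project you would subclass llama_index's `BaseCallbackHandler`, register it with a `CallbackManager`, and filter on `CBEventType.LLM` instead.

```python
# Sketch: a callback handler that captures additional_kwargs from LLM
# completion events. Event type "llm", payload key "response", and
# FakeResponse are hypothetical stand-ins for llama_index types.
from typing import Any, Dict, Optional


class LLMDataHandler:
    """Stores data from the most recent LLM completion event."""

    def __init__(self) -> None:
        self.last_additional_kwargs: Dict[str, Any] = {}

    def on_event_end(
        self,
        event_type: str,
        payload: Optional[Dict[str, Any]] = None,
        **kwargs: Any,
    ) -> None:
        # Other event types (retrieval, synthesis, ...) also flow through
        # the callback manager, so filter for LLM events only.
        if event_type == "llm" and payload is not None:
            response = payload.get("response")
            if response is not None:
                self.last_additional_kwargs = dict(
                    getattr(response, "additional_kwargs", {})
                )

    def get_generation_id(self) -> Optional[str]:
        """Retrieve the generation_id stored by the last LLM event."""
        return self.last_additional_kwargs.get("generation_id")


class FakeResponse:
    """Stand-in for a CompletionResponse carrying additional_kwargs."""

    additional_kwargs = {"generation_id": "gen-123"}


# Simulate the callback manager firing the end of an LLM event.
handler = LLMDataHandler()
handler.on_event_end("llm", payload={"response": FakeResponse()})
print(handler.get_generation_id())  # gen-123
```

After a query runs, `handler.get_generation_id()` returns whatever value your custom `complete` method placed into `additional_kwargs`, which you can then attach to the user-facing response for feedback tracking.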
And here's how to make additional_kwargs available to the callback handler in your custom LLM class:
Plain Text
@llm_completion_callback()
def complete(self, prompt: str, **kwargs: Any) -> CompletionResponse:
    response = self._call(prompt)
    return CompletionResponse(
        text=response["text"],
        additional_kwargs={"generation_id": response["generation_id"]},
        raw=response,
    )