
Is it possible to get OpenAI finish_reason from the LlamaIndex Response?
Probably only by using a custom instrumentation event handler to get the raw LLM response.
Can you link to any docs or examples on how I might start to do that? Or perhaps just to the relevant code I'd need to be overriding?
This is an exhaustive example, but:
https://docs.llamaindex.ai/en/stable/examples/instrumentation/instrumentation_observability_rundown/

Probably you'd want the LLM chat / LLM completion end events. The raw response is on `response.raw`, I think.