johnch · 5 months ago
Is it possible to get OpenAI finish_reason from the LlamaIndex Response?
Logan M · 5 months ago
probably only by using a custom instrumentation event handler to get the raw LLM response
johnch · 5 months ago
Can you link to any docs or examples on how I might start to do that? Or perhaps just to the relevant code I’d need to be overwriting?
Logan M · 5 months ago
This is an exhaustive example, but:
https://docs.llamaindex.ai/en/stable/examples/instrumentation/instrumentation_observability_rundown/
Probably you'd want the LLM chat / LLM completion end events. The raw response is on response.raw I think