Events

I want to modify this tutorial to collect events for each request independently,
and achieve independent event streaming for each request.
https://github.com/rsrohan99/rag-stream-intermediate-events-tutorial/tree/b5062c31d0c4a9cf619f673de84967f2f7c12e35

Based on my understanding, an event handler registered on the dispatcher is applied to all requests, so when requests are processed in parallel, events will be returned to the wrong user.
How can I distinguish events per user?

Events that I want to utilize:
https://docs.llamaindex.ai/en/stable/api_reference/instrumentation/event_types/
You can use the tagging mechanism to attach tags to local events

Plain Text
from llama_index.core.instrumentation.dispatcher import instrument_tags

with instrument_tags({"user": "me"}):
    # code in this block will emit events with this tag
    ...
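A handler could then route events to per-user queues by reading that tag. A minimal sketch, assuming events expose the active tags on an event.tags field (the handler class and queue registry names here are made up):

Plain Text
from queue import Queue
from typing import Dict

from llama_index.core.instrumentation import get_dispatcher
from llama_index.core.instrumentation.event_handlers import BaseEventHandler
from llama_index.core.instrumentation.events import BaseEvent

# hypothetical registry of per-user queues, e.g. one entry per active request
USER_QUEUES: Dict[str, Queue] = {}

class TagRoutingHandler(BaseEventHandler):
    """Route each event to the queue of the user named in its tags."""

    @classmethod
    def class_name(cls) -> str:
        return "TagRoutingHandler"

    def handle(self, event: BaseEvent, **kwargs) -> None:
        # assumption: events emitted inside `with instrument_tags({"user": ...})`
        # carry that dict on `event.tags`
        user = (event.tags or {}).get("user")
        if user in USER_QUEUES:
            USER_QUEUES[user].put(event)

dispatcher = get_dispatcher()
dispatcher.add_event_handler(TagRoutingHandler())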
Thank you very much. That will be one possible solution.

Is there any other way that would be easier to implement? It is a bit difficult to identify a user's events from the tag and return them in the stream.

I wanted to register an event handler on the dispatcher at request scope, as follows, but it does not work:


Plain Text
from queue import Queue
from llama_index.core.instrumentation import get_dispatcher

event_q = Queue()
event_handler = RequestEventHandler(event_q)
dispatcher = get_dispatcher()
dispatcher.add_event_handler(event_handler)

@Logan M
I'm not sure what modules you are using, but it sounds like there is an easier way to accomplish what you are doing

If you are just trying to attach data to a response from a fastapi endpoint, it should be much easier πŸ˜…
Tbh the instrumentation stuff isn't really meant to be user-facing; it's more for observability tools like Arize to hook into.

In most cases, it's probably easier to break whatever you are doing into more low level steps and log info as needed
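For example, a hypothetical per-request pipeline that does retrieval and synthesis as explicit steps, logging to a queue owned by that request (the index object and the event dicts are assumptions, not a LlamaIndex API):

Plain Text
from queue import Queue

from llama_index.core import get_response_synthesizer

def answer(query: str, index, events: Queue) -> str:
    # retrieve, logging whatever this request's frontend should see
    retriever = index.as_retriever(similarity_top_k=3)
    nodes = retriever.retrieve(query)
    events.put({"stage": "retrieve", "num_nodes": len(nodes)})

    # synthesize a response over the retrieved nodes
    synthesizer = get_response_synthesizer()
    response = synthesizer.synthesize(query, nodes)
    events.put({"stage": "synthesize"})
    return str(response)

Because the queue is passed in per call, concurrent requests never see each other's events.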
I see.

My implementation is mostly the same as this.

Only the following changes are introduced:
  • A CustomEventHandler is created and registered within the chat function
  • A Queue is created within the chat function
I think returning internal events to the frontend is a common requirement, but LlamaIndex does not seem to provide such functionality?

Do I need to write my own implementation, as perhaps you have done here, or should I try the callback system?
@Logan M
What information are you trying to get?
Tbh I would write this myself with a workflow: https://docs.llamaindex.ai/en/stable/module_guides/workflow/#workflows

There's a very nice streaming API
https://docs.llamaindex.ai/en/stable/module_guides/workflow/#streaming-events

And many examples
https://docs.llamaindex.ai/en/stable/module_guides/workflow/#examples

I love workflows because it makes it easy to expose and customize lower level operations.
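As a rough sketch, per-request streaming with a workflow looks like this (the workflow, step, and event names are illustrative):

Plain Text
import asyncio

from llama_index.core.workflow import (
    Context,
    Event,
    StartEvent,
    StopEvent,
    Workflow,
    step,
)

class ProgressEvent(Event):
    msg: str

class ChatFlow(Workflow):
    @step
    async def run_step(self, ctx: Context, ev: StartEvent) -> StopEvent:
        # events written to the stream are scoped to this run only
        ctx.write_event_to_stream(ProgressEvent(msg="retrieving..."))
        # ... retrieval / LLM calls would go here ...
        ctx.write_event_to_stream(ProgressEvent(msg="synthesizing..."))
        return StopEvent(result="done")

async def main():
    handler = ChatFlow(timeout=60).run()  # each .run() has its own stream
    async for ev in handler.stream_events():
        if isinstance(ev, ProgressEvent):
            print(ev.msg)  # forward to this request's response stream
    print(await handler)

asyncio.run(main())

Since every call to .run() gets its own context and event stream, concurrent requests can't mix events.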
I see. I will follow your approach.
If I implement the functionality as a workflow, I can also utilize llama deploy, so it seems like a good fit.
Thank you! πŸ™