@Logan M
I've attempted to wrap the LLM in a CustomQueryComponent so that I can explicitly invoke either chat() or complete(), but when I do I lose the trace that the llm object provides natively.
I don't fully understand CallbackManagers, but it seems like it should be possible to assign a tag to each component so that it shows up in the trace. Can you provide a link to an example of setting this up?
I am leveraging the documentation found here:
https://docs.llamaindex.ai/en/stable/examples/pipeline/query_pipeline_memory/#query-pipeline-contruction
Any advice? A code snippet showing how to add a label to a component so it appears in the Phoenix trace would make my life pretty great.