Displaying Citation Links on Hover

At a glance

The community member is using CitationQueryEngine to provide citations in their responses and wants to display links to the relevant parts of each source when hovering over a citation. They are having trouble determining which source belongs to which citation: the citation number only appears inside each source node's text, and response.source_nodes contains more sources than the response actually cites. Another community member suggests writing code to link each citation number to the corresponding source node.

The community member also wants to use CitationQueryEngine in a chat context with history, but is unsure how to do this, as there doesn't seem to be a way to pass the query engine into the as_chat_engine() method. Another community member suggests re-implementing the query engine using a workflow, which would allow adding chat history as an input and using llm.chat directly instead of a response synthesizer.

Hi all, I'm using CitationQueryEngine to provide citations in my responses. On my UI, I want to be able to hover over a citation (e.g. [2]) and display links to the relevant part of each source.

Using response.source_nodes, what is the best way to know which source belongs to which citation? The text field contains the citation number (e.g. "Source 2: blah blah"), but I would have to parse this to extract the citation number.

I can't see any other field that just has the citation number itself. Is there one?
Also, the response only has one citation ([2]), but response.source_nodes has sources labelled Source 2, Source 3, and Source 5.
Right, there will be many sources, but the LLM might only choose to cite a subset of them.

You just need to write some code to link [2] to the Source 2 node in this case.
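For example, something like this (a rough sketch; it assumes the default "Source n:" prefix and [n] citation style, and the metadata key at the end is just illustrative):

import re

def map_citations_to_sources(response):
    # Citation numbers the LLM actually used in the answer text, e.g. [2]
    cited = {int(n) for n in re.findall(r"\[(\d+)\]", response.response)}

    citation_map = {}
    for node_with_score in response.source_nodes:
        # CitationQueryEngine prefixes each chunk's text with "Source <n>:"
        match = re.match(r"Source (\d+):", node_with_score.node.get_content())
        if match and int(match.group(1)) in cited:
            citation_map[int(match.group(1))] = node_with_score
    return citation_map

# response = citation_query_engine.query("...")
# for number, node in map_citations_to_sources(response).items():
#     print(number, node.node.metadata.get("file_name"))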
No worries, thanks.
The other question I have about CitationQueryEngine is that I want to use it in a chat context with history (i.e. index.as_chat_engine()). Is there a simple way to do this? There doesn't seem to be a way to pass the query engine into as_chat_engine().
Yeah, it's not as straightforward, to be honest. I would re-implement the query engine itself with a workflow (https://docs.llamaindex.ai/en/stable/module_guides/workflow/#workflows) and add chat history myself.

There's even an example of the citation query engine as a workflow
https://docs.llamaindex.ai/en/stable/examples/workflow/citation_query_engine/

I would just add chat history as an input and replace the response synthesizer with a direct llm.chat call.
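Something like this for the synthesis step (just a sketch, not the library's built-in API; the prompt wording and function name are mine):

from llama_index.core.llms import ChatMessage

def chat_with_citations(llm, retriever, chat_history, question):
    nodes = retriever.retrieve(question)

    # Number the chunks the same way CitationQueryEngine does ("Source n: ...")
    context = "\n\n".join(
        f"Source {i}: {n.node.get_content()}" for i, n in enumerate(nodes, start=1)
    )

    system = ChatMessage(
        role="system",
        content="Answer using only the sources below and cite them inline like [1].\n\n" + context,
    )
    messages = [system, *chat_history, ChatMessage(role="user", content=question)]

    # llm.chat replaces the response synthesizer; chat_history is a list of ChatMessage
    return llm.chat(messages), nodes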
Thanks for the links!
I did end up getting something working with CitationQueryEngine by looking at the implementation of as_chat_engine() and constructing a tool from the query engine. It feels a bit hacky, though, since I then had to reach into the chat response and pull the raw query engine response out of the tool output. So I'll give your suggestion a whirl. Thanks again.
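For reference, the hacky version was roughly this (names are illustrative, and citation_query_engine / llm are assumed to already exist):

from llama_index.core.tools import QueryEngineTool
from llama_index.agent.openai import OpenAIAgent

citation_tool = QueryEngineTool.from_defaults(
    query_engine=citation_query_engine,  # an existing CitationQueryEngine
    name="citations",
    description="Answers questions with numbered citations.",
)
agent = OpenAIAgent.from_tools([citation_tool], llm=llm)

chat_response = agent.chat("What does the report say about X?")
# Reach into the tool outputs to recover the original Response and its source_nodes
for tool_output in chat_response.sources:
    raw = tool_output.raw_output
    print(raw.source_nodes)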