Packs


So I am using "EmbeddedTablesUnstructuredRetrieverPack" for my tables use case. Is there any way to stream the response?
embedded_tables_unstructured_pack.run() is returning the right response, but how do I stream it?
Also, with this pack I can only ask questions and get an answer, right? I can't really "chat"?
Packs are really meant to be examples that you can modify and share.

I'd recommend looking at the source code and understanding what it's doing.

In this case, it's just a query engine. You can set streaming=True here:
https://github.com/run-llama/llama_index/blob/bf636a5ab52609b65c1825c19e1f8043b5bc5f45/llama-index-packs/llama-index-packs-recursive-retriever/llama_index/packs/recursive_retriever/embedded_tables_unstructured/base.py#L56

Then you could wrap the query engine as an agent tool, or use the retriever in a context chat engine to get chat history (rough sketches of both below).
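A minimal sketch of the context chat engine route, assuming the v0.10+ llama_index package layout and that pack is your already-built EmbeddedTablesUnstructuredRetrieverPack (the example questions are placeholders):

Plain Text
from llama_index.core.chat_engine import ContextChatEngine

# The pack stores the retriever it builds as `recursive_retriever`
# (see the base.py linked above), so it can be reused directly.
chat_engine = ContextChatEngine.from_defaults(
    retriever=pack.recursive_retriever,
)

# Each call retrieves table/text context and keeps the chat history.
print(chat_engine.chat("Summarize the revenue table."))
print(chat_engine.chat("How does that compare to the previous year?"))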
Thanks for this insight @Logan M ! While I did change the code line to "self.query_engine = RetrieverQueryEngine.from_args(self.recursive_retriever, streaming=True)", do we need changes to the "run" function as well to accomplish streaming? Kinda new here, a bit rusty with the concepts, AND on a clock. Any quick help would be really appreciated! ❀️
After doing that, the response should have a response_gen:

Plain Text
response = pack.run(...)
for token in response.response_gen:
  print(token, end="", flush=True)
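For the agent-tool route mentioned above (which also gives you chat, with the pack's query engine exposed as a tool), a rough sketch assuming the OpenAI agent package and an OpenAI key are set up; the tool name, description, and question are placeholders:

Plain Text
from llama_index.core.tools import QueryEngineTool, ToolMetadata
from llama_index.agent.openai import OpenAIAgent

# Wrap the pack's query engine as a tool the agent can call.
table_tool = QueryEngineTool(
    query_engine=pack.query_engine,
    metadata=ToolMetadata(
        name="embedded_tables",
        description="Answers questions about tables embedded in the document.",
    ),
)

# The agent keeps its own chat history across chat() calls.
agent = OpenAIAgent.from_tools([table_tool], verbose=True)
print(agent.chat("What is the total in the summary table?"))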
Dude @Logan M I don't know what WE would do without you!