Find answers from the community

.assets.
Joined September 25, 2024
Hello! I tried asking kapa, but maybe I'm on ignore (or, more likely, I did it wrong, lol). Anyway, what I wanted to ask is: many of the PDFs I'm attempting to ingest have their content streams encoded with various filters, for example a stream dictionary like << /Filter [ /ASCII85Decode /FlateDecode ] /Length 16927 /Subtype /Type1C >>

This results in nonsense being ingested when I try to load them. How would I go about detecting and decoding the various encoding formats while ingesting PDFs?
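For illustration, the two filters in that example dictionary can be undone with the Python standard library alone. This is only a sketch of the mechanism: real PDFs use many other filters (LZWDecode, DCTDecode, ...), filters are applied in the order the /Filter array lists them, and in practice a PDF parsing library handles all of this when extracting text.

```python
import base64
import zlib

def decode_stream(data: bytes, filters: list[str]) -> bytes:
    """Apply PDF stream filters in the order given by the /Filter array.
    Only the two filters from the example dictionary are handled here."""
    for f in filters:
        if f == "/ASCII85Decode":
            # Adobe framing: the stream body ends with "~>"
            data = base64.a85decode(data, adobe=True)
        elif f == "/FlateDecode":
            data = zlib.decompress(data)
        else:
            raise ValueError(f"unsupported filter: {f}")
    return data

# Round-trip check: encoding applies Flate then ASCII85, so decoding
# applies them in the /Filter order: ASCII85 first, then Flate.
raw = b"Hello, PDF stream"
encoded = base64.a85encode(zlib.compress(raw), adobe=True)
assert decode_stream(encoded, ["/ASCII85Decode", "/FlateDecode"]) == raw
```

The point is that the bytes in the stream are not "nonsense", just layered encodings; an ingestion pipeline needs a parser that unwraps each listed filter before text extraction.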
1 comment
Going from refine to LLMLingua + summarize basically turned a 4.5-minute sub-question chain into <1 minute with the same quality on my pipeline.
35 comments
Has anyone here used the LongLLMLinguaPostprocessor? It definitely increases the density and quality of information that the LLM retains from the context, but it seems to clobber the metadata, so I can't cite any sources.

Just wondering if anyone is having a similar issue w/ the metadata.
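One generic workaround, sketched here without assuming anything about the LongLLMLinguaPostprocessor internals: snapshot the source metadata before compression and re-attach it afterwards. The Node class and the toy compressor below are stand-ins, not the llama_index classes.

```python
from dataclasses import dataclass, field

@dataclass
class Node:
    # Stand-in for a retrieved node; not the llama_index class.
    text: str
    metadata: dict = field(default_factory=dict)

def preserve_metadata(postprocess, nodes):
    """Run a compressing postprocessor, then re-attach the union of the
    input nodes' metadata to any output node whose metadata was clobbered."""
    merged = {}
    for n in nodes:
        merged.update(n.metadata)
    out = postprocess(nodes)
    for n in out:
        if not n.metadata:
            n.metadata = dict(merged)
    return out

# Toy "compressor" that merges nodes and drops metadata, mimicking the
# behaviour described above.
compress = lambda ns: [Node(text=" ".join(n.text for n in ns))]
result = preserve_metadata(
    compress,
    [Node("a", {"file": "x.pdf"}), Node("b", {"page": 3})],
)
assert result[0].metadata == {"file": "x.pdf", "page": 3}
```

Merging everything into one dict loses the per-chunk mapping, so citations become "one of these sources" rather than exact ones, but at least the source list survives compression.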
51 comments
I keep running into an issue where whenever I set use_async=True for my query engine, I hit asyncio.exceptions.CancelledError.

The relevant portions of the stack seem to be

Plain Text
File "/path/to/lib/python3.11/site-packages/llama_index/query_engine/sub_question_query_engine.py", line 226, in _aquery_subq
    response = await query_engine.aquery(question)
...
  File "/path/to/lib/python3.11/site-packages/llama_index/response_synthesizers/refine.py", line 325, in aget_response
    response = await self._agive_response_single(
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/path/to/lib/python3.11/site-packages/llama_index/response_synthesizers/refine.py", line 430, in _agive_response_single
    structured_response = await program.acall(
                          ^^^^^^^^^^^^^^^^^^^^
  File "/path/to/lib/python3.11/site-packages/llama_index/response_synthesizers/refine.py", line 79, in acall
    answer = await self._llm.apredict(
...
  File "/path/to/lib/python3.11/site-packages/openai/_base_client.py", line 1536, in post
    return await self.request(cast_to, opts, stream=stream, stream_cls=stream_cls)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/path/to/lib/python3.11/site-packages/openai/_base_client.py", line 1315, in request
    return await self._request(
           ^^^^^^^^^^^^^^^^^^^^
  File "/path/to/lib/python3.11/site-packages/openai/_base_client.py", line 1339, in _request
    response = await self._client.send(
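A CancelledError deep inside an awaited chain like this usually means something upstream cancelled the task, commonly a timeout wrapping the whole sub-question run. A minimal stdlib repro of that symptom (nothing LlamaIndex- or OpenAI-specific is assumed here; slow_request just stands in for the HTTP call at the bottom of the trace):

```python
import asyncio

async def slow_request():
    # Stands in for the awaited OpenAI HTTP call; if the caller is
    # cancelled, asyncio.CancelledError surfaces right here, at the
    # innermost await -- the same shape as the traceback above.
    await asyncio.sleep(10)
    return "response"

async def main():
    try:
        # A timeout anywhere up the chain cancels the inner task.
        await asyncio.wait_for(slow_request(), timeout=0.01)
    except asyncio.TimeoutError:
        return "cancelled by upstream timeout"

print(asyncio.run(main()))  # -> cancelled by upstream timeout
```

So it may be worth checking for a timeout configured on the query engine, the HTTP client, or whatever framework is driving the event loop, rather than in the refine synthesizer itself.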
98 comments