Could this be a potential bug, or am I doing something wrong?
I'm using a DocumentSummaryIndex in combination with its LLM retriever:
    doc_summary_index = DocumentSummaryIndex.from_documents(
        documents=documents,
        transformations=[splitter],
        response_synthesizer=response_synthesizer,
        show_progress=True,
    )

    retriever = DocumentSummaryIndexLLMRetriever(
        index=doc_summary_index,
        llm=llm,
        # choice_select_prompt=None,
        # choice_batch_size=10,
        # choice_top_k=1,
        # format_node_batch_fn=None,
        # parse_choice_select_answer_fn=None,
    )


I somehow get the following error:
Traceback (most recent call last):
  File "/home/_DEV/maas-ai-gmbh-new/main.py", line 424, in handleBasicQuestionWithDocumentSummaryIndex
    retrieved_nodes = retriever.retrieve(question)
  File "/home/_DEV/maas-ai-gmbh-new/maas-ai-gmbh-new/lib/python3.10/site-packages/llama_index/core/instrumentation/dispatcher.py", line 274, in wrapper
    result = func(*args, **kwargs)
  File "/home/_DEV/maas-ai-gmbh-new/maas-ai-gmbh-new/lib/python3.10/site-packages/llama_index/core/base/base_retriever.py", line 244, in retrieve
    nodes = self._retrieve(query_bundle)
  File "/home/_DEV/maas-ai-gmbh-new/maas-ai-gmbh-new/lib/python3.10/site-packages/llama_index/core/indices/document_summary/retrievers.py", line 98, in _retrieve
    raw_choices, relevances = self._parse_choice_select_answer_fn(
  File "/home/_DEV/maas-ai-gmbh-new/maas-ai-gmbh-new/lib/python3.10/site-packages/llama_index/core/indices/utils.py", line 104, in default_parse_choice_select_answer_fn
    answer_num = int(line_tokens[0].split(":")[1].strip())
IndexError: list index out of range
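For context on why this traceback ends where it does: the default answer parser assumes each line of the LLM's response looks like "Doc: <number>, Relevance: <score>", so the `split(":")[1]` lookup fails with an IndexError on any line that has no colon. A minimal reproduction of that parsing step (simplified from the `default_parse_choice_select_answer_fn` logic shown in the traceback; the free-text answer is a hypothetical example of LLM output):

```python
def parse_line(line: str) -> int:
    # Mirrors the failing line from llama_index/core/indices/utils.py:
    # take the first comma-separated token, split on ":", read index 1.
    line_tokens = line.split(",")
    return int(line_tokens[0].split(":")[1].strip())

print(parse_line("Doc: 2, Relevance: 9"))  # parses fine -> 2

try:
    # A free-text answer has no ":" in its first token, so split(":")
    # returns a single-element list and [1] raises IndexError.
    parse_line("The most relevant document is number 2.")
except IndexError as e:
    print("IndexError:", e)
```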

I'm following this tutorial: https://docs.llamaindex.ai/en/stable/examples/index_structs/doc_summary/DocSummary/
3 comments
So it looks like my LLM doesn't like the prompt instruction from choice_select_prompt and just doesn't format its output the way it's told to...
So for future reference: I fixed it by customizing the selection prompt and the parse_choice_select_answer_fn.
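For anyone hitting the same thing, a tolerant replacement parser could look roughly like this. This is a sketch, not the poster's actual fix: the signature mirrors the default parse function (answer text plus number of choices, returning doc numbers and relevance scores), and the regex is an assumption about the kinds of lines an LLM tends to emit.

```python
import re
from typing import List, Tuple

def tolerant_parse_choice_select_answer_fn(
    answer: str, num_choices: int, raise_error: bool = False
) -> Tuple[List[int], List[float]]:
    """Extract (doc number, relevance) pairs with a regex, skipping any
    line the LLM formatted differently instead of raising IndexError.
    raise_error is kept only to mirror the default function's signature."""
    answer_nums: List[int] = []
    answer_relevances: List[float] = []
    for line in answer.splitlines():
        # Accept "Doc: 2, Relevance: 9", "Document 1 (relevance 0.8)", etc.
        m = re.search(r"[Dd]oc(?:ument)?\D*(\d+)\D+(\d+(?:\.\d+)?)", line)
        if m is None:
            continue  # LLM chatter or a malformed line: just skip it
        num = int(m.group(1))
        if 1 <= num <= num_choices:
            answer_nums.append(num)
            answer_relevances.append(float(m.group(2)))
    return answer_nums, answer_relevances
```

You would then pass it as parse_choice_select_answer_fn=tolerant_parse_choice_select_answer_fn when constructing the DocumentSummaryIndexLLMRetriever, ideally together with a choice_select_prompt that tells the model to use that exact line format.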
Yeah, the LLM can sometimes fail to follow directions (especially when using local LLMs).