Find answers from the community

Dondo
Hey all. As part of a refactor I'm upgrading to the latest LlamaIndex version, and I'm suddenly getting the error "type_to_struct must be provided if type is index struct." while trying to manually construct a GPTListIndex. The code roughly looks like:
Plain Text
    # Rebuild the serialized index struct and document store from plain dicts
    index_struct = GPTListIndex.index_struct_cls.from_dict(data.get("index_struct", {}))
    docstore = DocumentStore.load_from_dict(data.get("docstore", {"docs": {}}))

    return GPTListIndex(index_struct=index_struct, docstore=docstore)

where data roughly looks like:
Plain Text
{
  "index_struct": {
    "text": null,
    "doc_id": "56c33884-3aaa-49a2-af48-43afd7e31bd1",
    "embedding": null,
    "extra_info": null,
    "nodes": [
      {
        "text": "Me: message\nYou:other message",
        "doc_id": "1b8ec128-8117-4318-a8b9-1c791b163296",
        "embedding": null,
        "extra_info": null,
        "index": 0,
        "child_indices": [],
        "ref_doc_id": "484c1022-0c19-4513-88f1-1bb4349bfbda",
        "node_info": {
          "start": 0,
          "end": 113
        }
      }
    ]
  },
  "docstore": {
    "docs": {
      "56c33884-3aaa-49a2-af48-43afd7e31bd1": {
        "text": null,
        "doc_id": "56c33884-3aaa-49a2-af48-43afd7e31bd1",
        "embedding": null,
        "extra_info": null,
        "nodes": [
          {
            "text": "Me: message\nYou:other message",,
            "doc_id": "1b8ec128-8117-4318-a8b9-1c791b163296",
            "embedding": null,
            "extra_info": null,
            "index": 0,
            "child_indices": [],
            "ref_doc_id": "484c1022-0c19-4513-88f1-1bb4349bfbda",
            "node_info": {
              "start": 0,
              "end": 113
            }
          }
        ],
        "__type__": "list"
      }
    }
  }
}

I tried passing type_to_struct={"list": GPTListIndex}, but that produced AttributeError: type object 'GPTListIndex' has no attribute 'from_dict'. Any guidance is appreciated πŸ™
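For what it's worth, here's the direction I'm experimenting with next, purely as an untested sketch: since the AttributeError says GPTListIndex itself has no from_dict, my assumption is that type_to_struct wants the index struct class (GPTListIndex.index_struct_cls, the same class I already call from_dict on above) rather than the index class. Treat the keyword usage below as a guess, not something confirmed against the library:

Plain Text
    # Untested assumption: map the "__type__" value ("list") to the index *struct*
    # class, which exposes from_dict, instead of the GPTListIndex index class.
    index_struct = GPTListIndex.index_struct_cls.from_dict(data.get("index_struct", {}))
    docstore = DocumentStore.load_from_dict(
        data.get("docstore", {"docs": {}}),
        type_to_struct={"list": GPTListIndex.index_struct_cls},
    )

    return GPTListIndex(index_struct=index_struct, docstore=docstore)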
1 comment
Hey all, I'm trying to use LlamaIndex as a memory module for a ChatGPT LangChain predictor. Could anyone tell me if I am on the right track? I'm having trouble getting my head around it πŸ˜…

Plain Text
    # Wrap the existing list index so it can serve as chat memory
    chat_history_index = GPTListIndex(
        index_struct=index_struct,
        docstore=docstore,
        # Disabled, see the EDIT in this Discord post
        # llm_predictor=ChatGPTLLMPredictor()
    )

    memory = GPTIndexChatMemory(
        index=chat_history_index,
        memory_key="history",
        ai_prefix="AI",
        human_prefix="Human",
        query_kwargs={"response_mode": "compact"},
        return_source=True,
        return_messages=True
    )

    llm_chain = ConversationChain(
        llm=ChatOpenAI(**open_ai_params),
        prompt=chat_prompt,
        memory=memory,
    )

    llm_chain.run(input=input)


Predictions are returned successfully, and the conversation turns are written to the GPTListIndex, but the memory does not appear to be used when prompting the LLM.
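To narrow this down I'm planning two checks, sketched below. Both rely on standard LangChain behaviour rather than anything LlamaIndex-specific: load_memory_variables is the generic memory inspection hook, and verbose=True makes the chain print the fully formatted prompt, so it should be visible whether the "history" variable (which chat_prompt presumably needs as a placeholder) is actually being filled in. The sample input string is just a placeholder.

Plain Text
    # 1) Inspect what the memory returns for an arbitrary input (generic LangChain memory API)
    print(memory.load_memory_variables({"input": "what did I just tell you?"}))

    # 2) Re-run the chain with verbose=True so the formatted prompt, including the
    #    "history" slot, is printed before it is sent to the LLM
    llm_chain = ConversationChain(
        llm=ChatOpenAI(**open_ai_params),
        prompt=chat_prompt,
        memory=memory,
        verbose=True,
    )
    llm_chain.run(input="what did I just tell you?")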

EDIT: Logan M explained that ChatGPTLLMPredictor is broken/deprecated, so please disregard the following.

At this point I'm getting the following error:

Plain Text
  File "/Users/dondo/Library/Caches/pypoetry/virtualenvs/vana-gpt-me-IE1VmXUs-py3.10/lib/python3.10/site-packages/llama_index/llm_predictor/base.py", line 222, in predict
    formatted_prompt = prompt.format(llm=self._llm, **prompt_args)
AttributeError: 'ChatGPTLLMPredictor' object has no attribute '_llm'. Did you mean: 'llm'?


I'm using the latest llama-index and langchain packages.
7 comments