
Updated 10 months ago

anthropic-cookbook/third_party/LlamaInde...

At a glance

The post shares the complete code a community member is running, hosted on GitHub. In the comments, one member confirms the code works fine for them, while another observes that llm.metadata appears to be None, which should not be possible with the provided code. The original poster notes they are using the latest LlamaIndex version, installed in a virtual environment to pick up the latest dependencies. One member links a Google Colab notebook where the same code runs without error. No comment is explicitly marked as the answer.

7 comments
Are you sure that's the exact code you are running? Nothing changed?
it works fine for me
seems like somehow llm.metadata is None, but this really isn't possible with the code that is in there
i changed nothing
File ~/model_Chat/env/lib/python3.11/site-packages/llama_index/core/indices/prompt_helper.py:117, in PromptHelper.from_llm_metadata(cls, llm_metadata, chunk_overlap_ratio, chunk_size_limit, tokenizer, separator)
    103 @classmethod
    104 def from_llm_metadata(
    105     cls,
    (...)
    110     separator: str = " ",
    111 ) -> "PromptHelper":
    112     """Create from llm predictor.
    113
    114     This will autofill values like context_window and num_output.
    115
    116     """
--> 117     context_window = llm_metadata.context_window
    119     if llm_metadata.num_output == -1:
    120         num_output = DEFAULT_NUM_OUTPUTS

AttributeError: 'NoneType' object has no attribute 'context_window'
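For readers hitting the same traceback: the failure mode is that PromptHelper.from_llm_metadata dereferences llm_metadata.context_window without checking for None, so a misconfigured or version-mismatched LLM object crashes here. The sketch below is a minimal, self-contained illustration of that failure mode and a defensive fallback; LLMMetadata, the default values, and from_llm_metadata are simplified stand-ins, not LlamaIndex's actual implementation.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

# Illustrative defaults, mirroring the names seen in the traceback.
DEFAULT_CONTEXT_WINDOW = 3900
DEFAULT_NUM_OUTPUTS = 256

@dataclass
class LLMMetadata:
    # Stand-in for the metadata object an LLM wrapper normally provides.
    context_window: int = DEFAULT_CONTEXT_WINDOW
    num_output: int = DEFAULT_NUM_OUTPUTS

def from_llm_metadata(llm_metadata: Optional[LLMMetadata]) -> Tuple[int, int]:
    # The guard the traceback shows is missing: if llm_metadata is None,
    # accessing .context_window raises AttributeError. Fall back to defaults
    # instead of crashing.
    if llm_metadata is None:
        llm_metadata = LLMMetadata()
    context_window = llm_metadata.context_window
    # -1 is a sentinel meaning "use the default output budget".
    num_output = (
        DEFAULT_NUM_OUTPUTS
        if llm_metadata.num_output == -1
        else llm_metadata.num_output
    )
    return context_window, num_output
```

In practice, if you see this AttributeError the real fix is upstream: make sure the LLM object is actually constructed and registered (e.g. that the service/settings object holds a live LLM, and that all llama-index packages are on matching versions), rather than patching PromptHelper itself.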
I am using the latest LlamaIndex version, and I installed it in a virtual env to get the latest dependencies.
Sus, something must be different in your code

Copy-pasting the code into a fresh google colab, it works fine for me
https://colab.research.google.com/drive/1iHRrCLfn69xctauU9KBmcp03LtCnQaho?usp=sharing