
Updated 8 months ago

anthropic-cookbook/third_party/LlamaInde...

7 comments
Are you sure that's the exact code you are running? Nothing changed?
it works fine for me
Seems like somehow llm.metadata is None, but that really shouldn't be possible with the code that is in there
I changed nothing
File ~/model_Chat/env/lib/python3.11/site-packages/llama_index/core/indices/prompt_helper.py:117, in PromptHelper.from_llm_metadata(cls, llm_metadata, chunk_overlap_ratio, chunk_size_limit, tokenizer, separator)
103 @classmethod
104 def from_llm_metadata(
105 cls,
(...)
110 separator: str = " ",
111 ) -> "PromptHelper":
112 """Create from llm predictor.
113
114 This will autofill values like context_window and num_output.
115
116 """
--> 117 context_window = llm_metadata.context_window
119 if llm_metadata.num_output == -1:
120 num_output = DEFAULT_NUM_OUTPUTS

AttributeError: 'NoneType' object has no attribute 'context_window'
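The traceback boils down to calling `.context_window` on a `None` value. A minimal stand-in (not the actual LlamaIndex source; `LLMMetadata` here is a simplified hypothetical and the default values are illustrative) shows the failure mode and a guard that fails fast with a clearer message:

```python
from dataclasses import dataclass
from typing import Optional, Tuple

# Illustrative defaults, not taken from the library source
DEFAULT_CONTEXT_WINDOW = 3900
DEFAULT_NUM_OUTPUTS = 256

@dataclass
class LLMMetadata:
    context_window: int = DEFAULT_CONTEXT_WINDOW
    num_output: int = DEFAULT_NUM_OUTPUTS

def from_llm_metadata(llm_metadata: Optional[LLMMetadata]) -> Tuple[int, int]:
    # Without this check, llm_metadata=None reproduces the crash above:
    # AttributeError: 'NoneType' object has no attribute 'context_window'
    if llm_metadata is None:
        raise ValueError(
            "llm_metadata is None -- was the LLM actually configured "
            "before the index/query engine was built?"
        )
    context_window = llm_metadata.context_window
    num_output = llm_metadata.num_output
    return context_window, num_output

# A populated metadata object works; passing None now raises a clear error.
print(from_llm_metadata(LLMMetadata()))  # (3900, 256)
```

In other words, the error means the LLM's metadata was never populated by the time `PromptHelper.from_llm_metadata` ran, which usually points at the LLM not being set up correctly in the environment rather than at this function itself.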
I am using the latest llama-index version and installed it in a virtual env to have the latest dependencies
Sus, something must be different in your code

Copy-pasting the code into a fresh Google Colab, it works fine for me
https://colab.research.google.com/drive/1iHRrCLfn69xctauU9KBmcp03LtCnQaho?usp=sharing