Updated 8 months ago
anthropic-cookbook/third_party/LlamaInde...
anupamaze
8 months ago
Complete code that I'm running:
https://github.com/anthropics/anthropic-cookbook/blob/main/third_party/LlamaIndex/Basic_RAG_With_LlamaIndex.ipynb
7 comments
Logan M
8 months ago
You're sure that's the exact code you're running? Nothing changed?
Logan M
8 months ago
it works fine for me
Logan M
8 months ago
Seems like somehow llm.metadata is None, but that really isn't possible with the code that's in there
anupamaze
8 months ago
I changed nothing
anupamaze
8 months ago
File ~/model_Chat/env/lib/python3.11/site-packages/llama_index/core/indices/prompt_helper.py:117, in PromptHelper.from_llm_metadata(cls, llm_metadata, chunk_overlap_ratio, chunk_size_limit, tokenizer, separator)
103 @classmethod
104 def from_llm_metadata(
105 cls,
(...)
110 separator: str = " ",
111 ) -> "PromptHelper":
112 """Create from llm predictor.
113
114 This will autofill values like context_window and num_output.
115
116 """
--> 117 context_window = llm_metadata.context_window
119 if llm_metadata.num_output == -1:
120 num_output = DEFAULT_NUM_OUTPUTS
AttributeError: 'NoneType' object has no attribute 'context_window'
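For context, the AttributeError in this traceback is just Python's standard behavior when an attribute is read from None: `llm_metadata` arrived as None instead of a metadata object. A minimal stand-alone sketch of the same failure mode and a defensive guard, using a hypothetical `LLMMetadata` stand-in rather than the real LlamaIndex class:

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical stand-in for an LLM metadata object; the field names mirror
# the attributes referenced in the traceback, not LlamaIndex's actual class.
@dataclass
class LLMMetadata:
    context_window: int = 3900
    num_output: int = 256

def context_window_from_metadata(llm_metadata: Optional[LLMMetadata]) -> int:
    # Without this guard, `llm_metadata.context_window` on None raises
    # AttributeError: 'NoneType' object has no attribute 'context_window'
    if llm_metadata is None:
        raise ValueError("LLM metadata is missing - is the LLM actually configured?")
    return llm_metadata.context_window

print(context_window_from_metadata(LLMMetadata()))  # 3900
try:
    context_window_from_metadata(None)
except ValueError as err:
    print(err)
```

The guard turns the opaque AttributeError into an explicit message pointing at the real cause: the LLM object the index is using was never given its metadata, which is why the replies below focus on whether the notebook code was modified.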
anupamaze
8 months ago
I am using the latest LlamaIndex version and installed it in a virtual env to get the latest dependencies
Logan M
8 months ago
Sus, something must be different in your code.
Copy-pasting the code into a fresh Google Colab, it works fine for me:
https://colab.research.google.com/drive/1iHRrCLfn69xctauU9KBmcp03LtCnQaho?usp=sharing