KeyError: 128 · Issue #2 · run-llama/lla...

@Logan M what am I doing wrong?
Ah! I forgot I fixed a bug in llama-index while doing that, but forgot to actually add the fix to the library LOL
Plain Text
node_parser = HierarchicalNodeParser.from_defaults(
    chunk_sizes=[
        large_chunk_size, 
        large_chunk_size // 3,
    ],
    text_splitter_ids=[
        large_chunk_size, 
        large_chunk_size // 3,
    ],
)
@Logan M - Getting a
Plain Text
KeyError: '128'


When I tried the quick fix of declaring text_splitter_ids (the one shown earlier in the thread), I get a ValueError instead: ValueError: Cannot specify both text_splitter_ids and chunk_sizes.
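(Side note for anyone landing here from a search: a KeyError whose key is '128' as a string, versus 128 as an int, usually points at a lookup table keyed with one type and queried with the other. A minimal sketch of that failure mode; the dict and names below are illustrative, not llama-index's actual internals:)

```python
# Illustrative sketch only: hypothetical names, not llama-index's real internals.
splitters = {"2048": "large_splitter", "682": "small_splitter"}  # keys stored as str

chunk_size = 682  # ...but looked up as an int
try:
    splitter = splitters[chunk_size]       # raises KeyError: 682
except KeyError:
    splitter = splitters[str(chunk_size)]  # normalizing the key type fixes the lookup
```
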

@jerome - Were you able to make it work?
@Avinaash Anand I pushed the real fix already, it should work as it was originally shown in newer versions of llama-index
Plain Text
node_parser = HierarchicalNodeParser.from_defaults(
    chunk_sizes=[
        large_chunk_size, 
        large_chunk_size // 3,
    ]
)
Tried updating the package and using the original syntax. Still hitting the KeyError.

I'm on llama-index-0.8.29.post1.
Are you sure? Worked fine for me locally

Maybe try to uninstall/reinstall?
Plain Text
(llama-index) loganm@gamingpc:~/llama_index_proper/llama_index$ pip show llama_index
Name: llama-index
Version: 0.8.29.post1
...
(llama-index) loganm@gamingpc:~/llama_index_proper/llama_index$ python
Python 3.11.0 (main, Mar  1 2023, 18:26:19) [GCC 11.2.0] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> from llama_index.node_parser import HierarchicalNodeParser
>>> large_chunk_size=2048
>>> node_parser = HierarchicalNodeParser.from_defaults(
...     chunk_sizes=[
...         large_chunk_size, 
...         large_chunk_size // 3,
...     ]
... )
>>>
Restarted the Kernel, it's working now! Thank You!!! ^_^ @Logan M
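(For the record, the kernel restart matters because Python caches imported modules in sys.modules: pip install --upgrade rewrites files on disk, but a live interpreter keeps serving whatever it imported first. A self-contained sketch of that effect, using a throwaway stand-in module rather than llama-index itself:)

```python
import importlib
import pathlib
import sys
import tempfile

# Sketch of why restarting the kernel helps: modules are cached in
# sys.modules, so an on-disk upgrade is invisible to a running session.
d = tempfile.mkdtemp()
mod_path = pathlib.Path(d) / "demo_pkg.py"  # hypothetical stand-in package
mod_path.write_text("VERSION = 'old'")
sys.path.insert(0, d)

import demo_pkg
assert demo_pkg.VERSION == "old"

mod_path.write_text("VERSION = 'brand-new'")  # simulate `pip install --upgrade`

import demo_pkg  # served straight from the sys.modules cache
assert demo_pkg.VERSION == "old"

importlib.reload(demo_pkg)  # a reload (or a kernel restart) picks up the new code
assert demo_pkg.VERSION == "brand-new"
```

A plain reload only refreshes one module, so for a package with many submodules a full restart of the kernel, as above, is the reliable fix.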