save index to pickle format

At a glance

The community members are troubleshooting a pickling failure in LlamaIndex: attempting to pickle an index raises "cannot pickle 'builtins.CoreBPE' object". One member links to the relevant GitHub issue thread, and another explains that pickling fails because many LlamaIndex objects hold a tiktoken tokenizer, which cannot be serialized. Proposed workarounds include stripping the tiktoken tokenizer from the node parser and prompt helper in the service context, but no simple solution has been found yet.

Useful resources
Has anyone been able to get pickle to work with Llama Index? Seems like everyone gets a "cannot pickle 'builtins.CoreBPE' object" error
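The failure is reproducible outside LlamaIndex: tiktoken's Encoding wraps a Rust-backed CoreBPE object that the pickle module cannot serialize. A minimal sketch:

```python
import pickle

import tiktoken

# Any tiktoken encoding holds a Rust-backed CoreBPE object internally.
enc = tiktoken.get_encoding("gpt2")

try:
    pickle.dumps(enc)
except TypeError as e:
    # TypeError: cannot pickle 'builtins.CoreBPE' object
    print(e)
```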
@Greg Tanaka possible to share the code? I know someone who tried a week back and it worked for him.
I don't have a simple example, but here is the issue thread on this: https://github.com/run-llama/llama_index/issues/886
I think this is a solved issue?
I don't think pickling will work for a bit. Too many objects have a tiktoken tokenizer attached to them (which is causing this error)
@Logan M any possible workarounds in the meantime?
Mmm not easily. You need to rip tiktoken out of the node parser and prompt helper in the service context
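A sketch of one possible workaround, assuming the unpicklable state is reached through a tiktoken Encoding (e.g. via a bound .encode method attached to the node parser or prompt helper): register a copyreg reducer so pickle re-creates encodings by name instead of serializing the underlying CoreBPE. This is illustrative, not an official LlamaIndex or tiktoken API, and whether it suffices depends on how the tokenizer is attached.

```python
import copyreg
import pickle

import tiktoken

def _reduce_encoding(enc: tiktoken.Encoding):
    # Re-create the encoding by name on unpickle instead of trying
    # to serialize the Rust-backed CoreBPE it wraps.
    return tiktoken.get_encoding, (enc.name,)

# Global registration: pickle now handles Encoding objects, including
# ones reached indirectly (e.g. via a bound .encode method).
copyreg.pickle(tiktoken.Encoding, _reduce_encoding)

enc = tiktoken.get_encoding("gpt2")
restored = pickle.loads(pickle.dumps(enc))
assert restored.name == enc.name
```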
It would be nice if tiktoken updated to be pickleable
@Logan M so how can we support stateless services?
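For stateless services, one alternative that avoids pickle entirely is LlamaIndex's own persistence layer. A sketch assuming a version that exposes StorageContext and load_index_from_storage (import paths and class names vary across releases):

```python
from llama_index import (
    SimpleDirectoryReader,
    StorageContext,
    VectorStoreIndex,
    load_index_from_storage,
)

# Build once (requires an embedding model / API key to be configured).
documents = SimpleDirectoryReader("./data").load_data()
index = VectorStoreIndex.from_documents(documents)

# Persist to disk instead of pickling; this writes the docstore,
# index store, and vector store as JSON under persist_dir.
index.storage_context.persist(persist_dir="./storage")

# In a fresh, stateless process: reload without re-embedding.
storage_context = StorageContext.from_defaults(persist_dir="./storage")
index = load_index_from_storage(storage_context)
```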