
Updated last year

save index to pickle format

Has anyone been able to get pickle to work with LlamaIndex? It seems like everyone who tries gets a "cannot pickle 'builtins.CoreBPE' object" error.
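The error happens because pickle walks the whole object graph, and somewhere inside the index it hits tiktoken's Rust-backed `CoreBPE` object, which does not implement pickle support. A minimal sketch of the same failure mode, using a thread lock as a stand-in for the unpicklable tokenizer (the `Index` class here is hypothetical, not the real LlamaIndex class):

```python
import pickle
import threading

class Index:
    """Stand-in for a LlamaIndex index that holds a tokenizer internally."""
    def __init__(self):
        # A thread lock is unpicklable, just like tiktoken's CoreBPE
        self.tokenizer = threading.Lock()
        self.docs = ["some document text"]

err = None
try:
    pickle.dumps(Index())
except TypeError as e:
    err = e
print(err)  # cannot pickle '_thread.lock' object
```

Any attribute like this anywhere in the object graph is enough to make the whole `pickle.dumps` call fail.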
@Greg Tanaka possible to share the code? I know someone who tried a week back and it worked for him.
I don't have a simple example, but here is the issue thread on this: https://github.com/run-llama/llama_index/issues/886
I think this is a solved issue?
I don't think pickling will work for a bit. Too many objects have a tiktoken tokenizer attached to them (which is causing this error)
@Logan M any possible work arounds in the meantime?
Mmm not easily. You need to rip tiktoken out of the node parser and prompt helper in the service context
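The general shape of that workaround is to detach the unpicklable tokenizer before serializing and rebuild it on load. A hedged sketch using `__getstate__`/`__setstate__` on a hypothetical `Index` class (the real fix would mean doing this for every object in the service context that holds a tiktoken tokenizer; `build_tokenizer` stands in for something like `tiktoken.get_encoding(...)`):

```python
import pickle
import threading

def build_tokenizer():
    # Stand-in for tiktoken.get_encoding(...); unpicklable like CoreBPE
    return threading.Lock()

class Index:
    """Hypothetical index that drops its tokenizer for pickling."""
    def __init__(self):
        self.tokenizer = build_tokenizer()
        self.nodes = ["doc1", "doc2"]

    def __getstate__(self):
        # Strip the unpicklable tokenizer before pickling
        state = self.__dict__.copy()
        state["tokenizer"] = None
        return state

    def __setstate__(self, state):
        # Restore state, then rebuild the tokenizer fresh
        self.__dict__.update(state)
        self.tokenizer = build_tokenizer()

idx = Index()
restored = pickle.loads(pickle.dumps(idx))
print(restored.nodes)  # ['doc1', 'doc2']
```

Since the tokenizer is deterministic to construct, dropping and rebuilding it loses nothing; only the actual index state round-trips through pickle.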
It would be nice if tiktoken updated to be pickleable
@Logan M so how can we support stateless services in the meantime?