save index to pickle format
Greg Tanaka
last year
Has anyone been able to get pickle to work with LlamaIndex? It seems like everyone gets a "cannot pickle 'builtins.CoreBPE' object" error.
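A minimal sketch of the failure mode, assuming a small index built over local files (import paths vary across LlamaIndex versions, and newer releases may not reproduce this):

```python
import pickle

# Import path depends on the LlamaIndex version; older releases exposed
# these from the top-level `llama_index` package.
from llama_index import SimpleDirectoryReader, VectorStoreIndex

documents = SimpleDirectoryReader("./data").load_data()
index = VectorStoreIndex.from_documents(documents)

# Pickling the index raises:
#   TypeError: cannot pickle 'builtins.CoreBPE' object
# because a tiktoken tokenizer (backed by a Rust CoreBPE object) is
# attached to components inside the index.
try:
    blob = pickle.dumps(index)
except TypeError as exc:
    print(f"pickling failed: {exc}")
```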
ravitheja
last year
@Greg Tanaka is it possible to share the code? I know someone who tried a week back and it worked for him.
Greg Tanaka
last year
I don't have a simple example, but here is the issue thread on this:
https://github.com/run-llama/llama_index/issues/886
ravitheja
last year
I think this is a solved issue?
Greg Tanaka
last year
no
Logan M
last year
I don't think pickling will work for a bit. Too many objects have a tiktoken tokenizer attached to them (which is causing this error)
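The error can be reproduced on the tokenizer alone: with tiktoken versions from that period, the Encoding object wraps a Rust CoreBPE instance that pickle cannot serialize (a sketch; newer tiktoken releases may behave differently):

```python
import pickle
import tiktoken

enc = tiktoken.get_encoding("gpt2")

try:
    pickle.dumps(enc)
except TypeError as exc:
    # With tiktoken releases from that time this raised:
    #   TypeError: cannot pickle 'builtins.CoreBPE' object
    print(f"pickling the tokenizer itself fails: {exc}")
```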
Greg Tanaka
last year
@Logan M any possible workarounds in the meantime?
Logan M
last year
Mmm not easily. You need to rip tiktoken out of the node parser and prompt helper in the service context
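One generic pattern for "ripping out" an unpicklable tokenizer, shown on a standalone class rather than LlamaIndex internals (the class and attribute names here are purely illustrative): drop the tokenizer in `__getstate__` and rebuild it lazily after unpickling.

```python
import pickle
import tiktoken


class TokenCounter:
    """Illustrative stand-in for an object that holds a tiktoken tokenizer."""

    def __init__(self, encoding_name: str = "gpt2"):
        self.encoding_name = encoding_name
        self._tokenizer = tiktoken.get_encoding(encoding_name)

    def count(self, text: str) -> int:
        if self._tokenizer is None:
            # Rebuild the tokenizer lazily after unpickling.
            self._tokenizer = tiktoken.get_encoding(self.encoding_name)
        return len(self._tokenizer.encode(text))

    def __getstate__(self):
        # Drop the unpicklable CoreBPE-backed tokenizer before pickling.
        state = self.__dict__.copy()
        state["_tokenizer"] = None
        return state


counter = pickle.loads(pickle.dumps(TokenCounter()))
print(counter.count("hello world"))
```

Applying the same idea to LlamaIndex would mean doing this for every component in the service context that holds a tokenizer, which is why it is not a quick fix.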
Logan M
last year
It would be nice if tiktoken updated to be pickleable
Greg Tanaka
last year
@Logan M so how can we support stateless services?
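For a stateless service, the usual alternative to pickling is to persist the index to storage and reload it at startup or per request. A sketch using the storage-context API that later LlamaIndex versions expose (method names vary by version; older releases used save_to_disk / load_from_disk instead):

```python
from llama_index import StorageContext, load_index_from_storage

# `index` is an existing index object, e.g. the VectorStoreIndex built above.
# Persist it as plain files (JSON by default) instead of pickling it.
index.storage_context.persist(persist_dir="./storage")

# In the stateless worker: rebuild the index from the persisted files.
storage_context = StorageContext.from_defaults(persist_dir="./storage")
index = load_index_from_storage(storage_context)
```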