big_ol_tender
last year
Hi, upgrading to 0.9.15.post2 broke the pickle-ability: “AttributeError: can’t pickle local object ‘LLM.set_completion_to_prompt.<locals>.<lambda>’”
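The error above is Python's standard complaint when pickling a function-local lambda: pickle serializes functions by qualified name, and a `<locals>.<lambda>` object can't be looked up from its module. A minimal sketch of the failure mode, using a hypothetical `make_formatter` as a stand-in for `LLM.set_completion_to_prompt`:

```python
import pickle

def make_formatter():
    # Hypothetical stand-in for LLM.set_completion_to_prompt: the
    # formatter is a lambda defined inside a function, i.e. a
    # "local object" from pickle's point of view.
    return lambda prompt: prompt + "\n"

fn = make_formatter()
try:
    pickle.dumps(fn)
    picklable = True
except (AttributeError, pickle.PicklingError):
    # e.g. AttributeError: Can't pickle local object
    #      'make_formatter.<locals>.<lambda>'
    picklable = False
print(picklable)
```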
14 comments
Logan M
last year
ahhhh
big_ol_tender
last year
Sorry lol
Logan M
last year
I hate that python multiprocessing relies on pickles
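Multiprocessing ships work to worker processes by pickling the callable and its arguments, which is why a stored lambda breaks things. One common workaround (not necessarily what the fix in the linked PR does) is to replace the local lambda with a `functools.partial` over a module-level function, which pickles fine:

```python
import functools
import pickle

def add_suffix(prompt, suffix):
    # Module-level function: pickle can find it by qualified name.
    return prompt + suffix

# A partial over a module-level function round-trips through pickle,
# unlike a local lambda capturing `suffix`.
fmt = functools.partial(add_suffix, suffix="\n")
restored = pickle.loads(pickle.dumps(fmt))
print(restored("hello"))  # -> "hello\n"
```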
Logan M
last year
this is the hackiest thing in the library lol
Logan M
last year
Will fix :PSadge:
Logan M
last year
adding a unit test this time
big_ol_tender
last year
Ty 🫡
Logan M
last year
surprisingly I can't replicate this in a unit test. But either way, I know the issue lol
Logan M
last year
ah there we go
Logan M
last year
it happens for chat engines/agents
Logan M
last year
weird
Logan M
last year
https://github.com/run-llama/llama_index/pull/9588
big_ol_tender
last year
Ty!!
big_ol_tender
last year
Stay tender