Import

At a glance

The community member is experiencing an issue when upgrading the llama_deploy[rabbitmq] library from version 0.2.0 to a newer version. The error is caused by a TYPE_CHECKING guard that is always False at runtime, which prevents a necessary library from being imported. Another community member, Logan M, explains that the issue needs a pull request and code change, and advises the original poster to open an issue. The original poster later comments that the error appears to be resolved, but they now face a new issue where the workflow service is not consuming the queue, so messages pushed via the API are not forwarded to the workflow. The original poster asks whether they should open another issue.

Hi, I get an error when using llama_deploy[rabbitmq] and running deploy_core. Version 0.2.0 works, but when I try to upgrade to a newer version it fails because TYPE_CHECKING is always False, so the necessary library cannot be imported. How can I solve this problem?
Attachment: IMG_7328.png
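
For context, the failure described here is the standard `TYPE_CHECKING` pitfall: a dependency is imported only inside an `if TYPE_CHECKING:` block (which is always False at runtime), yet the module still references that name when the code actually runs. The sketch below is not the llama_deploy source; it uses aio_pika purely as a plausible stand-in for the RabbitMQ client to illustrate the pattern and the usual fix.

```python
from typing import TYPE_CHECKING

if TYPE_CHECKING:
    # Only seen by static type checkers; TYPE_CHECKING is False at runtime,
    # so aio_pika is never actually imported when the module executes.
    import aio_pika


async def connect_broken(url: str) -> "aio_pika.abc.AbstractRobustConnection":
    # Bug pattern: runtime code references a name that was only imported
    # under TYPE_CHECKING, so calling this raises NameError.
    return await aio_pika.connect_robust(url)


async def connect_fixed(url: str) -> "aio_pika.abc.AbstractRobustConnection":
    # Fix: do a real import at runtime (lazily, so the optional RabbitMQ
    # extra is only required when this code path is actually used).
    import aio_pika
    return await aio_pika.connect_robust(url)
```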
7 comments
You can't solve the problem. It needs a PR and code change. Please open an issue and I should be able to get to it tomorrow
thanks @Logan M
the error looks like it's resolved
but I have a new problem when running deploy_workflow
the service does not consume the queue
so when I push a message by calling the API, the request is not forwarded to the queue for the workflow to handle
do I need to log another issue? @Logan M
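
For reference, a minimal sketch of how deploy_core and deploy_workflow are typically wired to the same RabbitMQ broker, assuming the ControlPlaneConfig / WorkflowServiceConfig / RabbitMQMessageQueueConfig names from the llama_deploy docs (exact import paths and parameters may differ between versions, and the broker URL is hypothetical). If the workflow service does not register against the same control plane and message queue that deploy_core started, published tasks never reach a consumer, which matches the symptom described above.

```python
import asyncio

from llama_deploy import (
    ControlPlaneConfig,
    WorkflowServiceConfig,
    deploy_core,
    deploy_workflow,
)
from llama_deploy.message_queues.rabbitmq import RabbitMQMessageQueueConfig
from llama_index.core.workflow import StartEvent, StopEvent, Workflow, step


class EchoWorkflow(Workflow):
    """Trivial workflow used only to exercise the queue wiring."""

    @step
    async def echo(self, ev: StartEvent) -> StopEvent:
        return StopEvent(result=str(ev.get("message", "")))


async def launch_core() -> None:
    # Core services: control plane plus a RabbitMQ-backed message queue.
    await deploy_core(
        control_plane_config=ControlPlaneConfig(),
        message_queue_config=RabbitMQMessageQueueConfig(
            url="amqp://guest:guest@localhost:5672/"  # hypothetical local broker
        ),
    )


async def launch_workflow() -> None:
    # Workflow service: registers with the control plane and should start
    # consuming its queue. If it never consumes, check that it points at the
    # same control plane (and therefore the same broker) as deploy_core.
    await deploy_workflow(
        workflow=EchoWorkflow(),
        workflow_config=WorkflowServiceConfig(
            host="127.0.0.1", port=8002, service_name="echo_workflow"
        ),
        control_plane_config=ControlPlaneConfig(),
    )


if __name__ == "__main__":
    # In practice the two launchers run in separate processes; deploy_core
    # must be up before the workflow service registers.
    asyncio.run(launch_core())
```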