Great work on the new agents release

Great work on the new agents release guys πŸ‘

@Logan M - I've yet to dive into the source for the message queue, but could we integrate a custom MQ for the inter-agent communication messages to be posted to? e.g. RabbitMQ / SQS, SNS, or spawn a Lambda call?
Good to know! Anything with the AWS stack?
Not fully sure on this, but once the code abstraction is fully added, it may help you set up any message queue.
Glad you like it! Definitely early days but it's been fun building this out.

Yea, the hope is the base class will lend itself well to any message queue (probably still some minor kinks of course)

Would love to have Kafka as well, and whatever else πŸ‘
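
(For context, the pattern Logan describes above is a transport-agnostic base class that concrete brokers subclass. Below is a minimal illustrative sketch of that shape; the class and method names here are hypothetical, not the actual llama-agents API.)

```python
# Illustrative only: a minimal shape for a pluggable message-queue
# abstraction. Names are hypothetical, not the llama-agents API.
from abc import ABC, abstractmethod


class BaseMessageQueue(ABC):
    """Transport-agnostic queue; subclasses wire in a concrete broker."""

    @abstractmethod
    async def publish(self, topic: str, message: dict) -> None:
        """Send a message to the given topic/queue."""

    @abstractmethod
    async def subscribe(self, topic: str, handler) -> None:
        """Register an async handler for messages on a topic."""


class InMemoryMessageQueue(BaseMessageQueue):
    """Trivial in-process backend; a RabbitMQ/SQS/Kafka subclass would
    replace the dict below with real broker calls."""

    def __init__(self) -> None:
        self._handlers: dict[str, list] = {}

    async def publish(self, topic: str, message: dict) -> None:
        # Fan the message out to every handler registered on this topic.
        for handler in self._handlers.get(topic, []):
            await handler(message)

    async def subscribe(self, topic: str, handler) -> None:
        self._handlers.setdefault(topic, []).append(handler)
```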
@WhiteFang_Jr @Logan M I’ve got a working draft for AWS SQS πŸ’ͺ🏻 but I’ll need some more time to flesh out some more tests.

What would be the best way to contribute, push to a branch so you guys can review?
Also, any lessons learned from the previous two integrations with RabbitMQ and Kafka would go a long way πŸ˜„ You guys are killer devs, cheers
Amazing! 🀩 :LlamaIndex:

Yea, best way to contribute is to fork the repo and open a PR from your fork πŸ’ͺ

@nerdai has actually been the one killing it on all the recent message queue integrations
Good to meet you @nerdai, I've been reading some of your content on LinkedIn ✌🏻
Hey @marty mcfly! Nice to meet you. Looking forward to your AWS SQS integration!!
In terms of lessons from past integrations, I'd say it's just that since we're trying to be an async-first library, we should make use of async clients. I think for RabbitMQ I was using sync pika first, but making that work with our stuff felt a bit awkward. Switching to aio-pika was better. So hopefully there is a good asyncio-equipped client for AWS SQS?
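
(For context, one asyncio-equipped option on the AWS side is aiobotocore, which wraps botocore's SQS calls as awaitables. A rough sketch of the send/receive loop; the queue URL and region are placeholders.)

```python
# Sketch: async SQS access via aiobotocore, analogous to using
# aio-pika instead of sync pika on the RabbitMQ side.
# Queue URL and region below are placeholders.
import asyncio

from aiobotocore.session import get_session


async def main() -> None:
    session = get_session()
    async with session.create_client("sqs", region_name="us-east-1") as sqs:
        queue_url = "https://sqs.us-east-1.amazonaws.com/123456789012/my-queue"

        # Non-blocking publish.
        await sqs.send_message(QueueUrl=queue_url, MessageBody="hello")

        # Long-poll for messages without blocking the event loop.
        resp = await sqs.receive_message(
            QueueUrl=queue_url, WaitTimeSeconds=10, MaxNumberOfMessages=1
        )
        for msg in resp.get("Messages", []):
            print(msg["Body"])
            # Acknowledge by deleting the message from the queue.
            await sqs.delete_message(
                QueueUrl=queue_url, ReceiptHandle=msg["ReceiptHandle"]
            )


asyncio.run(main())
```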
First stab, this is how far I got, let me know if you have access issues: https://github.com/run-llama/llama-agents/commit/be52f7b7b76eb166c79bd48d896f9293f4974b40#commitcomment-144680990 Cheers ✌️
Awesome! You just need to raise a PR from your fork on llama-agents
oops almost forgot - here it is:
https://github.com/run-llama/llama-agents/pull/158
^ fyi @nerdai @Logan M πŸ˜„
sweet! will take a look at it shortly πŸ™‚