[Feature] Add batch event consumption #13261
base: master
Conversation
Moved here @Megafredo

Hello @Renizmy, thank you for the switch!

Fixed, sorry
Codecov Report

✅ All modified and coverable lines are covered by tests.

    @@            Coverage Diff             @@
    ##           master   #13261      +/-   ##
    ===========================================
    + Coverage   16.26%   30.85%   +14.59%
    ===========================================
      Files        2846     2911       +65
      Lines      412135   192432  -219703
      Branches    11512    39246   +27734
    ===========================================
    - Hits        67035    59378    -7657
    + Misses     345100   133054  -212046
|
Megafredo
left a comment
Hello @Renizmy, thanks for your work!
This new method for streams that allows batch processing will make a lot of people happy!
Hello @Renizmy, thanks for your work!
This new method for streams that allows batch processing will make a lot of people happy!

@Renizmy FYI, we'd like to improve and refactor the code a bit before merging! :)
Hi @Renizmy, thank you for your contribution. As @helene-nguyen mentioned, we'd like the code to be refactored before merging. The main concern is the approach of adding a new class and dedicated method. Instead of creating a new class and method, we suggest implementing a wrapper. Each batch-capable connector (depending on the targeted API) could then use this adapter to receive batches of messages instead of individual messages. Usage (assuming the wrapper is named create_batch_callback):

    self.helper.listen_stream(message_callback=self.process_message)

--->

    batch_callback = self.helper.create_batch_callback(
        self.process_message_batch,
        self.batch_size,
        self.batch_timeout,
        self.max_batches_per_minute,
    )
    self.helper.listen_stream(message_callback=batch_callback)

Would you be open to making this change?
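For illustration only, here is a minimal, standalone sketch of how such a batching wrapper could work. The name create_batch_callback and the size/timeout parameters come from the suggestion above; everything else (the buffering strategy, the function signature) is a hypothetical sketch, not the actual OpenCTI helper API:

    import threading
    import time
    from typing import Any, Callable, List


    def create_batch_callback(
        batch_handler: Callable[[List[Any]], None],
        batch_size: int,
        batch_timeout: float,
    ) -> Callable[[Any], None]:
        """Wrap a batch handler into a per-message callback.

        Messages are buffered and flushed when the buffer reaches
        batch_size, or when batch_timeout seconds have elapsed since
        the oldest buffered message.
        """
        buffer: List[Any] = []
        lock = threading.Lock()
        first_at = [0.0]  # timestamp of the oldest buffered message

        def flush() -> None:
            if buffer:
                batch = buffer.copy()
                buffer.clear()
                batch_handler(batch)

        def on_message(message: Any) -> None:
            with lock:
                if not buffer:
                    first_at[0] = time.monotonic()
                buffer.append(message)
                if (
                    len(buffer) >= batch_size
                    or time.monotonic() - first_at[0] >= batch_timeout
                ):
                    flush()

        return on_message

A real implementation would also need a background timer to flush a partially filled batch when the stream goes quiet, since this sketch only flushes on message arrival.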
xfournet
left a comment
Thanks for the update! I made some comments; I will resume the PR review after this first round of feedback has been processed.
Hi @xfournet, thanks for the review! All points addressed. Changes to the rate limiter have led to simplifications. I haven't implemented any code related to rate limiting for basic stream consumption (out of scope?).
    if not isinstance(max_per_minute, int) or max_per_minute <= 0:
        raise ValueError("max_per_minute must be a positive integer")
    rate_limiter = RateLimiter(helper=self, max_per_minute=max_per_minute)
Should this use create_rate_limiter here?
Indeed ... Sorry
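As a side note, a minimal sliding-window sketch of what a RateLimiter like the one constructed in the diff above might look like. Only the constructor arguments (helper, max_per_minute) and the validation appear in the source; the acquire logic is a hypothetical illustration, not the actual implementation:

    import collections
    import time


    class RateLimiter:
        """Cap acquisitions at max_per_minute using a sliding 60-second window."""

        def __init__(self, helper=None, max_per_minute: int = 60):
            if not isinstance(max_per_minute, int) or max_per_minute <= 0:
                raise ValueError("max_per_minute must be a positive integer")
            self.helper = helper  # kept to mirror the constructor in the diff
            self.max_per_minute = max_per_minute
            self.timestamps = collections.deque()

        def acquire(self) -> None:
            now = time.monotonic()
            # Drop timestamps that have left the 60-second window
            while self.timestamps and now - self.timestamps[0] >= 60.0:
                self.timestamps.popleft()
            if len(self.timestamps) >= self.max_per_minute:
                # Block until the oldest call exits the window
                time.sleep(60.0 - (now - self.timestamps[0]))
            self.timestamps.append(time.monotonic())

A create_rate_limiter factory, as suggested in the review comment above, would simply centralize this construction and validation in one place.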
Related to: #13372