feat(streams/unstable): new CappedDelimiterStream() #6890
Conversation
Codecov Report

✅ All modified and coverable lines are covered by tests.

```
@@           Coverage Diff           @@
##             main    #6890   +/-   ##
=======================================
  Coverage   94.14%   94.14%
=======================================
  Files         582      583    +1
  Lines       42750    42806   +56
  Branches     6807     6815    +8
=======================================
+ Hits        40245    40301   +56
  Misses       2455     2455
  Partials       50       50
```
Is there a particular use case for this stream?
Yes. I'd like to offer a streaming FormData encoder and decoder, and while building it I found something like this useful. The normal `DelimiterStream` isn't suitable, as it would likely produce huge chunks, possibly too big, when processing files, defeating the purpose of streaming.
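A minimal sketch of that failure mode (the sizes, delimiter, and data below are illustrative, not taken from the PR): `DelimiterStream` buffers until it sees the delimiter, so a part with no delimiter inside comes out as one huge chunk.

```ts
import { DelimiterStream } from "jsr:@std/streams/delimiter-stream";

const delimiter = new TextEncoder().encode("--boundary");
const source = ReadableStream.from([
  new Uint8Array(10 * 1024 * 1024), // e.g. an uploaded file's bytes
  delimiter,
  new TextEncoder().encode("tail"),
]);

// DelimiterStream must buffer the entire 10 MiB before it finds the
// delimiter, so the first emitted chunk is the whole part at once.
for await (const chunk of source.pipeThrough(new DelimiterStream(delimiter))) {
  console.log(chunk.length); // 10485760, then 4
}
```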
I'd need you to be more specific on what this
I'm not attached to the name and open to suggestions if you think there is something better to communicate its intended behaviour.
Sounds interesting to me. Can you add a note about such a situation in the API doc? (An example illustrating that situation would also be very helpful.)
I added a note explaining more, but I can't really think of an example that's simple enough to demonstrate the point.
@BlackAsLight Thanks for adding the notes. Now I see the utility of this transform; however, as @timreichen pointed out, the name of the class sounds a bit confusing to me too. The meaning of
I think there are a few things to contemplate: if this Stream is limited to the max chunk size and a delimiter, this might as well just be an option on `DelimiterStream`:

```ts
const delimStream = new DelimiterStream(new TextEncoder().encode("|"), { maxChunkByteSize: 5 });
```

I would argue this is fairly limited in usage, so maybe a better option would be some kind of stream combination:

```ts
.pipeThrough(new DelimiterStream(new Uint8Array(2), { preserveChunks: true }))
.pipeThrough(new MaxChunkByteSizeStream(5))
```
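For reference, a transform like the hypothetical `MaxChunkByteSizeStream` mentioned above could look roughly like this; the class does not exist in `@std/streams`, so the name and behaviour here are only what the comment implies:

```ts
// Hypothetical helper (not in @std/streams): re-slices incoming
// Uint8Array chunks so that none exceeds maxChunkByteSize bytes.
class MaxChunkByteSizeStream extends TransformStream<Uint8Array, Uint8Array> {
  constructor(maxChunkByteSize: number) {
    super({
      transform(chunk, controller) {
        for (let i = 0; i < chunk.length; i += maxChunkByteSize) {
          controller.enqueue(chunk.subarray(i, i + maxChunkByteSize));
        }
      },
    });
  }
}

// Example: a 12-byte chunk comes out as chunks of 5, 5, and 2 bytes.
const capped = ReadableStream.from([new Uint8Array(12)])
  .pipeThrough(new MaxChunkByteSizeStream(5));
```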
What about:
I think CappedDelimiterStream could be good. |
It would still queue them all up before serving them, which defeats the purpose of streams.
Maybe I lack some understanding of stream functionality, but how would enqueueing all chunks defeat the purpose of streams? All chunks go through each transform stream serially without blocking, no?
I think there might have been some misunderstanding in communication. `DelimiterStream` at the moment is a `Uint8Array` stream, and you're suggesting this preserve-chunks option is going to change that to an object stream of
Oh, I see. I guess having a separate stream class makes sense then; however, maybe we could generalize that stream class so it could work not only with delimiters but be customizable to parse, for example, a JSON stream?
@BlackAsLight That sounds a bit overly general to me, but please feel free to explore such an API if you feel strongly. I think
kt3k left a comment
LGTM
This pull request offers a new `TransformStream` into the mix. While we do already have `DelimiterStream`, it isn't suitable if your delimiter is unlikely to appear for long stretches of bytes, meaning you could end up with huge chunks at a time. This pull request offers `LimitDelimiterStream` (renamed to `CappedDelimiterStream` during review), which, as the name implies, imposes a maximum length that a chunk can be.

Example
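The example code did not survive extraction. Below is a minimal sketch of the intended usage; the import path and the `{ capSize }` option are assumptions about the final API, not something confirmed in this thread:

```ts
// Sketch only: import path and option name are assumed, not confirmed.
import { CappedDelimiterStream } from "jsr:@std/streams/unstable-capped-delimiter-stream";

const delimiter = new TextEncoder().encode("--boundary");
const body = ReadableStream.from([
  new Uint8Array(10 * 1024 * 1024), // a large part with no delimiter inside
  delimiter,
  new TextEncoder().encode("next part"),
]);

// Chunks end either at a delimiter or at the assumed 64 KiB cap, so the
// large part arrives as many bounded chunks instead of one 10 MiB chunk.
for await (
  const chunk of body.pipeThrough(
    new CappedDelimiterStream(delimiter, { capSize: 64 * 1024 }),
  )
) {
  console.log(chunk.length); // always <= 65536
}
```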