Stream
Streams represent a sequence of data that can be processed incrementally.
Streams let you process data as it arrives, which reduces latency and avoids loading everything into memory at once. Think of uploading a large file: with streams you can start uploading the file to OSS while the browser is still reading it.
Node.js has had its own Stream API for a long time. Streams are now also a web standard, but the standard API is not the same as Node's. Let's take a look at the standard Streams.
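To get a feel for this, here is a minimal sketch of consuming a body chunk by chunk. It uses the standard Response object constructed locally, so no actual network request is involved; the reading API it uses is covered in detail below.

```javascript
// A minimal sketch: a Response body is a ReadableStream of bytes.
// We construct the Response locally here instead of calling fetch().
const response = new Response('a large payload')
const reader = response.body.getReader()

let received = 0
while (true) {
  const { done, value } = await reader.read()
  if (done) break
  received += value.length // value is a Uint8Array chunk
  // process each chunk as it arrives instead of buffering the whole body
}
console.log(received) // 15 bytes in total
```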
ReadableStream
Readable streams are sources of data that can be read from. Examples include files and network responses.
const stream = new ReadableStream({
  start(controller) {
    controller.enqueue('Hello')
    controller.enqueue('world!')
    controller.close()
  }
})

const reader = stream.getReader()

while (true) {
  const { done, value } = await reader.read()
  if (done) {
    break
  }
  console.log(value)
}
</reader>
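The loop above is a common enough pattern that it's worth wrapping in a helper. This one (a hypothetical helper, not part of the spec) drains a reader into an array, then releases the reader's lock so the stream could be handed to another consumer:

```javascript
// Hypothetical helper: drain a stream's reader into an array,
// then release the lock on the stream.
async function collect(stream) {
  const reader = stream.getReader()
  const chunks = []
  while (true) {
    const { done, value } = await reader.read()
    if (done) break
    chunks.push(value)
  }
  reader.releaseLock()
  return chunks
}

const stream = new ReadableStream({
  start(controller) {
    controller.enqueue('Hello')
    controller.enqueue('world!')
    controller.close()
  }
})

console.log(await collect(stream)) // [ 'Hello', 'world!' ]
```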
A ReadableStream is also async iterable, so you can use for await...of instead (note that a stream can only be consumed once, so this needs a fresh stream, not one that has already been read):
for await (const chunk of stream) {
  console.log(chunk)
}
WritableStream
Writable streams are destinations that data can be written to. Examples include files and network requests.
const readableStream = new ReadableStream({
  start(controller) {
    ['Hello', 'world'].forEach(chunk => controller.enqueue(chunk))
    controller.close()
  }
})

const writableStream = new WritableStream({
  write(chunk) {
    console.log(chunk)
  }
})

await readableStream.pipeTo(writableStream)
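Besides pipeTo, you can write to a WritableStream by hand through a writer. Awaiting each write respects the stream's backpressure. A minimal sketch:

```javascript
// A minimal sketch: writing to a WritableStream directly via a writer.
// The underlying sink here just collects chunks into an array.
const chunks = []
const writableStream = new WritableStream({
  write(chunk) {
    chunks.push(chunk)
  }
})

const writer = writableStream.getWriter()
await writer.write('Hello') // resolves when the sink is ready for more
await writer.write('world')
await writer.close()
console.log(chunks) // [ 'Hello', 'world' ]
```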
TransformStream
Transform streams allow you to modify data as it flows through a stream.
const readableStream = new ReadableStream({
  start(controller) {
    controller.enqueue('hello')
    controller.enqueue('world')
    controller.close()
  }
})

const uppercaseTransformStream = new TransformStream({
  transform(chunk, controller) {
    controller.enqueue(chunk.toUpperCase())
  }
})

const transformedStream = readableStream.pipeThrough(uppercaseTransformStream)

for await (const chunk of transformedStream) {
  console.log(chunk)
}
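pipeThrough calls can also be chained, and the platform ships some TransformStreams of its own. A small sketch using the built-in TextEncoderStream and TextDecoderStream, which convert strings to UTF-8 bytes and back:

```javascript
// A sketch of chaining built-in transforms: TextEncoderStream and
// TextDecoderStream are TransformStreams provided by the platform.
const textStream = new ReadableStream({
  start(controller) {
    controller.enqueue('hello ')
    controller.enqueue('streams')
    controller.close()
  }
})

const roundTripped = textStream
  .pipeThrough(new TextEncoderStream()) // string chunks -> Uint8Array chunks
  .pipeThrough(new TextDecoderStream()) // Uint8Array chunks -> string chunks

let result = ''
for await (const chunk of roundTripped) {
  result += chunk
}
console.log(result) // 'hello streams'
```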