Streams — Node.js

What are streams?

Lokesh
4 min read · Jun 22, 2020

Streams are used to move data from the server to the client bit by bit (in chunks), instead of loading everything into server memory before transferring it. Let’s understand this with an example.

Say we are in the business of transferring water, and we need to move water from a source S to a destination D. One approach is to move the water from the source to our factory B (auxiliary storage), and from there to the destination.

Store water from S to B and then transfer from B to D

Hmm, but there is a better approach: why not introduce a pipe directly between S and D? That pipe is what we call a stream.

Transfer the water in chunks

A stream is nothing but a way to transfer data to the client in chunks. With this approach, the entire video is never held in server memory at once.

Types of Streams

  • Readable Stream
  • Writable Stream
  • Duplex Stream
  • Transform Stream

Readable Stream

Readable streams read data from a source and feed it into the pipeline bit by bit. Let’s create our own stream that reads data in chunks from an array; each time a chunk is read, an event is emitted and the chunk’s value is printed to the console.

In practice, though, we generally use the predefined readable streams available for files, TCP sockets, etc. For example, in the code below we create a read stream for a sample video file and log the length of each chunk to the console.

Writable Streams

Writable streams capture the data coming from a readable stream and do something with it. Like readable streams, writable streams are everywhere: HTTP client requests and responses, file systems, etc. Let’s take an example where we read a video file in chunks and create a copy using a writable stream.

Backpressure

Say we are pouring water from the source into the destination, but we pour too fast and the water begins to overflow. Should we keep pouring? No, because if we continue we will lose water. Instead, we should stop pouring until the water already in the pipe has moved on. Similarly, when we use a write stream to write chunks of data, we must take a break when there is backpressure and continue only once the pipe is free again.


Piping Streams

Now, instead of wiring up a bunch of event listeners, we can simply use pipes. We can pipe a readable stream into a writable stream, and pipes handle the backpressure for us automatically. Say, for example, that whatever text we type in the console should be written directly to a text file. Let’s see the code sample.

In the above code, whatever we give as input in the console is written into the `sample.txt` file with the help of the pipe.

Duplex Stream

A readable stream feeds data into the duplex stream, and the duplex stream can also write data out: it is the middle section of the pipeline. Duplex streams are useful when we want to do something in between the reading and writing streams. Let’s say we want to log every bit of data as a report.

In this code we record every bit transferred from the read stream to the write stream by using an event listener, and we pipe `report`, a new PassThrough object, in between the read and write streams.

Transform Stream

Instead of simply transferring data from a read stream to a write stream, a transform stream changes the data along the way. This is useful when data is read and encoded chunk by chunk, and then decoded again before being passed to the write stream.
