How to use Streams in Node.js

by Success Ibekwe


Updated Thu Jul 06 2023


The concept of streams is not unique to Node.js. It was first implemented in the UNIX operating system, where it allowed programs to interact with each other by passing data from one program to another in I/O operations.

However, streams are often seen as a difficult software engineering concept to work with because of the many principles involved.

Understanding stream principles in Node.js and how to implement them opens you up to a versatile approach to handling data sources that are used universally. These principles are necessary if you’re looking to develop an efficient server.

In this article, we are going to break down streams in a way that you will be able to understand, implement, and also teach to your teammates.

Table of contents:

  1. Introduction
  2. What is a stream?
  3. Importance of using streams
  4. Different types of streams
  5. How to create a readable stream
  6. How to create a writable stream
  7. How to get data from a read stream
  8. How to send data to a write stream
  9. Signaling a write stream that you have finished
  10. Conclusion

So with no further ado, let’s delve right in.

What are streams

Streams are methods used in handling writing/reading files, network communications, or any kind of end-to-end information exchange in an efficient way.

They are one of the fundamental concepts that power Node.js applications.

With streams, data can be exchanged in small parts, which reduces a lot of memory usage.

The traditional way is to read an entire file into memory all at once before processing its content, which can be a problem if there is not enough memory space to hold the file.

Streams, on the other hand, do not keep the whole file in memory before processing its content; they read chunks of data and process the contents of the file piece by piece.

This pattern of breaking files down into chunks of data becomes a real advantage when working with large amounts of data, because we no longer need to worry about having enough memory space to hold an entire file.

Let’s examine the usage of streams in the YouTube app. 

YouTube offers streaming services. With these services, you don’t need to download an entire video or audio feed before playing it; you can watch the video or listen to the audio immediately. This is possible because your browser can receive the video and audio as a continuous flow of chunks.

Composability in streams:

The YouTube example is just a basic illustration of what we can achieve with streams. Beyond working with media or big data, streams make “composability” of code possible.

Composability is an approach where several components are combined in a certain way to produce a particular kind of result, and streams give us the power to do exactly that: the power of composability.

Why are streams important?

There are four major benefits of using streams over other methods of data manipulation:

  • Memory Efficiency: With streams, you don’t need to load large amounts of data into memory before you can process it.
  • Temporal Efficiency: With streams, data processing takes less time. You don’t have to wait until the entire data payload is available; you can start processing it as soon as you have it.
  • Composability feature: With the stream composability feature, we can build microservices in Node.js. With composability, we can build complex applications that interact and interconnect with data between different pieces of code.
  • Used in building applications: With streams, we can create real-world applications such as video streaming apps.

Before we continue let’s write a basic code example of a stream.

const http = require('http')
const fs = require('fs')

const server = http.createServer(function (req, res) {
  fs.readFile(__dirname + '/data.txt', (err, data) => {
    res.end(data)
  })
})

server.listen(3000)

The code above reads files from a disk. 

The fs Node.js module gives us the ability to read files, and whenever a new connection is established to our HTTP server we can serve the file with HTTP.

Let’s break down the code:

readFile() reads the entire contents of the file, and invokes a callback function when it’s done reading.

res.end(data) inside the callback returns the contents of the file to the HTTP client.

Using the above method will cause our operation to take time if the files are large. So to mitigate the problem, let’s write the same code using a stream method.

const http = require('http')
const fs = require('fs')

const server = http.createServer((req, res) => {
  const stream = fs.createReadStream(__dirname + '/data.txt')
  stream.pipe(res)
})

server.listen(3000)

Now you can see that we have more control: we start “streaming” the file to the HTTP client as soon as we have chunks of data ready to be sent. This is better, especially with large files, because we don’t have to wait until the file is completely read.

Types of streams in node.js:

There are four types of streams and they include:

  • Readable stream: These are streams from which we can read data. They allow us to receive data but not send it. Data pushed into a readable stream is buffered until a consumer starts reading it. fs.createReadStream() lets us read the contents of a file.
  • Writable stream: These are streams to which we can write data. They allow us to send data but not receive it. fs.createWriteStream() lets us write data to a file.
  • Duplex: These are streams that are both readable and writable.
  • Transform: A Transform stream is similar to a Duplex stream, but it transforms the data as it is written and read.

How to create a readable stream

To create a readable stream we start by creating a stream object:

const Stream = require('stream')
const readableStream = new Stream.Readable()

…after creating a stream object we can go ahead and implement the _read method.

readableStream._read = () => {}

Now that the stream is initialized, we can send data to it:

readableStream.push('hi!')
readableStream.push('Success!')

Modes of readable streams:

Readable streams operate in one of two modes:

  1. Flowing mode (flowing readable stream).
  2. Paused mode (paused readable stream).

Flowing readable stream:

In flowing mode, the readable stream uses events from the event emitter to provide data to the application, allowing data to flow continuously.

The different types of events used in the flowing mode include:

  • Data event — emitted whenever a chunk of data is available to be read from the stream.
  • End event — emitted when the stream reaches the end of the file and no more data is available to read.
  • Error event — emitted whenever an error occurs during the read process. This event is also emitted by writable streams.
  • Finish event — emitted on a writable stream when all data has been flushed to the underlying system.

Paused readable stream:

Paused mode uses the read() method to receive the next chunk of data from the stream, and this read() method has to be called explicitly, since the stream is not read continuously in paused mode.

It is also important to note that streams that start in paused mode can be switched to flowing mode in any of the following ways:

  • By adding a ‘data’ event handler to the stream.
  • By calling the stream.resume() method.
  • By calling the stream.pipe() method, which sends data to writable streams.

How to create a writable stream:

To create a writable stream, we have to extend the base object Writable and implement its _write() method.

First, we create a Stream Object like so:

const Stream = require('stream')
const writableStream = new Stream.Writable()

…and then we are on to implementing the _write:

writableStream._write = (chunk, encoding, next) => {
  console.log(chunk.toString())
  next()
}

Now you can pipe a read stream into it:

process.stdin.pipe(writableStream)

How to retrieve data from a read stream:

Using a writable stream, we can retrieve data from a readable stream, as in this code example:

const Stream = require('stream')

const readableStream = new Stream.Readable({
  read() {}
})

const writableStream = new Stream.Writable()

writableStream._write = (chunk, encoding, next) => {
  console.log(chunk.toString())
  next()
}

readableStream.pipe(writableStream)
readableStream.push('Hello!')
readableStream.push('Success!')

You can also consume a readable stream directly, using the readable event:

readableStream.on('readable', () => {
  console.log(readableStream.read())
})

How to send data to a write stream:

We can send data to a writable stream using the write() method:

writableStream.write('Hello!\n')

Use the end() method:

With the end() method, we send a signal to the write stream letting it know we have finished writing. The code for it is as follows:

const Stream = require('stream')

const readableStream = new Stream.Readable({
  read() {}
})

const writableStream = new Stream.Writable()

writableStream._write = (chunk, encoding, next) => {
  console.log(chunk.toString())
  next()
}

readableStream.pipe(writableStream)
readableStream.push('Hi!')
readableStream.push('Success!')
writableStream.end()


You can learn more about Streams in these resources below:

  1. Node.js Streams: Everything you need to know
  2. Understanding Streams in Node.js


From this article, we’ve come to understand what streams are: with streams, data is read in chunks and files are processed bit by bit, which solves the issue of running out of memory.

Streams open us up to a world of developing real-time, performant applications.

We also looked at how to create readable and writable streams.

With the concepts taught in this article, you should be able to start using streams in your apps. The more you apply what you’ve learned, the more advanced you become.

Tell us how you have used streams or what you’re planning to build with them.
