Streams are one of the most powerful concepts in backend programming. Node's built-in stream module provides an API for working with streaming data.
Instead of reading a massive 2GB video file entirely into your computer's memory (which would crash your server), a stream reads the data chunk by chunk, processing the pieces over time. Netflix and YouTube rely heavily on this exact concept!
There are four fundamental stream types:
- Readable: streams you can read data from (e.g. fs.createReadStream()).
- Writable: streams you can write data to (e.g. fs.createWriteStream()).
- Duplex: streams that are both readable and writable (e.g. a TCP socket).
- Transform: Duplex streams that modify the data as it passes through (e.g. zlib.createGzip()).

The fs module uses streams to read massive files efficiently. Because streams inherit from EventEmitter, we can listen for a 'data' event every time a new chunk arrives.
const fs = require('fs');
const demoFile = 'stream-demo.txt';
// Create a small demo file so the example is self-contained
fs.writeFileSync(demoFile, 'Node.js streams read data in chunks.\n'.repeat(4));
// Create a readable stream with small chunks
const readStream = fs.createReadStream(demoFile, {
  encoding: 'utf8',   // emit strings instead of Buffers
  highWaterMark: 20   // read at most 20 bytes per chunk
});

// Listen for data chunks
readStream.on('data', (chunk) => {
  console.log('Received chunk:', JSON.stringify(chunk));
  console.log('Chunk length: ' + chunk.length + ' characters');
});

readStream.on('end', () => {
  console.log('Finished reading the entire file.');
});
You can connect a Readable stream directly to a Writable stream using the .pipe() method. This is perfect for copying files quickly without consuming excess memory.
const fs = require('fs');

const readStream = fs.createReadStream('original.txt');
const writeStream = fs.createWriteStream('copy.txt');

// Pipe the readable stream directly into the writable one
readStream.pipe(writeStream);