
Handling Large File Uploads in Node.js with Chunked Uploads – Step-by-Step Guide


Learn how to handle large file uploads in Node.js using chunked uploads. A step-by-step guide with code examples for stable, memory-efficient file handling.


1. Why Large File Uploads Can Crash Your Node.js Server

When a user uploads a large file (like a video or a high-resolution image), Node.js may try to load it entirely into memory before saving it. This can:

  • Cause high RAM usage
  • Lead to out-of-memory errors
  • Crash the server under high traffic

Solution: Use chunked uploads so the file is processed in smaller parts rather than as a single huge payload.
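
For contrast, here is a minimal sketch of the failure mode described above: a handler that buffers the entire request body in memory before writing it to disk. The route and output file name are illustrative only, not part of this guide's final code.

// Anti-pattern: the whole upload is held in RAM before a single byte hits disk.
const express = require('express');
const fs = require('fs');

const app = express();

app.post('/upload-naive', (req, res) => {
  const parts = [];
  req.on('data', (part) => parts.push(part)); // every incoming byte is kept in memory
  req.on('end', () => {
    // Buffer.concat allocates one buffer as large as the entire file,
    // so a 4GB upload needs 4GB+ of RAM. This is what crashes the process.
    fs.writeFileSync('upload.bin', Buffer.concat(parts));
    res.send({ status: 'done' });
  });
});

app.listen(3000);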


2. How Chunked Uploads Work

Chunked uploading breaks a file into smaller parts (chunks) and sends them sequentially. On the server:

  1. Each chunk is received and temporarily stored.

  2. Once all chunks arrive, the server combines them into the final file.


3. Setting Up the Node.js Project

mkdir large-file-upload
cd large-file-upload
npm init -y
npm install express multer fs-extra

4. Server Code for Chunked Uploads

const express = require('express');
const multer = require('multer');
const fs = require('fs-extra');
const path = require('path');

const app = express();
const upload = multer({ dest: 'temp/' }); // Multer writes each incoming chunk here first

// Needed so /merge-chunks can read its JSON body
app.use(express.json());

// Endpoint to handle each chunk
app.post('/upload-chunk', upload.single('chunk'), async (req, res) => {
  const { chunkIndex, fileName } = req.body;
  // basename() strips any directory parts a malicious client might send
  const safeName = path.basename(fileName);
  const chunkPath = path.join(__dirname, 'temp', `${safeName}-${chunkIndex}`);
  await fs.move(req.file.path, chunkPath, { overwrite: true });
  res.send({ status: 'Chunk received' });
});

// Endpoint to merge chunks after all are uploaded
app.post('/merge-chunks', async (req, res) => {
  const { totalChunks, fileName } = req.body;
  const safeName = path.basename(fileName);
  const finalPath = path.join(__dirname, 'uploads', safeName);
  await fs.ensureDir(path.join(__dirname, 'uploads'));
  await fs.remove(finalPath); // don't append onto a leftover file from an earlier run

  for (let i = 0; i < totalChunks; i++) {
    const chunkPath = path.join(__dirname, 'temp', `${safeName}-${i}`);
    const data = await fs.readFile(chunkPath); // each chunk is small (~2MB), so this stays cheap
    await fs.appendFile(finalPath, data);
    await fs.remove(chunkPath); // clean up as we go
  }

  res.send({ status: 'File merged successfully', filePath: finalPath });
});

app.listen(3000, () => console.log('Server running on http://localhost:3000'));


5. Frontend: Splitting Files into Chunks

Here’s an example using JavaScript in the browser:

async function uploadFile(file) {
  const chunkSize = 2 * 1024 * 1024; // 2MB per chunk
  const totalChunks = Math.ceil(file.size / chunkSize);

  for (let i = 0; i < totalChunks; i++) {
    // slice() gives us one 2MB window of the file without copying the rest
    const chunk = file.slice(i * chunkSize, (i + 1) * chunkSize);
    const formData = new FormData();
    formData.append('chunk', chunk);
    formData.append('chunkIndex', i);
    formData.append('totalChunks', totalChunks);
    formData.append('fileName', file.name);

    await fetch('/upload-chunk', { method: 'POST', body: formData });
  }

  // All chunks are on the server; ask it to stitch them together
  await fetch('/merge-chunks', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ totalChunks, fileName: file.name })
  });

  alert('Upload complete!');
}
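
To trigger the upload from the page, wire uploadFile to a file input. The fileInput id below is just a name chosen for this example:

// Assumes an <input type="file" id="fileInput"> element on the page
document.getElementById('fileInput').addEventListener('change', (event) => {
  const file = event.target.files[0];
  if (file) uploadFile(file); // kicks off the chunked upload defined above
});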


6. Benefits of Chunked Uploads

✅ Prevents server crashes by keeping memory usage bounded
✅ Supports resumable uploads after network failures (see the sketch below)
✅ Handles huge files (e.g., 10GB+) without timeout problems
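
Resumability is not implemented by the code above as written. One common approach is to ask the server which chunks it already holds and upload only the missing ones. A minimal sketch, assuming a hypothetical /upload-status endpoint that returns the chunk indexes already sitting in temp/:

async function resumeUpload(file) {
  const chunkSize = 2 * 1024 * 1024;
  const totalChunks = Math.ceil(file.size / chunkSize);

  // Hypothetical endpoint: responds with e.g. { received: [0, 1, 2, 5] }
  const statusRes = await fetch(`/upload-status?fileName=${encodeURIComponent(file.name)}`);
  const { received } = await statusRes.json();
  const done = new Set(received);

  for (let i = 0; i < totalChunks; i++) {
    if (done.has(i)) continue; // this chunk is already on the server, skip it

    const chunk = file.slice(i * chunkSize, (i + 1) * chunkSize);
    const formData = new FormData();
    formData.append('chunk', chunk);
    formData.append('chunkIndex', i);
    formData.append('totalChunks', totalChunks);
    formData.append('fileName', file.name);
    await fetch('/upload-chunk', { method: 'POST', body: formData });
  }

  await fetch('/merge-chunks', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ totalChunks, fileName: file.name })
  });
}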


7. Production Tips

  1. Use NGINX in front of Node.js to handle request buffering efficiently.
  2. Set proper file size limits in Express/multer (see the first sketch below).
  3. Implement upload progress tracking for a better user experience (see the second sketch below).
  4. Consider AWS S3 multipart uploads for cloud storage.
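
For tip 2, multer accepts a limits option that rejects oversized parts before they hit disk. A minimal sketch; the 3MB cap is an assumption, sized slightly above the 2MB chunks the frontend sends, and should be tuned to your chunk size:

const upload = multer({
  dest: 'temp/',
  limits: {
    fileSize: 3 * 1024 * 1024, // reject any single chunk larger than 3MB
  },
});

For tip 3, chunking makes coarse progress reporting almost free: completed chunks over total chunks. The updateProgress helper below is hypothetical; call it as updateProgress(i + 1, totalChunks) after each chunk's fetch in uploadFile resolves:

// Hypothetical UI hook: assumes a <progress max="100"> element on the page
function updateProgress(completedChunks, totalChunks) {
  const percent = Math.round((completedChunks / totalChunks) * 100);
  document.querySelector('progress').value = percent;
}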
