Apr 30, 2026

DEV Community
by Pau Dang
Slashed My Automation Suite from 9 Hours to 1 Hour with This Simple Caching Trick

We've all been there: you build an amazing automation suite, hit "Run", and realize it's going to take until next Tuesday to finish.

Last week, I faced a massive bottleneck in my CI/CD pipeline. Here’s how I optimized a 9-hour test suite down to just 1 hour using a "Base Cache" strategy.

The Problem: The npm install Nightmare

I'm building a Node.js Quickstart Generator. To ensure quality, I have to validate 720 different combinations of technologies (Clean Architecture, MVC, TypeScript, JavaScript, MySQL, MongoDB, Kafka, etc.).

For every single test run, the script was doing a fresh npm install.

  • The Result: 9 hours of total execution time.
  • The Culprit: Redundant network fetches and disk I/O for the same packages over and over again.
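As a sanity check on that figure: the ~45-second average per cold install below is my own assumption, picked only to show how quickly 720 installs compound (the article doesn't give a per-install time):

```javascript
// Back-of-the-envelope: per-install cost times 720 combinations.
const combinations = 720;
const assumedSecondsPerInstall = 45; // hypothetical cold-install average
const totalHours = (combinations * assumedSecondsPerInstall) / 3600;
console.log(`${totalHours} hours`); // "9 hours"
```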

The Solution: Smart "Base Caching"

The mindset shift was simple: Stop treating every project as a unique snowflake.

Even across 720 combinations, roughly 95% of the core dependencies are shared. Here is the 3-step workflow I implemented:

1. Identify the "Base"

I grouped projects by their heaviest dependencies: Language + Database. This resulted in only 8 unique Base Caches (e.g., TypeScript_PostgreSQL).
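The grouping step is easy to sketch. The field names and technology lists below are illustrative (the repo's real config shape may differ), but they show how hundreds of combinations collapse into a handful of base keys:

```javascript
// Collect the unique Base Cache keys (Language + Database) from a
// full list of test combinations.
function uniqueBaseKeys(combinations) {
  const keys = new Set();
  for (const config of combinations) {
    keys.add(`${config.language}_${config.database}`);
  }
  return [...keys];
}

// Illustrative axes: 2 languages x 4 databases x other options.
const combos = [];
for (const language of ['TypeScript', 'JavaScript'])
  for (const database of ['PostgreSQL', 'MySQL', 'MongoDB', 'Redis'])
    for (const architecture of ['clean', 'mvc'])
      combos.push({ language, database, architecture });

const baseKeys = uniqueBaseKeys(combos);
console.log(baseKeys.length); // 8 unique keys, e.g. "TypeScript_PostgreSQL"
```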

2. Bootstrap via Copy

Instead of running npm install from scratch, the script now uses the following logic (full code: https://github.com/paudang/nodejs-quickstart-structure/blob/feature/oauht2-google-github/scripts/lib/validation-core.js#L500):

// Uses the fs-extra package (pathExists, copy) and Node's built-in path
const path = require('path');
const fs = require('fs-extra');

// 1. Define the Base Cache key (Language + DB)
const baseHashKey = `${config.language}_${config.database}`;
const cachePath = path.join(testDir, `node_modules_base_${baseHashKey}`);

// 2. If a base cache exists, copy it in seconds
let usedCache = false;
if (await fs.pathExists(cachePath)) {
    await fs.copy(cachePath, path.join(projectPath, 'node_modules'));
    usedCache = true;
}

// 3. Run npm install (Use --prefer-offline for ultra-fast delta updates)
const npmCmd = usedCache 
    ? 'npm install --prefer-offline --no-audit' 
    : 'npm install --no-audit';

await runCommand(npmCmd, projectPath);

// 4. Save the cache if it's the first run
if (!usedCache) {
    await fs.copy(path.join(projectPath, 'node_modules'), cachePath);
}

Because most of the packages are already in the node_modules folder, npm just performs a quick delta update.

The Results

The optimization was a game-changer:

  • Execution Time: 9 Hours ➡️ 1 Hour.
  • Storage Efficiency: 80 GB (if every combination kept its own full cache) ➡️ 1.2 GB.
  • Dev Experience: I can now run the full validation suite multiple times a day instead of once a week.

Key Takeaways for Developers

  1. Find the Real Bottleneck: Don't micro-optimize your code when network and disk I/O are what's actually killing your productivity.
  2. Think in "Deltas": If you can't cache everything, cache the 90% and compute the rest.
  3. Automate the Automation: Monitor your scripts; if one is slow, treat that as a bug.

How do you handle heavy node_modules in your CI/CD? I'd love to hear your tricks in the comments! 👇

If you found this helpful, feel free to give it a ❤️ and follow for more DevOps and Performance tips!
