Bun vs Node Performance

coding performance bun nodejs benchmarking

Drop-In Node to Bun Performance Evaluation

Related: Fooselo - Project

Overview

This post captures performance measurements from migrating a Node.js project to Bun as a drop-in replacement. The goal is to evaluate real-world performance differences across common development tasks.

Goals

Measure and compare performance of:

  1. Install packages (cold & warm cache)
  2. Run unit tests
  3. Build client application
  4. Generate API client

Metrics Captured

  • Median runtime (seconds) and standard deviation (via repeated runs)
  • Wall-clock runtime per run
  • Peak memory (Maximum resident set size in MB)
  • Cache state (cold = cleared caches, warm = after first run)
  • Environment metadata (OS, CPU, RAM, Node version, pnpm version, Bun version, date/time); see the capture sketch below
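For reproducibility, this metadata can be captured with a short shell snippet (a sketch; the results/ output directory is an assumption, not part of the project):

# Sketch: record environment metadata alongside the benchmark results
mkdir -p results
{
  date --iso-8601=seconds
  uname -sr                                    # OS and kernel
  lscpu | grep 'Model name'                    # CPU model
  free -h | awk '/^Mem:/ {print "RAM: " $2}'   # total RAM
  node --version
  pnpm --version
  bun --version
} > results/environment.txt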

Environment

Baseline (Node.js + pnpm)

  • Date: 2026-01-08
  • OS: Fedora 43 (Linux kernel 6.17.12-300.fc43.x86_64)
  • CPU: 11th Gen Intel(R) Core(TM) i7-11800H @ 2.30GHz (8 cores, 16 threads)
  • RAM: 31GB
  • Node.js: v25.0.0
  • pnpm: 10.20.0
  • Disk: NVMe SSD

After Migration (Bun)

  • Date: 2026-01-08
  • Bun: 1.3.5
  • (Other environment same as baseline)

Measurement Methodology

Tools Used

  • hyperfine: For stable repeated timings with statistical analysis
    • Install: sudo dnf install hyperfine or cargo install hyperfine
  • GNU time: For memory usage measurements
    • Command: /usr/bin/time -v

Hyperfine Example
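A representative invocation might look as follows (a sketch; the JSON export path is an assumption):

# 3 warmup runs, 10 measured runs; statistics (median, mean, stddev) exported as JSON
hyperfine --warmup 3 --runs 10 \
  --export-json results/pnpm-install-warm.json \
  'pnpm install'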

Process

  1. Run each command multiple times (warmup: 3, runs: 10)
  2. Capture median, mean, and standard deviation (hyperfine)
  3. Capture peak memory for at least one representative run (/usr/bin/time -v; see the sketch after this list)
  4. Test both cold (cleared caches/node_modules) and warm (cached) scenarios
  5. Record exact command lines used
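Peak memory comes from GNU time's verbose report, which lists the maximum resident set size in kilobytes; dividing by 1024 yields the MB values reported in the tables below (a sketch):

# GNU time writes its report to stderr; extract peak RSS and convert kB to MB
/usr/bin/time -v pnpm install 2> time.log
awk '/Maximum resident set size/ {printf "%.2f MB\n", $6/1024}' time.log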

Commands Run

Use the automated script: scripts/bench/measure.sh (a sketch of such a wrapper follows the command list)

# Baseline (pnpm)
./scripts/bench/measure.sh "pnpm-install-cold" "rm -rf node_modules && pnpm install"
./scripts/bench/measure.sh "pnpm-install-warm" "pnpm install"
./scripts/bench/measure.sh "pnpm-test-client" "pnpm nx test client --skip-nx-cache"
./scripts/bench/measure.sh "pnpm-build-client" "pnpm nx build client"
./scripts/bench/measure.sh "pnpm-codegen" "pnpm nx run generated-api-client:codegen"

# After Bun migration
./scripts/bench/measure.sh "bun-install-cold" "rm -rf node_modules && bun install"
./scripts/bench/measure.sh "bun-install-warm" "bun install"
./scripts/bench/measure.sh "bun-test-client" "bunx nx test client --skip-nx-cache"
./scripts/bench/measure.sh "bun-build-client" "bunx nx build client"
./scripts/bench/measure.sh "bun-codegen" "bunx nx run generated-api-client:codegen"

Results

Package Installation

| Variant | Cache State | Median (s) | Stddev (s) | Peak RSS (MB) | Notes |
|---------|-------------|------------|------------|---------------|-------|
| pnpm    | cold        | 4.023      | 0.103      | 679.40        | 1544 packages |
| pnpm    | warm        | 0.837      | 0.015      | 216.61        | Already up to date |
| bun     | cold        | 1.621      | 0.027      | 224.01        | 3085 packages installed |
| bun     | warm        | 0.178      | 0.006      | 205.94        | No changes |

Cold Install Speedup: 2.5x faster (59% time reduction)
Warm Install Speedup: 4.7x faster (79% time reduction)
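All speedup figures in this post are the ratio of the baseline median to the Bun median (e.g. 4.023 s / 1.621 s ≈ 2.5 for cold installs). Given the JSON exports from the sketch above, the ratio can be derived directly (hypothetical file names):

# Speedup = baseline median / Bun median
jq -s '.[0].results[0].median / .[1].results[0].median' \
  results/pnpm-install-cold.json results/bun-install-cold.json

The same computation applies to the test, build, and codegen tables below.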

Unit Tests (client)

| Runtime   | Median (s) | Stddev (s) | Peak RSS (MB) | Notes |
|-----------|------------|------------|---------------|-------|
| pnpm/node | 5.496      | 0.138      | 230.44        | 53 tests passed |
| bun       | 4.460      | 0.191      | 225.08        | 53 tests passed |

Test Speedup: 1.2x faster (18% time reduction)

Build Client

| Runtime   | Median (s) | Stddev (s) | Peak RSS (MB) | Notes |
|-----------|------------|------------|---------------|-------|
| pnpm/node | 1.934      | 0.032      | 188.75        | With Nx cache |
| bun       | 0.934      | 0.015      | 189.83        | With Nx cache |

Build Speedup: 2.1x faster (52% time reduction)

Generate API Client

| Runtime   | Median (s) | Stddev (s) | Peak RSS (MB) | Notes |
|-----------|------------|------------|---------------|-------|
| pnpm/node | 3.178      | 0.062      | 186.36        | Orval codegen |
| bun       | 2.147      | 0.027      | 185.47        | Orval codegen |

Codegen Speedup: 1.5x faster (32% time reduction)

Follow-Up Steps

Clearly, the main potential for reducing unit test execution time lies not in swapping the runtime but in the test runner itself. Bun ships its own runner, bun:test, as an alternative to Vitest or Jest. The next step is to migrate the project to bun:test and update this evaluation.
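Bun's runner is invoked through the CLI; a minimal sketch (the client filter is an assumption about the project layout):

# Run all tests with Bun's built-in runner; an optional filter narrows the run
bun test
bun test client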

The outlook is promising:

(Image: bun:test benchmark results)

Sources: x/colinhacks

References

  • Bun Test Docs