Bun is Fast. Your Event Loop is Not.

March 19, 2026 · imkyssa · 5 min read

"Software engineering is more than just centering a div." — howtocenterdiv.com
TLDR: Bun wins benchmarks. Your app still bottlenecks on DB connections, blocking CPU work, and N+1 queries. Switching runtimes before fixing those is optimizing the wrong layer. Migrate bun install and bun test today — safe, immediate wins. Move the runtime only when the profiler tells you to.

Bun vs Node.js Performance: What Actually Matters

Bun is faster than Node.js — that's not marketing. Raw HTTP throughput, startup time, and tooling speed are all measurably better. But in real-world backend applications, performance is almost never limited by those things. It's limited by:
  • database query latency
  • connection pool exhaustion
  • blocking CPU work in request handlers
  • N+1 queries hiding inside loops
If those are your bottlenecks, switching from Node.js to Bun will not improve your latency. The benchmark won't tell you that. This article will.
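The N+1 case is worth seeing concretely. A minimal sketch, with `db.query` stubbed out (a stand-in, not a real client) so the round-trip count is visible:

```javascript
// Stubbed query function — counts round trips instead of hitting a DB.
let queryCount = 0;
const db = { query: async (_sql, _params) => { queryCount++; return []; } };

const userIds = [1, 2, 3, 4, 5];

// N+1: one query per user inside a loop — 5 round trips
for (const id of userIds) {
  await db.query('SELECT * FROM orders WHERE user_id = $1', [id]);
}

// Batched: one round trip for all users
await db.query('SELECT * FROM orders WHERE user_id = ANY($1)', [userIds]);

console.log(queryCount); // 6 — 5 for the loop, 1 for the batch
```

With 500 users the loop costs 500 round trips while the batched version still costs one; the runtime never sees the difference, the network does.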

What Bun Actually Speeds Up

Bun is fast at specific things:
  • Startup time — cold starts are noticeably quicker than Node.js
  • Raw I/O throughput — built on JavaScriptCore and native APIs, file and network I/O is faster
  • Tooling — bun install, bun test, bun build beat npm/jest/webpack by a significant margin
  • Simple HTTP servers — if your handler does minimal work, you'll see the benchmark numbers in production
The benchmark scenario: receive request → do almost nothing → send response. Bun wins that race. For edge functions, lightweight proxies, or static file serving, it's the right call.
Production backend work rarely looks like that.

What the Event Loop Actually Is

Node.js and Bun share the same concurrency model. Single thread, one thing at a time. The Event Loop is what makes this workable — instead of blocking on I/O, it parks the work and picks it back up when the result is ready.
```
Request comes in
 → handler starts
 → hits await db.query(...)
 → Event Loop: "ok, I'll park this, do something else"
 → DB responds
 → Event Loop: "pick this back up"
 → handler continues
 → response sent
```
The key word is parks. The Event Loop doesn't speed up your DB query. It just doesn't sit around waiting for it. That distinction matters more than which runtime you picked.
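The park-and-resume behavior is easy to observe in a few lines. A minimal sketch (the `setTimeout` stands in for `db.query`):

```javascript
const log = [];

async function handler() {
  log.push('handler start');
  await new Promise((r) => setTimeout(r, 10)); // stand-in for await db.query(...)
  log.push('handler resumed');
}

// Start the handler; it runs synchronously until the await, then yields.
const pending = handler();

// This runs while the "query" is still in flight — the thread was never blocked.
log.push('other work runs while the query is pending');

await pending;
console.log(log);
// ['handler start', 'other work runs while the query is pending', 'handler resumed']
```

The `await` is where the Event Loop parks the handler; nothing about it makes the 10ms wait any shorter.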

The Promise.all Problem Nobody Talks About

Here's where experienced developers get burned:
```javascript
// "parallel" — looks efficient
const [users, orders, inventory] = await Promise.all([
  db.query('SELECT * FROM users WHERE active = true'),
  db.query("SELECT * FROM orders WHERE status = 'pending'"),
  db.query('SELECT * FROM inventory WHERE stock < 10'),
]);
```
Three queries fire "at the same time." The Event Loop is not blocked. You feel good about it. Here's what's actually happening:
```
Connection pool size: 10 (default in most PG clients)

Request 1: Promise.all — takes 3 connections
Request 2: Promise.all — takes 3 connections
Request 3: Promise.all — takes 3 connections
Request 4: needs 3 connections, pool has 1 left → waits
Request 5: needs 3 connections, pool has 0 left → waits
...
50 concurrent requests: 150 connections needed, 10 available
 → queue builds
 → p99 spikes
 → you blame the DB
```
The Event Loop is doing its job. The connection pool is the bottleneck. Bun doesn't help here.
| Approach | Connections per request | Scales? |
| --- | --- | --- |
| Sequential awaits | 1 at a time | Yes, but slow |
| Promise.all (3 queries) | 3 simultaneous | Depends on pool size |
| Single JOIN query | 1 | Usually yes |
| Read replica + caching | 1 (cache hit = 0) | Yes |
Three round trips to the DB — even parallel ones — usually loses to one well-written JOIN.
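The pool math above is worth checking explicitly. A back-of-the-envelope sketch, using the hypothetical numbers from the scenario rather than any real deployment:

```javascript
// Demand vs. capacity for the Promise.all pattern (illustrative numbers)
const poolSize = 10;          // default in many PG clients
const queriesPerRequest = 3;  // one Promise.all of three queries
const concurrentRequests = 50;

const connectionsNeeded = queriesPerRequest * concurrentRequests;
const queuedQueries = Math.max(0, connectionsNeeded - poolSize);

console.log({ connectionsNeeded, queuedQueries });
// { connectionsNeeded: 150, queuedQueries: 140 }
```

140 queries waiting in line is where the p99 spike comes from; collapsing three queries into one changes `queriesPerRequest` to 1 and the queue to 40, before any caching.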

When the Event Loop Actually Blocks

It only stays fast if you don't block the thread:
```javascript
import { pbkdf2, pbkdf2Sync } from 'node:crypto';
import { promisify } from 'node:util';

// Blocks everything for the duration — don't do this
const hash = pbkdf2Sync(password, salt, 100000, 64, 'sha512');

// Offloads to the libuv thread pool — do this
// (crypto.pbkdf2 is callback-based, so it needs promisify to be awaited)
const hash2 = await promisify(pbkdf2)(password, salt, 100000, 64, 'sha512');
```
```javascript
// Parsing a 50MB JSON payload — blocks
const data = JSON.parse(hugeString);

// Tight CPU loop — blocks
for (let i = 0; i < 1_000_000_000; i++) { /* ... */ }
```
When the thread is blocked, every other request waits. 500ms of CPU work on one request adds 500ms to every concurrent request. This is where Bun vs Node.js becomes irrelevant. Both runtimes, one thread, same problem.
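You can watch this happen in a few lines: a zero-delay timer can't fire until the synchronous work releases the thread. A minimal sketch:

```javascript
// A 0ms timer should fire almost immediately — but a synchronous loop
// holds the thread, so it fires only after the loop finishes.
const start = Date.now();
const timer = new Promise((resolve) => {
  setTimeout(() => resolve(Date.now() - start), 0);
});

// ~200ms of synchronous CPU work, standing in for hashing or JSON.parse
while (Date.now() - start < 200) {}

const firedAfter = await timer;
console.log(`0ms timer actually fired after ${firedAfter}ms`); // ≈200ms
```

Every concurrent request experiences that same delay, which is exactly the "500ms of CPU work taxes everyone" problem.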
For CPU-heavy work: worker threads.
```javascript
import { Worker } from 'node:worker_threads';

const result = await new Promise((resolve, reject) => {
  const worker = new Worker('./heavy-task.js', { workerData: payload });
  worker.on('message', resolve);
  worker.on('error', reject);
});
```
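For a runnable single-file sketch, the worker source can be inlined with `eval: true` (the summing loop is a stand-in for real CPU-heavy work, not a recommendation to ship inline workers):

```javascript
import { Worker } from 'node:worker_threads';

// Inline worker source so the sketch is self-contained.
const workerSource = `
  const { parentPort, workerData } = require('node:worker_threads');
  let sum = 0;
  for (let i = 0; i < workerData; i++) sum += i;
  parentPort.postMessage(sum);
`;

const result = await new Promise((resolve, reject) => {
  const worker = new Worker(workerSource, { eval: true, workerData: 1_000_000 });
  worker.on('message', resolve);
  worker.on('error', reject);
});

console.log(result); // 499999500000 — computed off the main thread
```

While the worker grinds through the loop, the main thread stays free to serve other requests.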

The Memory Question Benchmarks Skip

Bun runs on JavaScriptCore (JSC). Node.js runs on V8. Both are fast — but garbage collection strategies differ, and under sustained high traffic that shows up in ways throughput numbers don't capture.
V8's GC has been hardened at Google scale for over a decade. Incremental, predictable, tuned for long-running server processes. JSC's GC is excellent for short-lived workloads — browsers, scripts, edge functions. Under sustained high-concurrency server load, it's less predictable:
```
Node.js under load:
 → memory grows, GC kicks in, memory drops
 → predictable sawtooth

Bun under sustained load:
 → memory can grow faster between GC cycles
 → RSS climbs without dropping in some workloads
 → you restart the process, it's fine — until it isn't
```
This is an active area of Bun development, not a permanent flaw. But if your app holds large in-memory objects, processes heavy JSON volumes, or runs near its memory ceiling — watch RSS and heap over time, not just p50 latency. A runtime that's 2x faster but needs restarting every 6 hours is not an upgrade.
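Watching RSS and heap over time doesn't require tooling. A minimal sketch (the 60-second interval is illustrative; in practice you'd ship these numbers to your metrics system):

```javascript
// Log RSS and heap periodically — the trend matters, not any single sample.
const timer = setInterval(() => {
  const { rss, heapUsed } = process.memoryUsage();
  console.log(
    `rss=${(rss / 1e6).toFixed(1)}MB heapUsed=${(heapUsed / 1e6).toFixed(1)}MB`
  );
}, 60_000);

timer.unref(); // don't keep the process alive just for monitoring
```

A flat sawtooth is healthy; an RSS line that only ever climbs is the "restart every 6 hours" pattern showing up early.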

So When Does Bun Actually Matter?

| Workload | Bottleneck | Bun helps? |
| --- | --- | --- |
| CRUD API with DB queries | DB + network I/O | Marginally |
| File upload/download service | I/O throughput | Yes |
| Edge function / lightweight proxy | Raw HTTP | Yes, noticeably |
| CPU-intensive processing | Single thread | No |
| Dev tooling (install, build, test) | Startup + I/O | Yes, significantly |
| High-concurrency with connection pooling | Pool exhaustion | No |
Bun is not a fix for a slow application. It's a boost for applications that are already fast and need to go faster at the I/O layer.

Before You Switch Runtimes

```
1. Where does your p99 latency come from?
   → Profile first. Don't guess.

2. Are your DB queries using indexes?
   → EXPLAIN ANALYZE before anything else.

3. What's your connection pool size vs concurrent request count?
   → pool_size = (core_count * 2) + effective_spindle_count

4. Are you doing synchronous CPU work in request handlers?
   → Move it to workers.

5. Are you making N+1 queries inside loops?
   → Fix this before touching the runtime.
```
Work through that list. If the bottleneck is genuinely raw I/O throughput after all of that — Bun is worth evaluating. Otherwise, you're optimizing the wrong layer.
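Point 3's formula takes seconds to compute. A sketch, treating `effective_spindle_count = 1` as an assumption that's common for SSD-backed databases:

```javascript
import os from 'node:os';

// pool_size = (core_count * 2) + effective_spindle_count
// effective_spindle_count = 1 is a frequent assumption for SSDs.
const coreCount = os.cpus().length;
const effectiveSpindleCount = 1;
const poolSize = coreCount * 2 + effectiveSpindleCount;

console.log(`suggested starting pool size: ${poolSize}`);
```

Note this sizes the DB server's capacity, so the number must be shared across every app instance connecting to it, not applied per process.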

The Migration Path

bun install and bun test are safe today, zero risk. The DX improvement is real and immediate. Running your full server on Bun in production is a separate decision.
Check your dependencies first. Some native Node.js modules don't work on Bun yet. Run bun run index.ts and see what breaks. Compatibility is improving fast but it's not 100%.
For greenfield projects: Bun is becoming a reasonable default. For existing Node.js apps: migrate the tooling first, benchmark the server second, move the runtime only if the numbers justify it.

Bun is genuinely fast. The benchmark isn't lying. But most applications need fewer queries, smarter caching, and connection pools that aren't exhausted — not a faster runtime. Fix those first. Then talk about Bun.

FAQ

Is Bun faster than Node.js? Yes — in raw HTTP throughput, startup time, and tooling. In real-world backend apps with DB queries and connection pools, the difference is often negligible.
Should I switch from Node.js to Bun? Only if your profiler shows I/O throughput as the bottleneck. Most apps are limited by database latency and CPU work, not the runtime.
Does Bun improve database performance? No. Database latency is independent of the JavaScript runtime. Switching to Bun won't make your queries faster.
Is Bun production-ready in 2026? For many workloads, yes. Ecosystem compatibility still has gaps — test your dependencies before migrating.
Where does Bun actually win? Edge functions, lightweight proxies, static file serving, and dev tooling. These are workloads where raw I/O and startup time are the actual bottleneck.
