Deeper testing of Bun's performance and compatibility against Node.js

; Date: Fri Jul 22 2022

Tags: Node.JS

Bun is a new project aiming to be compatible with Node.js, but with huge performance gains. Not even a month into public availability, and people are claiming both the Node.js and Deno projects are dead.

What does it take to "kill" a software platform? People are still using COBOL, for example, and how many predictions have been made about the death of Perl, PHP, or Java?

The Bun project makes big claims that it would be compatible with the Node.js platform while giving huge performance benefits. If true, that could easily sway a lot of software engineers to abandon Node.js. But that would take several years to unfold. Node.js is in a strong mature position and the Bun project has a lot of work to do before it could fully supplant Node.js.

But, what will happen when Bun becomes stable and mature enough to run complex applications currently running on Node.js?

What I'm interested in doing is evaluating whether Bun can run my applications as well as Node.js. I'm surely not the only person with those questions.

These claims have caught the attention of a lot of people. I've seen on YouTube a bunch of people running simple performance tests then crowing about how these PROVE that Bun is a lot faster than Node.js.

Of course, simple performance tests do not prove very much. As I say later, it's a fallacy to put a lot of weight on a simple test.

That led me to attempt to run a test of a complex application on Bun. Namely, I've developed a static website generator (AkashaCMS) which I use to build several websites, and it is complex enough to provide a good test scenario. If Bun could successfully run AkashaCMS, with higher performance, that would support the claims made by the Bun team. The idea was sound, but the result was flawed.

I wrote an article under the belief that I'd managed to render my website using Bun, when it seems that instead I was accidentally running Node.js and my test was invalid.

This article is meant to revisit what I tried to do, and to present some more carefully constructed performance tests. Along the way I've found some bugs in Bun, which have been filed in the Bun issue queue.

Currently the bugs and incomplete features in Bun prevent using it to run AkashaCMS. But I've been able to execute parts of it using Bun. As we'll see this shows performance gains in some areas.

What is Bun, versus Node.js

Node.js is a platform for server-side execution of JavaScript applications. It came on the scene in 2009 making big claims about how a single threaded event-driven architecture could offer system performance benefits over typically complex thread-based architectures.

Node.js is JavaScript running outside of the browser. Since 2009, a large ecosystem of tools and frameworks have grown around Node.js spanning all kinds of software development tools, web application frameworks, database ORM layers, and even GUI application toolkits.

Bun is like Node.js, but different. Where Node.js is based on the V8 engine from Chrome, and is written in C++, Bun is based on JavaScriptCore from Safari, and is written in Zig. You had probably never heard of Zig before hearing of Bun, and neither had I, but Zig claims to have many benefits over other system programming languages like C++. Otherwise, Bun aims to fulfill the Node.js use case, which is to support running modern JavaScript outside the web browser.

This isn't the first attempt to run Node.js on a different JavaScript engine. A few years ago there was an attempt to run Node.js on top of ChakraCore, a project hosted on GitHub, which was abandoned when Microsoft dropped ChakraCore from Edge.

Bun's selling points are:

  1. Compatibility with Node.js including direct use of packages (even native code packages) for Node.js.
  2. Huge performance benefits: a) It is written in Zig, rather than C++, which has some benefit; b) It is built on the JavaScriptCore engine. That engine is supposed to be faster than the V8 engine at the heart of both Node.js and Deno.
  3. It supports direct execution of TypeScript and JSX code.

If the Bun team is able to fully implement these selling points, I imagine that many in the Node.js/Deno community will switch to Bun.

But, from a practical matter Node.js has had 12 years of refinement, improvements, and bug fixing. Bun, as we'll see, has a lot of catching up to do.

Any experienced software engineer has probably used many different programming tools and platforms in their career. We are constantly evaluating which tools to use, and most of us are wise enough to see the claims made by the Bun team and know that they're making some awfully big claims. I sure hope those claims pan out, but at this phase it's too early to tell.

Bun could kill Node.js and/or Deno

There are already people claiming that Bun will kill both the Deno and Node.js projects. The rationale is what I just named: If Bun is able to implement all of Node.js with a high degree of compatibility, while maintaining huge performance benefits, then they clearly have a winner.

Because Bun embraces the node_modules infrastructure, it has a big advantage over Deno, which has a hard time leveraging the hundreds of thousands of packages available that way. That's a valuable resource we've all built together, and Bun will be able to fully leverage it.

For example, one can use npm install to set up a node_modules directory and then immediately use it with Bun. The goal is to use bun install with the same package.json to do the same thing, but with greater performance.

Bun won't be able to kill Node.js immediately. As we see below, there are many missing features and many bugs to fix. And, there are all the processes and logistical support that must be developed for Bun to become a self-sustaining project.

What will Node.js developers do with an alternative that's compatible with the existing ecosystem, but much faster?

The trap of simple performance tests

Already there are several videos on YouTube giving Bun a first try. Every video I've watched shows them running a few simple commands, and saying gosh wow this is so fast.

There is a well known fallacy of an overly simplistic performance test. Does running a simple script with Bun mean it is hugely faster than Node.js in real applications? That's the fallacy. To verify that Bun is indeed faster requires more in-depth testing than a few simple examples.
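To make the fallacy concrete, here is a hypothetical sketch of the kind of naive measurement seen in those videos, using only Node's perf_hooks module. (The scenario name and iteration count are arbitrary; this is my illustration, not code from any of those videos.) A loop like this measures JIT warmup and timer overhead as much as anything else:

```javascript
// A naive microbenchmark of the sort shown in quick Bun-vs-Node videos.
// It mixes interpreter time, JIT warmup, and GC pauses into one number.
import { performance } from 'perf_hooks';

function naiveBench(label, fn, iterations = 100000) {
    const start = performance.now();
    for (let i = 0; i < iterations; i++) fn();
    const elapsed = performance.now() - start;
    console.log(`${label}: ${((elapsed / iterations) * 1e6).toFixed(1)} ns/iter`);
    return elapsed;
}

const people = ['geddy', 'neil', 'alex'];
naiveBench('join', () => people.join(', '));
```

The first iterations run in the interpreter, later ones in optimized JIT code, and garbage collection can fire at any moment. Frameworks like Mitata exist precisely to control for those effects, which is one reason a few quick shell commands prove so little.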

Bun's existing performance tests

The Bun source tree includes a suite of benchmark tests. To run these tests:

$ git clone 
$ cd bun/bench
$ bun install
$ bun run ffi
$ bun run log
$ bun run gzip
$ bun run async
$ bun run sqlite

I don't have the space to show the full set of results. But, let's look at the SQLite tests:

david@davidpc:~/Projects/bun/bun/bench/sqlite$ bun run bench
$ bun run bench:bun && bun run bench:node && bun run bench:deno
$ $BUN bun.js
[0.02ms] ".env"
cpu: Intel(R) Core(TM) i7-5600U CPU @ 2.60GHz
runtime: bun 0.1.4 (x64-linux)

benchmark                        time (avg)             (min … max)
SELECT * FROM "Order"         43.62 ms/iter   (40.67 ms … 47.89 ms)
SELECT * FROM "Product"      121.84 µs/iter  (87.83 µs … 928.85 µs)
SELECT * FROM "OrderDetail"  499.15 ms/iter  (470.1 ms … 620.22 ms)
$ $NODE node.mjs
cpu: Intel(R) Core(TM) i7-5600U CPU @ 2.60GHz
runtime: node v18.6.0 (x64-linux)

benchmark                        time (avg)             (min … max)
SELECT * FROM "Order"        108.33 ms/iter (106.17 ms … 113.98 ms)
SELECT * FROM "Product"       318.2 µs/iter (285.53 µs … 775.32 µs)
SELECT * FROM "OrderDetail"     2.13 s/iter       (2.02 s … 2.37 s)
$ $DENO run -A --unstable deno.js
cpu: Intel(R) Core(TM) i7-5600U CPU @ 2.60GHz
runtime: deno 1.23.4 (x86_64-unknown-linux-gnu)

benchmark                        time (avg)             (min … max)
SELECT * FROM "Order"         274.7 ms/iter (263.29 ms … 342.62 ms)
SELECT * FROM "Product"      490.34 µs/iter   (377.47 µs … 7.49 ms)
SELECT * FROM "OrderDetail"      1.6 s/iter       (1.43 s … 2.12 s)

This particular benchmark makes SELECT queries against a SQLite database. The Node.js test uses better-sqlite3, and the Deno test uses a third-party SQLite module. By contrast, Bun uses its own SQLite implementation directly integrated in the Bun sources.

These are impressive performance differences.

Incompletenesses in Bun hindering deeper testing

My goal is running larger applications to assess the compatibility and performance of Bun. In my case that application is AkashaCMS, the static website generator I use for several websites. AkashaCMS does server-side DOM processing using Cheerio, it uses various template engines (EJS and Nunjucks primarily), and more. In other words, it should provide a good test case for Bun's compatibility with Node.js.

But, there are many compatibility issues:

$ bun ./node_modules/akasharender/cli.js copy-assets config.js 

error: Cannot find package "child_process" from "/home/david/ws/"

The akasharender command uses Commander to parse arguments, and it cannot work because the child_process package does not exist. Commander is an extremely popular package for writing CLI tools in Node.js. Missing the child_process built-in package prevents any tool built around Commander from running.
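One quick way to see which built-in modules a runtime claims to provide is Node's module.builtinModules list. A small sketch (assuming the runtime implements the module module at all):

```javascript
// Report whether specific Node.js built-in modules are available in
// this runtime. On Node.js all four print "present"; the error above
// shows Bun was missing child_process at this point.
import { builtinModules } from 'module';

for (const name of ['fs', 'path', 'child_process', 'net']) {
    console.log(name, builtinModules.includes(name) ? 'present' : 'missing');
}
```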

FWIW, the Bun issue queue contains Bun's roadmap, listing a bunch of features which haven't been implemented.

I had hoped to simply use AkashaCMS to evaluate Bun. Since that can't be done, the next best solution is to select parts of AkashaCMS with which to do the evaluation.

Text processing with template engines, performance and compatibility between Bun and Node.js

The code discussed here is in the akashacms/akashacms-perftest repository.

The akashacms/akashacms-perftest repository is used for performance testing of AkashaCMS. In the bench directory I intend to create some benchmark-like tests for certain features of AkashaCMS.

For example, consider:

import { bench, run } from "mitata";

let people = ['geddy', 'neil', 'alex'];


bench('literal', () => { return `${people.join(', ')}`; });

// EJS

import * as ejs from 'ejs';

bench('ejs-join', () => {
    ejs.render('<%= people.join(", "); %>', { people: people });
});

bench('ejs-list', () => {
    ejs.render(`<% people.forEach(function (person) {
        %><li><%= person %></li><%
    }) %>`, { people: people });
});

await run();

The Bun project uses the Mitata benchmark execution framework. To help with comparing results from tests with Bun, I'll use Mitata as well.

The first test is to benchmark text substitution in a template string.

The second is to use a couple of scenarios with the EJS template engine. In both cases it takes an array of values and formats that array in a couple of ways. I've implemented similar code for multiple template engines, as will be evident from the results below.

Two additional tests are:

  • markdown-render which uses the MarkdownIT package to process Markdown.
  • cheerio for server-side DOM processing using cheerio, a major AkashaCMS feature.

The result shows these performance differences:

$ npm run bench

> bench@1.0.0 bench
> npm-run-all render:node render:bun

> bench@1.0.0 render:node
> node render-node.mjs

cpu: Intel(R) Core(TM) i7-5600U CPU @ 2.60GHz
runtime: node v18.6.0 (x64-linux)

benchmark            time (avg)             (min … max)
literal          133.73 ns/iter (119.76 ns … 661.32 ns)
ejs-join          18.15 µs/iter  (14.89 µs … 583.95 µs)
ejs-list           30.6 µs/iter  (25.98 µs … 398.45 µs)
handlebars-join    5.95 µs/iter   (4.78 µs … 371.31 µs)
handlebars-list    5.97 µs/iter   (4.76 µs … 411.31 µs)
liquid-join       30.26 µs/iter    (18.04 µs … 3.66 ms)
liquid-list       91.91 µs/iter     (64.02 µs … 1.3 ms)
nunjucks-join     46.25 µs/iter    (26.71 µs … 1.09 ms)
nunjucks-list     93.28 µs/iter    (65.62 µs … 1.22 ms)
markdown-render   38.78 µs/iter  (28.93 µs … 603.53 µs)
cheerio          130.17 µs/iter    (78.42 µs … 5.04 ms)

> bench@1.0.0 render:bun
> bun render-bun.js

cpu: Intel(R) Core(TM) i7-5600U CPU @ 2.60GHz
runtime: bun 0.1.4 (x64-linux)

benchmark                 time (avg)             (min … max)
literal               129.64 ns/iter (107.87 ns … 534.11 ns)
handlebars-join-once    4.34 µs/iter     (3.07 µs … 1.17 ms)
handlebars-list-once    4.66 µs/iter     (3.44 µs … 1.22 ms)
liquid-join            38.63 µs/iter    (21.23 µs … 2.63 ms)
liquid-list           125.84 µs/iter    (87.41 µs … 2.12 ms)
cheerio                68.81 µs/iter    (43.25 µs … 2.09 ms)

There are two things to take from this:

  1. I wasn't able to implement all scenarios on both Node.js and Bun
  2. There is a significant performance gain for the Cheerio test, and less significant for the others

Why couldn't the full set of scenarios be implemented on both? There were segmentation faults for certain scenarios under Mitata. An issue has been filed in Bun's issue queue describing the problem.

For certain template engines the combination of Mitata plus that template engine caused a segmentation fault. A script running just the template engine with no Mitata code executed correctly.

As far as performance, Bun shows performance gains for the scenarios which work on both platforms. The gain in Cheerio performance is very interesting.

Chokidar exposes issues

Chokidar is a popular package for scanning directory trees and dynamically noticing changes. In AkashaCMS, it is used to notice when files are changed and to do automatic rebuilds. It plays a core role, and I wanted to know if there was any difference in execution time to scan a given directory between Node.js and Bun.

Tests for this are in the stacked-directories package within the AkashaCMS project.

A test case is this:

import { inspect } from 'util';
import { default as chokidar } from 'chokidar';

let watcher;

const start = new Date();
let count = 0;

try {
    await new Promise((resolve, reject) => {
        try {
            watcher = chokidar.watch(process.argv[2]);
            watcher
            .on('error', async (error) => {
                reject(error);
            })
            .on('add', (fpath, stats) => {
                // console.log(`add ${fpath} ${inspect(stats)}`);
                count++;
            })
            .on('change', (fpath, stats) => {
                // console.log(`change ${fpath} ${inspect(stats)}`);
            })
            .on('ready', async () => {
                // console.log(`ready`);
                await close();

                const finish = new Date();

                console.log(`time ${(finish - start) / 1000} seconds - ${count} files`);
                resolve();
            });
        } catch (err) { reject(err); }
    });
} catch (errr) { console.error(errr); }

async function close() {
    await watcher.close();
    watcher = undefined;
}
This uses Chokidar to scan a directory passed on the command line. For my test I had it scan the node_modules directory to make sure there were lots of files to scan. Chokidar emits several events depending on what happens in the filesystem it is scanning. The ready event is emitted when it has completed the initial scan. We use that event to close the Chokidar instance and calculate the elapsed time.

This means it could serve as a performance comparison between Node.js and Bun, but as we'll see Bun cannot execute Chokidar.

With Node.js:

$ node choke.mjs ../../node_modules/
time 0.763 seconds - 3498 files

But, with Bun the test doesn't succeed:

$ bun choke.mjs ../../node_modules/
241 |     );
242 |   }
243 | 
244 |   filterPath(entry) {
245 |     const {stats} = entry;
246 |     if (stats && stats.isSymbolicLink()) return this.filterDir(entry);
 TypeError: stats.isSymbolicLink is not a function. (In 'stats.isSymbolicLink()', 'stats.isSymbolicLink' is undefined)
      at filterPath (/home/david/Projects/akasharender/stacked-directories/node_modules/chokidar/index.js:246:17)
      at /home/david/Projects/akasharender/stacked-directories/node_modules/readdirp/index.js:141:79

What this means is that the Stats object returned by fs.stat does not contain the isSymbolicLink function. This function has existed since Node.js v10.10.0, and certainly should be there.
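The missing method is easy to probe directly, using nothing but the standard fs module. On Node.js this prints 'function'; under Bun 0.1.4 the error above shows it was undefined:

```javascript
// Inspect the Stats object returned by fs.lstatSync for the method
// Chokidar depends on.
import { lstatSync } from 'fs';

const stats = lstatSync('.');
console.log(typeof stats.isSymbolicLink);  // 'function' on Node.js
console.log(stats.isSymbolicLink());       // false: '.' is a directory, not a symlink
console.log(stats.isDirectory());          // true
```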

The problem has already been fixed following my bug report. Note that at the bottom of that issue there is a link to an issue tracking implementation of more Stats object methods. As of this writing Bun 0.1.5 has shipped, and the isSymbolicLink function now exists.

Unfortunately there is a new error occurring with Chokidar.

114 |         sysPath.resolve(path, evPath), KEY_LISTENERS, sysPath.join(path, evPath)
115 |       );
116 |     }
117 |   };
118 |   try {
119 |     return, options, handleEvent);
TypeError: is not a function. (In ', options, handleEvent)', '' is undefined)
      at createFsWatchInstance (/home/david/Projects/akasharender/stacked-directories/node_modules/chokidar/lib/nodefs-handler.js:119:11)
      at /home/david/Projects/akasharender/stacked-directories/node_modules/chokidar/lib/nodefs-handler.js:166:14
      at _watchWithNodeFs (/home/david/Projects/akasharender/stacked-directories/node_modules/chokidar/lib/nodefs-handler.js:331:13)
      at _handleFile (/home/david/Projects/akasharender/stacked-directories/node_modules/chokidar/lib/nodefs-handler.js:395:17)
      at /home/david/Projects/akasharender/stacked-directories/node_modules/chokidar/lib/nodefs-handler.js:637:15

Indeed, we can verify that does not exist, like so:

import * as fs from 'fs';

console.log(;

On Node.js this prints a Function object, but on Bun it currently prints undefined. I filed an issue about this as well.


One of the buns in the oven, per the priority list, is described this way:

  1. Improve Node.js compatibility. Express needs to work. Popular packages like chalk, debug, discord.js need to work. child_process and net need to be implemented. We need more tests. These packages should be supported via implementing the lower-level Node.js APIs and not by hacking in compatibility layers. Longer-term, Bun might want to implement Express in native code, but today lets just get the primitives working well.

And, yes, the above clearly demonstrates that Node.js compatibility is lacking currently. But, that's just a matter of fixing bugs or adding missing features. As the priority list says, having more people working on the project will help clear up these issues.

If the Bun project will allow me to suggest a couple things:

  • An objective and precise list of compatibility tests. Put another way, it would be excellent to have a list of core Node.js modules, along with the current implementation status of each.
  • Performance measurements should be tracked in a database so we can watch improvements over time.
  • There should be many more performance measurements than there are currently. (I should think about contributing the scenarios I've described above.)

The first suggestion, a Node.js API compatibility test, comes from my experience of having worked 10+ years in the Java SE team at Sun Microsystems. (See my article about my transition from Java to Node.js.) There are multiple implementations of Java created by multiple organizations for multiple platforms. Compatibility between the multiple Java implementations is guaranteed by the corresponding TCKs (Technology Compatibility Kits), which test compatibility, or more precisely conformance to a specification.

The Bun team is setting out to create a second implementation of the Node.js API. That puts it in the same role as the independently developed Java implementations.

We, as the potential customers of either Node.js or Bun, need to know the degree of compatibility between the two. Our job may be to maintain the website of a multi-billion-dollar business, and choosing badly between the two could kill that business.

In the Java ecosystem, the conformance/compatibility tests were a big step toward establishing credibility. Passing those tests was a big deal, and allowed a project or product to call itself Java Compatible.

No such test suite exists for Node.js. Nor is there a formal specification of the Node.js API. The Node.js API documentation is good, but it is not a formal specification. In the Java ecosystem, the conformance tests are written by closely parsing details in the specification.

Over time there could be other attempts to build independent implementations of the Node.js API. Like the Bun team today, those teams will face the same issue of determining the degree of compatibility with Node.js.

In theory, what's required is for a team to comb through the Node.js API documentation. Conformance tests could be developed, ideally as an independent project that could be used by both the Node.js and Bun teams.

However, that would require a lot of work, and who has the funding to pay for such a test suite?

As it stands the Bun team will create whatever tests they feel are appropriate for Bun. If I worked in the Bun team, I'd be looking for a way to leverage tests developed by the Node.js team. I might even do that in a separate repository. The goal would be for both the Bun and Node.js teams to jointly collaborate on a test suite of the Node.js API.

An idea which came to mind is that someone could develop a website or GitHub repository through which project owners could declare "Works With Bun". A growing list of "Works With Bun" packages would help make the case that Bun is a good choice. But by what criteria would the owner of such an initiative validate those claims?

Maybe I digressed too far.

This article has demonstrated two things:

  1. It's too early to attempt to run significant applications on Bun, because of the many missing features.
  2. There are performance gains depending on the feature.

The bottom line is that Bun will succeed if the Node.js community pitches in. There is a lot of work to do before Bun can be recommended for significant applications, but the performance gains are very promising.

About the Author(s)

David Herron: David Herron is a writer and software engineer focusing on the wise use of technology. He is especially interested in clean energy technologies like solar power, wind power, and electric cars. David worked for nearly 30 years in Silicon Valley on software ranging from electronic mail systems, to video streaming, to the Java programming language, and has published several books on Node.js programming and electric vehicles.

Books by David Herron