Getting started with asynchronous read/write between Node.js 10 and MongoDB

By: David Herron; Date: May 23, 2018

Tags: Node.js » Asynchronous Programming

Node.js and MongoDB work very well together, marrying the JavaScript programming language with a JavaScript-oriented database engine. In this article we'll look at usage patterns for asynchronous coding in Node.js using async functions against the MongoDB database. Async functions are a powerful new feature in JavaScript, and we want to use them as widely as possible. They give us cleaner database code because results land in a convenient place.

We will be using Node.js 10 along with the official MongoDB driver (https://www.npmjs.com/package/mongodb). While Mongoose is a fine piece of software, the MongoDB driver is an excellent way to interact with the database.

Out of the box the MongoDB driver supports returning Promise objects (or using the Callback paradigm) making it a natural fit for async functions.
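
For example, the same connect call can be written either way. Here is a minimal sketch, assuming the mongodb package is installed (covered below) and a local test instance is running on the default port:

const mongodb = require('mongodb');
const MongoClient = mongodb.MongoClient;
const url = 'mongodb://localhost:27017';  // assumed local test instance

// Callback style: pass a callback as the last argument
MongoClient.connect(url, (err, client) => {
    if (err) throw err;
    console.log('connected (callback style)');
    client.close();
});

// Promise style: leave out the callback and await the result
async function connect() {
    const client = await MongoClient.connect(url);
    console.log('connected (promise style)');
    await client.close();
}
connect().catch(console.error);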

Setup

We need a MongoDB instance to play with. It's easy enough to set up a test instance on your laptop. Downloadable builds of the MongoDB Community Edition are readily available and easy to install; see the instructions at: https://docs.mongodb.com/manual/installation/

Once installed, it's not necessary to set up a background instance of MongoDB. It's easy to run it in the foreground as needed.

Run these commands in one terminal window:

$ mkdir mongo
$ cd mongo
$ mkdir data
$ mongod --dbpath data
2018-05-23T23:05:40.195-0700 I CONTROL  [initandlisten] MongoDB starting : pid=13212 port=27017 dbpath=data 64-bit host=MacBook-Pro-4
2018-05-23T23:05:40.196-0700 I CONTROL  [initandlisten] db version v3.4.10
2018-05-23T23:05:40.196-0700 I CONTROL  [initandlisten] git version: 078f28920cb24de0dd479b5ea6c66c644f6326e9
2018-05-23T23:05:40.196-0700 I CONTROL  [initandlisten] OpenSSL version: OpenSSL 1.0.2o  27 Mar 2018
2018-05-23T23:05:40.196-0700 I CONTROL  [initandlisten] allocator: system
2018-05-23T23:05:40.196-0700 I CONTROL  [initandlisten] modules: none
2018-05-23T23:05:40.196-0700 I CONTROL  [initandlisten] build environment:
2018-05-23T23:05:40.196-0700 I CONTROL  [initandlisten]     distarch: x86_64
2018-05-23T23:05:40.196-0700 I CONTROL  [initandlisten]     target_arch: x86_64
2018-05-23T23:05:40.196-0700 I CONTROL  [initandlisten] options: { storage: { dbPath: "data" } }
2018-05-23T23:05:40.197-0700 I STORAGE  [initandlisten] wiredtiger_open config: create,cache_size=7680M,session_max=20000,eviction=(threads_min=4,threads_max=4),config_base=false,statistics=(fast),log=(enabled=true,archive=true,path=journal,compressor=snappy),file_manager=(close_idle_time=100000),checkpoint=(wait=60,log_size=2GB),statistics_log=(wait=0),
2018-05-23T23:05:40.994-0700 I CONTROL  [initandlisten] 
2018-05-23T23:05:40.994-0700 I CONTROL  [initandlisten] ** WARNING: Access control is not enabled for the database.
2018-05-23T23:05:40.994-0700 I CONTROL  [initandlisten] **          Read and write access to data and configuration is unrestricted.
2018-05-23T23:05:40.994-0700 I CONTROL  [initandlisten] 
2018-05-23T23:05:40.995-0700 I CONTROL  [initandlisten] 
2018-05-23T23:05:40.995-0700 I CONTROL  [initandlisten] ** WARNING: soft rlimits too low. Number of files is 256, should be at least 1000
2018-05-23T23:05:41.244-0700 I FTDC     [initandlisten] Initializing full-time diagnostic data capture with directory 'data/diagnostic.data'
2018-05-23T23:05:41.320-0700 I INDEX    [initandlisten] build index on: admin.system.version properties: { v: 2, key: { version: 1 }, name: "incompatible_with_version_32", ns: "admin.system.version" }
2018-05-23T23:05:41.320-0700 I INDEX    [initandlisten] 	 building index using bulk method; build may temporarily use up to 500 megabytes of RAM
2018-05-23T23:05:41.330-0700 I INDEX    [initandlisten] build index done.  scanned 0 total records. 0 secs
2018-05-23T23:05:41.330-0700 I COMMAND  [initandlisten] setting featureCompatibilityVersion to 3.4
2018-05-23T23:05:41.330-0700 I NETWORK  [thread1] waiting for connections on port 27017

Then in another window, run these commands:

$ mongo
MongoDB shell version v3.4.10
connecting to: mongodb://127.0.0.1:27017
MongoDB server version: 3.4.10
Server has startup warnings: 
2018-05-23T23:05:40.994-0700 I CONTROL  [initandlisten] 
2018-05-23T23:05:40.994-0700 I CONTROL  [initandlisten] ** WARNING: Access control is not enabled for the database.
2018-05-23T23:05:40.994-0700 I CONTROL  [initandlisten] **          Read and write access to data and configuration is unrestricted.
2018-05-23T23:05:40.994-0700 I CONTROL  [initandlisten] 
2018-05-23T23:05:40.995-0700 I CONTROL  [initandlisten] 
2018-05-23T23:05:40.995-0700 I CONTROL  [initandlisten] ** WARNING: soft rlimits too low. Number of files is 256, should be at least 1000
> use testDB
switched to db testDB
> db.myCollection.insertOne( { x: 1 } );
{
  "acknowledged" : true,
  "insertedId" : ObjectId("5b0656da1cf80c4022fb3e34")
}
> db.myCollection.find();
{ "_id" : ObjectId("5b0656da1cf80c4022fb3e34"), "x" : 1 }
> 

This connects the default MongoDB client program (the mongo shell) to the server started previously and performs a couple of operations against the database.

It's easy to kill the MongoDB instance: simply type CTRL-C in its terminal and the server exits. Deleting the database is just as easy: delete the data directory.

Setting up a program to use the MongoDB package

Adding the MongoDB package to your application is as simple as this:

$ mkdir app
$ cd app
$ npm init
... answer questions
$ npm install mongodb --save

For documentation see https://www.npmjs.com/package/mongodb and http://mongodb.github.io/node-mongodb-native

A module to interact with a MongoDB database

The code in this and following sections is adapted from code shown in my book, Node.js Web Development. We won't show complete working code in this article, but instead snippets pulled from examples in the book.

For CommonJS-style Node.js modules, use this as the starting point:

const mongodb = require('mongodb');
const MongoClient = mongodb.MongoClient;

If you instead want to use ES6 modules (which requires Node.js 10.x or later), use this:

import mongodb from 'mongodb'; 
const MongoClient = mongodb.MongoClient;

Both versions import the MongoDB package and extract the top-level MongoClient object.

var client;

async function connectDB() { 
    if (!client) client = await MongoClient.connect(process.env.MONGO_URL);
    return { 
        db: client.db(process.env.MONGO_DBNAME), 
        client: client
    };
}

This function handles connecting to the database. A module-global variable holds the client object so the connection is reused across calls. Two environment variables define the connection URL and the database name.

The environment variables might be:

MONGO_URL=mongodb://localhost:27017
MONGO_DBNAME=notes
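
If you'd rather not require the environment to be set every time, the module could fall back to defaults. A minimal sketch; the fallback values here are assumptions matching the local test instance and the notes database used in this article:

// Fallback values are assumptions matching the local test setup above
const MONGO_URL = process.env.MONGO_URL || 'mongodb://localhost:27017';
const MONGO_DBNAME = process.env.MONGO_DBNAME || 'notes';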

And we need a matching function to close the database connection:

// For an ES6 module
export async function close() {
    if (client) await client.close();
    client = undefined;
}
// For a CommonJS module
module.exports.close = async function close() {
    if (client) await client.close();
    client = undefined;
}

You might see a message printed to this effect:

(node:19997) DeprecationWarning: current URL string parser is deprecated, and will be removed in a future version. To use the new parser, pass option { useNewUrlParser: true } to MongoClient.connect.

This comes from the MongoDB driver, and is fairly self-explanatory. The fix is:

if (!client) client = await MongoClient.connect(process.env.MONGO_URL, { useNewUrlParser: true });
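
Before moving on to the CRUD functions, here is a hedged sketch of how a caller might use connectDB and close together; the main function and its output are hypothetical:

async function main() {
    const { db } = await connectDB();               // opens, or reuses, the shared client
    const notes = await db.collection('notes').find({}).toArray();
    console.log(`found ${notes.length} notes`);
    await close();                                  // release the connection when finished
}

main().catch(err => { console.error(err); process.exit(1); });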

CRUD: Create

async function C(key, title, body) {
    const { db, client } = await connectDB();
    const collection = db.collection('notes'); 
    let result = await collection.insertOne({ 
        notekey: key, title, body 
    });
    return result;
}

// For ES6 modules
export { C as create };
// For CommonJS modules
module.exports.create = C;

For these examples, the code maintains a simple collection named notes. Each document has three fields: key, title, and body. Your application would of course have other data to keep track of.

We're inserting an object literal containing those fields. MongoDB of course supports any document structure.

This returns the result object as described in the documentation: http://mongodb.github.io/node-mongodb-native/3.0/api/Collection.html#insertOne
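
As a usage sketch, a caller might inspect the acknowledgement and the generated identifier; the insertedCount and insertedId fields are part of the insertOne result described in the documentation linked above:

async function addNote() {
    const result = await create('foo', 'Foo', 'The body of the foo note');
    console.log(result.insertedCount);   // 1 if the insert was acknowledged
    console.log(result.insertedId);      // the ObjectId generated by MongoDB
}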

CRUD: Read

async function R(key) {
    const { db, client } = await connectDB();
    const collection = db.collection('notes'); 
    const doc = await collection.findOne({ notekey: key });
    return {
        key: doc.notekey, title: doc.title, body: doc.body
    };
}

// For ES6 modules
export { R as read };
// For CommonJS modules
module.exports.read = R;

Here we are retrieving (reading) a document. Because findOne returns more fields than we want to expose to the caller, we return a sanitized object.

{"_id":"5b0712884670064de8e02aa5","notekey":"foo","title":"Foo","body":"Foo"}

This is what is returned by the database driver. The _id field is an internal identifier which you may or may not want to expose to the caller.

{"key":"foo","title":"Foo","body":"Foo"}

This is what we are returning from the above function.

For documentation see: http://mongodb.github.io/node-mongodb-native/3.0/api/Collection.html#findOne
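
A short usage sketch of the read function; the note key here is hypothetical:

async function showNote() {
    const note = await read('foo');
    // note contains only the sanitized fields: key, title, body
    console.log(`${note.key}: ${note.title}`);
    console.log(note.body);
}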

CRUD: Update

async function U(key, title, body) {
    const { db, client } = await connectDB();
    const collection = db.collection('notes');
    let result = await collection.updateOne({ notekey: key },
            { $set: { title, body } });
    return result;
}

// For ES6 modules
export { U as update };
// For CommonJS modules
module.exports.update = U;

Here we are updating an object in the MongoDB database. The first argument is a filter, which matches against documents in the collection. The second argument describes the update to make.

We return the result object, which looks like this:

{"n":1,"nModified":1,"ok":1}

For documentation see: http://mongodb.github.io/node-mongodb-native/3.0/api/Collection.html#updateOne
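
As a usage sketch, note that besides the raw result shown above, the driver's updateOne result also exposes convenience counters such as matchedCount and modifiedCount (see the documentation linked above):

async function renameNote() {
    const result = await update('foo', 'Foo (updated)', 'Updated body text');
    console.log(result.matchedCount);    // how many documents matched the filter
    console.log(result.modifiedCount);   // how many documents were actually changed
}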

CRUD: Delete

async function D(key) {
    const { db, client } = await connectDB();
    const collection = db.collection('notes');
    return await collection.findOneAndDelete({ notekey: key });
}

// For ES6 modules
export { D as destroy };
// For CommonJS modules
module.exports.destroy = D;

Here we are deleting a document from the MongoDB database. The findOneAndDelete function takes the same filter argument as used previously. As the name implies, it will delete the matching document from the database.

The module function name is exported as destroy because delete is a reserved word in JavaScript.

The return object will look something like this:

{
    "lastErrorObject":{"n":1},
    "value":{"_id":"5b0712884670064de8e02aa5","notekey":"foo","title":"FooER","body":"FooER"},
    "ok":1
}

It's interesting that the delete operation's result includes the document that was deleted. This matches the documentation at http://mongodb.github.io/node-mongodb-native/3.0/api/Collection.html#findOneAndDelete
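
A short usage sketch that takes advantage of the returned document; the note key is hypothetical:

async function removeNote() {
    const result = await destroy('foo');
    if (result.value) {
        console.log(`deleted note ${result.value.notekey} (${result.value.title})`);
    } else {
        console.log('no matching note was found');
    }
}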

Conclusion

This has only scratched the surface of interacting with a MongoDB database from Node.js.

The general observation is that MongoDB driver functions can all be induced to return a Promise by leaving out the callback argument. Because they return Promises, they fit naturally into async functions, where we can simply await the results.
