Easily deploy files or directory hierarchies to a server using Grunt

By: David Herron; Date: 2015-03-03 12:36

Tags: Node.JS » Web Development

Something we geeks need to do all the time is deploy files between machines. For example, deploying a directory hierarchy over to a server for staging or production use. There's a ton of ways to do this. The old-school way is a shell script with carefully crafted rsync commands. In my case I build websites using AkashaCMS (akashacms.com) and need to deploy them to the destination webserver. Until now I'd added a command to AkashaCMS solely for deployment, but I'm experimenting with Grunt to see how much can be done using the Grunt ecosystem rather than having to maintain code in AkashaCMS.

In AkashaCMS I simply used the Node.js spawn function to run an rsync command with selected command line options, some of which came out of the config file:

$ rsync --archive --delete --verbose --compress localDir/ user@remotehost.com:remoteDir/

One thing I'd like to avoid is that this creates a dependency on Unix/Linux/MacOSX systems that have rsync. Those poor people who have to suffer through using Windows don't have rsync. Let's all feel sorry for them.

Unfortunately the solution I've ended up with doesn't solve the rsync-on-windows problem.


The first tool I tried, grunt-sftp-deploy, could possibly solve that problem, but I wasn't able to work out how to get it to run at a decent speed. But here goes anyway. In your Gruntfile.js load the grunt-sftp-deploy plugin like so:

grunt.loadNpmTasks('grunt-sftp-deploy');
And make sure it's installed

$ npm install grunt-sftp-deploy

And/or add it to your package.json to make sure the plugin is always installed.
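A minimal package.json sketch for this is shown below; the version numbers are illustrative, not prescriptive:

```json
{
  "name": "my-site",
  "version": "1.0.0",
  "devDependencies": {
    "grunt": "^0.4.5",
    "grunt-sftp-deploy": "^0.2.0"
  }
}
```

With this in place, a plain `npm install` in a fresh checkout pulls in the plugin automatically.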

The next step of course is to configure the plugin.

module.exports = function(grunt) {

    grunt.initConfig({
        'sftp-deploy': {
            deploy: {
                auth: {
                    host: 'example.com',
                    port: 22,
                    authKey: process.env.HOME +"/.sftp-deploy-example.txt"
                },
                cache: 'sftpCache.json',
                src: 'source-directory-name',
                dest: 'destination-directory-name',  // this is on the remote host
                exclusions: [],
                serverSep: '/',
                concurrency: 4,
                progress: true
            }
        }
    });
};
This sets up copying from a local directory to one on a remote host. The remote host name is in the host parameter, the remote directory name is in the dest parameter, and so on. That part is pretty straight-forward.

The tricky part is the authentication. Since the Gruntfile gets checked into source control, it's a really bad idea to put user names or passwords in it. Fortunately Grunt makes it easy to read data in from external files, which you hopefully don't put under source control.

In this case the grunt-sftp-deploy plugin uses the authKey parameter in several ways, one of which is as the name of a file containing a JSON object. It should look like so:

{ "username": "user-name-on-server" }

Then grunt-sftp-deploy also automatically looks for local SSH keys to use for passwordless authentication.

This works pretty slick, and you can then type this command:

$ grunt sftp-deploy:deploy

The only problem is that it takes a long time because it appears to upload every last file on every run. Supposedly the cache parameter names a file used to keep data that helps avoid uploading files which haven't changed. But that didn't work for me.

grunt-ssh, with .zip file

My next thought was to build a .zip file of the directory hierarchy I wanted to deploy, then use the grunt-ssh plugin to upload it, then ssh over a command to unpack the .zip file. This didn't work out because of the time involved in copying the .zip file. But let's take a look at this anyway.

To create the .zip file is something like this (using the archiver module):

var fs = require('fs');
var archiver = require('archiver');

module.exports.zipRenderedSite = function(config, done) {

    var archive = archiver('zip');

    var output = fs.createWriteStream(config.root_out +'.zip');

    output.on('close', function() {
        logger.info(archive.pointer() + ' total bytes');
        logger.info('archiver has been finalized and the output file descriptor has closed.');
        done();
    });

    archive.on('error', function(err) {
        done(err);
    });

    archive.pipe(output);
    archive.directory(config.root_out, ".");
    archive.finalize();
};

That was easy.
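To wire that function into the Grunt workflow, one approach is a custom asynchronous task. This is a sketch only; the `./zip-site` module path and the `root_out` value are assumptions, not from the original setup:

```javascript
// Gruntfile.js -- hypothetical wiring; './zip-site' and 'out' are assumptions
var zipSite = require('./zip-site');

module.exports = function(grunt) {
    grunt.registerTask('zipSite', 'Zip the rendered site', function() {
        var done = this.async();  // mark this Grunt task as asynchronous
        zipSite.zipRenderedSite({ root_out: 'out' }, function(err) {
            if (err) grunt.fail.warn(err);
            done();
        });
    });
};
```

Then `grunt zipSite` produces the .zip file before the upload step runs.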

Then in the Gruntfile.js, load the plugin tasks:

grunt.loadNpmTasks('grunt-ssh');

grunt.initConfig({
    deployData: grunt.file.readJSON(process.env.HOME +'/.sftp-deploy-example.json'),
    sftp: {
        deployZip: {
            files: { "./": config.root_out +'.zip' },
            options: {
                host: '<%= deployData.host %>',
                path: '<%= deployData.path %>',
                username: '<%= deployData.username %>',
                privateKey: grunt.file.read(process.env.HOME + "/.ssh/id_rsa"),
                showProgress: true
            }
        }
    }
});

This again uses a JSON file in the home directory to store info that shouldn't be checked into source code control. The SSH private key is read directly.

This works without any external dependencies. I was going to take the next step of unpacking the .zip file on the remote host, but it took so long to upload that file (over 200MB) that this too was a non-starter.
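For completeness, the unpacking step could have used the sshexec task that ships in the same grunt-ssh plugin. A sketch, assuming the remote directory and zip file name shown here (both hypothetical):

```javascript
// sshexec task from grunt-ssh; the remote path and file name are assumptions
sshexec: {
    unpackZip: {
        command: 'cd /var/www/example.com && unzip -o site.zip',
        options: {
            host: '<%= deployData.host %>',
            username: '<%= deployData.username %>',
            privateKey: grunt.file.read(process.env.HOME + "/.ssh/id_rsa")
        }
    }
}
```

Running `grunt sftp:deployZip sshexec:unpackZip` would then upload and unpack in one go.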


That left using an rsync wrapper, since I was running out of options that use a pure JavaScript SSH2 implementation.

grunt.loadNpmTasks('grunt-rsync');

This loads the grunt-rsync plugin.

rsync: {
    deploySite: {
        options: {
            args: [ '--verbose', '--archive', '--delete', '--compress' ],
            src: config.root_out +'/',
            dest: "example.com/",
            host: "remote-user-name@example.com"
        }
    }
}

This is a simple wrapper around the rsync command. The args array contains the exact same arguments you'd pass on the command line. The src and dest directories are exactly as you'd write them on the command line. You have to take the same precise care about when and where to put a trailing / on the directory names, just as with rsync itself.

With this, deployment is now this easy:

$ grunt rsync:deploySite