
How to download an entire directory from a UNIX server with SSH2 in Node.js

Nowadays, most UNIX-based systems come with several basic backup tools, including dd, cpio, tar, and dump. If the backup software that ships with your server does not meet your needs, there are plenty of backup options on the internet that you may want to check out. However, you can still be an old-fashioned (perhaps more cautious) programmer and create a home-made backup with Node.js by downloading an entire folder from your web server and storing it on a hard drive at your home or office.

Requirements

In this article, we won't download every single file and folder inside a directory individually, primarily because that would increase the processing time of the transfer, and because it's pretty tricky and painful. To keep the backup as simple as possible, we are going to compress the folder directly on the remote server. For this, we will need:

  • The command line utility tar for collecting many files into one archive file (available by default on any UNIX based server).

Note

The script can work on Windows servers too; however, the tar utility must be available, and instead of redirecting the error output to 2>/dev/null, you should redirect it to NUL instead.

Obviously you need Node.js installed, and some patience. Let's get started!

1. Install tar-fs and ssh2 in your Node.js project

Open a new terminal on your computer and switch to the directory of your project. Then install the SSH2 module with the following command:

npm install ssh2

If you need more information about the SSH2 module, please visit the official repository on GitHub.

Once the SSH2 module is installed, you will also need the tar-fs module, which allows you to pack directories into tarballs and extract tarballs into directories. Install it with the following command:

npm install tar-fs

If you need more information about the tar-fs module, please visit the official repository on GitHub.

After installing the modules, you can write the code that downloads an entire directory from your server as a local backup.

2. Create the transfer function

To test the script, create a demo file named backup.js and save the following script inside. The function works like this: using the connection object from the SSH2 library, a command is executed in the remote shell (something like tar cf - /folder/to/download 2>/dev/null or, if you use compression, tar cf - /folder/to/download 2>/dev/null | gzip -6c 2>/dev/null). This command streams the archived files and folders back over the connection; the 2>/dev/null part redirects the standard error output (stderr) to /dev/null, which discards it (/dev/null is treated as a black hole on Linux/Unix). With the help of the tar-fs module, you can then extract the streamed data into a local directory.
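As a quick sketch of how that command string is assembled, consider the following helper (the function name and path are hypothetical, purely for illustration):

```javascript
// Builds the remote command described above: plain tar, optionally piped
// through gzip when a compression level between 1 and 9 is given.
function buildBackupCommand(remotePath, compression) {
    var cmd = 'tar cf - "' + remotePath + '" 2>/dev/null';

    if (typeof compression === 'number' && compression >= 1 && compression <= 9) {
        cmd += ' | gzip -' + compression + 'c 2>/dev/null';
    }

    return cmd;
}

console.log(buildBackupCommand('/folder/to/download', 6));
// → tar cf - "/folder/to/download" 2>/dev/null | gzip -6c 2>/dev/null
```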

Note that you need to require the tar-fs and zlib modules first:

var tar = require('tar-fs');
var zlib = require('zlib');

/**
 * Transfers an entire remote directory by compressing it remotely, then downloading and extracting it locally.
 * 
 * @param {SSH} conn An SSH connection of the ssh2 library
 * @param {String} remotePath The remote directory to download
 * @param {String} localPath The local directory to extract the files into
 * @param {Integer|Boolean} compression Gzip compression level (1-9), true for the default level 6, or omit for no compression
 * @param {Function} cb Callback executed once the transfer finishes (success or error)
 * @see http://stackoverflow.com/questions/23935283/transfer-entire-directory-using-ssh2-in-nodejs
 */
function transferDirectory(conn, remotePath, localPath, compression, cb) {
    var cmd = 'tar cf - "' + remotePath + '" 2>/dev/null';

    // Allow the compression argument to be omitted
    if (typeof compression === 'function'){
        cb = compression;
    }else if (compression === true){
        compression = 6;
    }

    // Apply compression if desired (gzip levels 1-9)
    if (typeof compression === 'number' && compression >= 1 && compression <= 9){
        cmd += ' | gzip -' + compression + 'c 2>/dev/null';
    }else{
        compression = undefined;
    }

    conn.exec(cmd, function (err, stream) {
        if (err){
            return cb(err);
        }

        var exitErr;

        // Extract the incoming tar stream into the local directory
        var tarStream = tar.extract(localPath);

        tarStream.on('finish', function () {
            cb(exitErr);
        });

        stream.on('exit', function (code, signal) {
            if (typeof code === 'number' && code !== 0){
                exitErr = new Error('Remote process exited with code ' + code);
            }else if (signal){
                exitErr = new Error('Remote process killed with signal ' + signal);
            }
        // Consume stderr so the remote stream is not blocked by unread data
        }).stderr.resume();

        // Decompress the stream first if gzip was applied remotely
        if (compression){
            stream = stream.pipe(zlib.createGunzip());
        }

        stream.pipe(tarStream);
    });
}

With this single snippet, you are ready to download a directory.

3. Download a directory

To start the backup of an entire directory of your server, you will need to create a new SSH connection with the ssh2 module. Require the module and create a new instance, then configure the settings object (credentials, URL of your server, etc.) and add the ready listener. Inside its callback, execute the transfer function created in the previous step. Set the parameters as required and check your script:

var SSH = require('ssh2');

var conn = new SSH();

var connectionSettings = {
    // The host URL
    host: 'your.server.url.com',
    // The port, usually 22
    port: 22,
    // Credentials
    username: 'root',
    password: '*******'
};

conn.on('ready', function () {
    // Use the transfer function
    transferDirectory(
        // The SSH2 connection
        conn,
        // The remote folder of your unix server that you want to back up
        '/var/www/vhosts/yourproject.com/some-folder-to-backup',
        // Local path where the files should be saved
        __dirname + '/backup',
        // Define a compression level (true for the default level 6, or a number from 1 to 9)
        true,
        // A callback executed once the transfer finishes
        function (err) {
            if (err){
                throw err;
            }

            console.log('Remote directory successfully downloaded!');

            // Finish the connection
            conn.end();
        }
    );
}).connect(connectionSettings);
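If your server uses key-based authentication instead of a password, the ssh2 module accepts a privateKey buffer in the same settings object. This is a config sketch only (the key path is hypothetical; adjust it to your machine):

```javascript
var fs = require('fs');

// Same shape as connectionSettings above, but authenticating with a
// private key instead of a password (hypothetical key path):
var keyBasedSettings = {
    host: 'your.server.url.com',
    port: 22,
    username: 'root',
    privateKey: fs.readFileSync('/home/youruser/.ssh/id_rsa')
};
```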

Then save the changes on your file and execute the script with:

node backup.js

Once the script finishes, you should find a new folder (backup) in the same directory as the script. This script is really useful if you want to create backups of your projects, especially of user-uploaded files in projects like WordPress or other kinds of CMS.

Happy coding!
