How To Run Single Command On Multiple Remote Systems At Once

PSSH, or Parallel SSH, is a command-line suite that lets you run SSH commands on a number of hosts in parallel. The PSSH suite consists of the following commands:

  • pssh – SSH to multiple remote systems in parallel,
  • pscp – Copy files in parallel to a number of hosts,
  • prsync – Copy files in parallel to a number of hosts using rsync,
  • pnuke – Kill processes in parallel on a number of hosts,
  • pslurp – Copy files in parallel from a number of hosts.
In this tutorial, we will see how to execute a single command on multiple hosts at once using PSSH.

Install PSSH

We can easily install PSSH using pip, the Python package manager.

To install pip on Arch Linux and its derivatives, run:

$ sudo pacman -S python-pip

On RHEL, Fedora, CentOS:

$ sudo yum install epel-release
$ sudo yum install python-pip

Or,

$ sudo dnf install epel-release
$ sudo dnf install python-pip

On Debian, Ubuntu, Linux Mint:

$ sudo apt-get install python-pip

For more details about managing Python packages using pip, refer to the following link.

Once pip is installed, run the following command to install PSSH:

$ sudo pip install pssh

PSSH has been installed! Let us go ahead and see how to use it.

Run Single Command On Multiple Remote Systems At Once Using PSSH

Important: In order to use PSSH (for the purpose of this tutorial only), all your remote systems must have a common username with the same password. Otherwise, this method won't work. For example, I have already created a user named sk with the password ostechnix on all my remote hosts. You should have the same user with the same password on all your remote systems as well.

Now, let us see how to run a single command on multiple remote hosts using PSSH. Go to your local system where you want to run the command and create a text file called remotehosts.txt. You can name it as you wish.

$ vi remotehosts.txt

Add the IP addresses of your remote hosts with port numbers, one per line, exactly as shown below.

192.168.1.103:22
192.168.1.104:22

Here, 192.168.1.103 and 192.168.1.104 are the IP addresses of my remote systems, and 22 is the SSH port number. If you have changed the SSH port on a host, mention the correct port number for it. Also, make sure you can access all remote hosts from your local system via SSH.
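As a quick sanity check before running any parallel command, you can create the hosts file non-interactively and verify that every entry follows the expected IP:PORT format. This is just a sketch, assuming the file lives in your current directory:

```shell
# Create the hosts file non-interactively (same entries as above)
printf '%s\n' '192.168.1.103:22' '192.168.1.104:22' > remotehosts.txt

# Flag any line that does not match the IP:PORT format
if grep -Evq '^[0-9]{1,3}(\.[0-9]{1,3}){3}:[0-9]+$' remotehosts.txt; then
    echo "Malformed entries found"
else
    echo "Hosts file looks OK"
fi
```

A malformed line (for example, a missing port) would be caught here instead of causing a confusing failure later.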

Now, let us check the uptime of both remote hosts from our local system. To do so, run:

$ pssh -h remotehosts.txt -l sk -A -i "uptime"

Here,

  • remotehosts.txt – Contains the IP addresses of both remote systems,
  • sk – The username on both remote systems,
  • -A – Prompts for the password,
  • -i – Displays each host's output inline as it completes.

Enter the password of the user “sk”.

Sample output:

Warning: do not enter your password if anyone else has superuser
privileges or access to your account.
Password: 
[1] 20:51:15 [SUCCESS] 192.168.1.103:22
 20:50:50 up 8 min, 1 user, load average: 0.05, 0.11, 0.10
[2] 20:51:15 [SUCCESS] 192.168.1.104:22
 20:50:52 up 12 min, 1 user, load average: 0.00, 0.07, 0.12

As you can see above, we have run the "uptime" command on two remote hosts and got the results in one go.
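The [SUCCESS] tag in each result line makes this output easy to post-process. As a small illustration, here is how you could count the hosts that succeeded; the log below is simulated from the sample output above, since the real output depends on your hosts:

```shell
# Simulate captured pssh output (in practice: pssh ... > pssh.log 2>&1)
cat > pssh.log <<'EOF'
[1] 20:51:15 [SUCCESS] 192.168.1.103:22
[2] 20:51:15 [SUCCESS] 192.168.1.104:22
EOF

# Count the hosts that completed successfully
grep -c '\[SUCCESS\]' pssh.log   # prints 2
```

The same approach works for spotting failures: pssh marks those lines with [FAILURE], so a `grep '\[FAILURE\]'` over the saved log shows exactly which hosts need attention.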


What about the kernel version? To check the installed kernel version on both remote hosts, run:

$ pssh -h remotehosts.txt -l sk -A -i "uname -r"

Sample output:

Warning: do not enter your password if anyone else has superuser
privileges or access to your account.
Password: 
[1] 20:53:09 [SUCCESS] 192.168.1.103:22
3.10.0-327.22.2.el7.x86_64
[2] 20:53:09 [SUCCESS] 192.168.1.104:22
4.4.0-21-generic

Very cool, isn't it? Can we create a directory on both remote hosts at once? Yes, of course! To do so, run the following command:

$ pssh -h remotehosts.txt -l sk -A -i "mkdir dir1"
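One thing worth noting: plain mkdir fails if the directory already exists, so re-running the command above would report an error on every host. Passing -p makes the command safe to repeat, the same as on a single machine; the sketch below demonstrates this locally:

```shell
# mkdir -p succeeds even when the directory already exists,
# so a pssh run using it can be repeated without errors
mkdir -p dir1
mkdir -p dir1 && echo "safe to re-run"
```

So `pssh -h remotehosts.txt -l sk -A -i "mkdir -p dir1"` is the variant you can safely run more than once.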

Similarly, you can do anything you want to do on multiple remote hosts from your local system using PSSH.

Important: Please be very careful while using PSSH. One bad command will be executed simultaneously on multiple hosts and could damage all of them. So, be very careful when using this method in production. I suggest you test it in virtual machines first. Once you're familiar with PSSH, you can use it in production if you like.
