New and potential ES2019 JavaScript features every developer should be excited about

An overview of new ES2019 JavaScript features
JavaScript has come a long way since its early days, with many new additions and features designed specifically to make the language more user-friendly and less verbose. Below are some recent additions to JavaScript that I find fascinating.
Some of these features are already available in Node, Chrome, Firefox, and Safari, while others are still in the proposal stage.

Optional chaining

Optional chaining is done using the ?. operator. It ensures that the value before the question mark is neither undefined nor null before the property after it is accessed. This is really useful when accessing the properties of deeply nested objects, where any value along the chain might be missing.
Consider the following example:
const users = [
  {
   name: "Olagunju Gbolahan",
   occupation: "Software Developer",
   sayName(){
    console.log(`my name is ${this.name}`);
   },
   address: { office: "New York" }
  },
  { name: "Olawaseyi Moses" },
  { name: "Tunde Ednut" }
];
Let’s consider the second user in the array of users:
const secondUser = users[1];
We may want to get the office address of this user. Before the advent of the optional chaining operator, we would have to write a relatively verbose check to obtain this information:
const theAddress = secondUser.address && secondUser.address.office;
console.log(theAddress); // undefined
If we had a deeply nested object, we would have to check that a value existed at every level using the && operator.
But with the optional chaining, we simply do the following:
const theAddress = secondUser?.address?.office;
console.log(theAddress); // undefined
We can also use optional chaining with object methods to confirm they exist before execution:
const firstUser = users[0];
firstUser.sayName?.(); // my name is Olagunju Gbolahan
It will simply return undefined if a method with the given name doesn’t exist on the object.
console.log(firstUser.sayOccupation?.()); // undefined
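Optional chaining also works with bracket notation, which is handy for dynamic property names and array indexes. Here is a quick sketch reusing the users array from above (the key variable is just for illustration):
const key = "office";
console.log(secondUser?.address?.[key]); // undefined
console.log(users?.[5]?.name); // undefined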
The optional chaining operator is still at the proposal stage and hasn't yet been added to the JavaScript specification.
You can use it today with the @babel/plugin-proposal-optional-chaining Babel plugin.
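As a minimal sketch, assuming a Babel 7 setup with the plugin installed as a dev dependency, enabling it could look like this:
// babel.config.js (illustrative; assumes Babel 7 and the plugin are installed)
module.exports = {
  plugins: ["@babel/plugin-proposal-optional-chaining"]
};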

Optional catch binding

This feature comes in handy when we already know how we want to handle an error and don't want the redundancy of an unused error variable.
Consider the traditional try and catch block:
const obj = "{ not valid json }"; // sample input that will fail to parse
try {
  const parsedJsonData = JSON.parse(obj);
} catch (error) {
  // the error variable has to be declared whether it is used or not
  console.log(obj);
}
But with the addition of optional catch binding, we don’t have to provide unused variables — especially when we have defaults for our try block.
function getName () {
  let name = "Gbolahan Olagunju";
  try {
    // this throws, so name keeps its default value
    name = obj.details.name;
  } catch {
    // no error binding is needed
  }
  console.log(name);
}
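As another illustration, here is a hypothetical helper that uses optional catch binding to check whether a string is valid JSON without ever touching the error object:
function isValidJson(text) {
  try {
    JSON.parse(text);
    return true;
  } catch {
    // the parse failed; we never need the error itself
    return false;
  }
}
console.log(isValidJson('{"valid": true}')); // true
console.log(isValidJson('not json')); // false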

The pipeline operator

This is one of the proposed additions to JavaScript, and it is currently at stage 1.
It essentially makes a series of function calls on the same value more readable.
It does this by piping the value of an expression into a function as its argument. Consider calling the following functions without the pipeline operator |>:
const capitalize = (input) =>  input[0].toUpperCase() + input.substring(1);
const removeSpaces = (input) => input.trim();
const repeat = (input) => `${input}, ${input}`;
const withoutpipe = repeat(capitalize(removeSpaces('    i am gbols    ')));
console.log(withoutpipe); // I am gbols, I am gbols
But with the pipeline operator, readability can be greatly improved:
const withpipe = '    i am gbols    '
                |> removeSpaces
                |> capitalize
                |> repeat;
console.log(withpipe); // I am gbols, I am gbols
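The pipeline operator can't be run natively yet, but as a rough approximation (not part of the proposal itself), a small pipe helper reusing the functions above gives a similar left-to-right reading order today:
const pipe = (...fns) => (input) => fns.reduce((acc, fn) => fn(acc), input);
const withPipeHelper = pipe(removeSpaces, capitalize, repeat)('    i am gbols    ');
console.log(withPipeHelper); // I am gbols, I am gbols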

String.trimStart and String.trimEnd

These methods were formerly named trimLeft and trimRight, but with ES2019 they were renamed trimStart and trimEnd to make them more intuitive to users; the old names remain as aliases.
Consider the following example:
let message = "     Welcome to LogRocket      ";
message.trimStart(); // "Welcome to LogRocket      "
message.trimEnd(); // "Welcome to LogRocket";
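Note that both methods return a new string rather than modifying the original, so the result has to be captured if you want to keep it:
const trimmed = message.trimStart().trimEnd();
console.log(trimmed); // "Welcome to LogRocket"
console.log(message); // "     Welcome to LogRocket      "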

Object.fromEntries

Before talking about Object.fromEntries, it is important to talk about Object.entries.
The Object.entries method was added in the ES2017 specification to convert an object into an array of its [key, value] pairs, which gives us access to all of the array methods for processing.
Consider the following object:
const devs = {
  gbols: 5,
  andrew: 3,
  kelani: 10,
  dafe: 8,
};
const arrOfDevs = Object.entries(devs);
console.log(arrOfDevs);
//[
//  ["gbols", 5]
//  ["andrew", 3]
//  ["kelani", 10]
//  ["dafe", 8]
//]
Now we can use the filter method on the resulting array to get the devs that have more than five years of experience:
const expDevs = arrOfDevs.filter(([name, yrsOfExp]) => yrsOfExp > 5);
console.log(expDevs);
//[
//  ["kelani", 10]
//  ["dafe", 8]
//]
Then a problem arises: there is no easy way to convert the results back into an object. Usually, we would write our own code to transform this back into an object:
const expDevsObj = {};
for (let [name, yrsOfExp] of expDevs) {
  expDevsObj[name] = yrsOfExp;
}
console.log(expDevsObj);
//{
 //dafe: 8
 //kelani: 10
//}
But with the introduction of Object.fromEntries, we can do this in a single step:
console.log(Object.fromEntries(expDevs));
//{
 //dafe: 8
 //kelani: 10
//}
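Object.fromEntries accepts any iterable of [key, value] pairs, not just arrays, so it also works with a Map, for example:
const devMap = new Map([["gbols", 5], ["kelani", 10]]);
console.log(Object.fromEntries(devMap)); // { gbols: 5, kelani: 10 }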

Flat

Oftentimes, we have deeply nested arrays to deal with as a result of an API call. In these cases, flattening the array makes the data much easier to work with.
Consider the following example:
const developers = [
  {
    name: 'Gbolahan Olagunju',
    yrsOfExp: 6,
    stacks: ['Javascript', 'NodeJs', ['ReactJs', ['ExpressJs', 'PostgresSql']]]
  },
  {
    name: 'Daniel Show',
    yrsOfExp: 2,
    stacks: ['Ruby', 'Jest', ['Rails', ['JQuery', 'MySql']]]
  },
  {
    name: 'Edafe Emunotor',
    yrsOfExp: 9,
    stacks: ['PHP', 'Lumen', ['Angular', 'NgRx']]
  }
];
const allStacks = developers.map(({stacks}) => stacks);
console.log(allStacks);
// [
// ['Javascript', 'NodeJs', ['ReactJs', ['ExpressJs', 'PostgresSql']]]
// ['Ruby', 'Jest', ['Rails', ['JQuery', 'MySql']]]
// ['PHP', 'Lumen', ['Angular', 'NgRx']]
// ]
The allStacks variable contains deeply nested arrays. To flatten it, we can use Array.prototype.flat.
Here’s how:
const flatSingle = allStacks.flat();
console.log(flatSingle);
//[
// "Javascript",
// "NodeJs",
// ["ReactJs", ["ExpressJs", "PostgresSql"]],
// "Ruby",
// "Jest",
// ["Rails", ["JQuery", "MySql"]],
// "PHP",
// "Lumen",
// ["Angular", "NgRx"]
//]
We can deduce from the above that the array has been flattened one level deep, which is the default for Array.prototype.flat.
We can pass an argument to the flat method to specify how deep it should flatten.
The default argument is 1. To completely flatten the array, we can pass Infinity, which flattens the array regardless of how deeply it is nested.
Here’s how:
const completelyFlat = allStacks.flat(Infinity);
console.log(completelyFlat);
//[
// "JavaScript",
// "NodeJs",
// "ReactJs",
// "ExpressJs",
// "PostgresSql",
// "Ruby",
// "Jest",
// "Rails",
// "JQuery",
// "MySql",
// "PHP",
// "Lumen",
// "Angular",
// "NgRx"
//]

FlatMap

flatMap combines calling the map method and then the flat method with a depth of 1. It is often quite useful because it does both in a single, more efficient pass.
Below is a simple example of using both map and flatMap:
let arr = ['my name is Gbols', ' ', 'and i am great developer']; 
console.log(arr.map(word => word.split(' ')));
//[
// ["my", "name", "is", "Gbols"],
// ["", ""],
// ["and", "i", "am", "great", "developer"]
//]
console.log(arr.flatMap(word => word.split(' ')));
//[ "my"
//  "name"
//  "is"
//  "Gbols"
//   ""
//   ""
//   "and"
//   "i"
//   "am"
//   "great"
//   "developer"
//]
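Because the callback can return an array of any length, flatMap can also add or drop items in a single pass. A small illustrative sketch:
const nums = [1, 2, 3, 4];
// keep even numbers and add ten times their value; drop odd numbers entirely
console.log(nums.flatMap(n => (n % 2 === 0 ? [n, n * 10] : [])));
// [2, 20, 4, 40]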
