
async/await is the wrong abstraction

There is no denying that the async...await pattern is super simple and has simplified asynchronous programming for developers more accustomed to server-side programming, who feel a little insecure and scared without the comfort blanket of a try...catch block.

Our conscious mind, or left brain, operates in what can be thought of as an abstraction of reality. The universe is an infinite series of events happening simultaneously that our conscious mind cannot grasp; it thinks sequentially or linearly, and we process one thought at a time.

What we are trying to do with async...await is to ignore reality and have these async operations appear to be happening synchronously. Escaping reality in this fashion is all great until it's not.

Every so often I see a tweet from someone at the moment they realize that async...await is fundamentally flawed, for reasons that this post will explain if you have not discovered them yourself.


The first time I got hit by this realization was when I was working on a feature that allowed users to upload large video files into Azure blob storage. Because these files were large, they had to be split into separate chunks, and I was using async...await in a for...of loop. Then came the requirement that a user should be able to cancel the upload halfway through. It was at that moment that this magical, almost synchronous-looking code block was no longer fit for purpose.
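
To make that concrete, here is a minimal sketch of the kind of loop I had, with hypothetical splitIntoChunks and uploadChunk helpers standing in for the real Azure blob storage calls:

async function uploadVideo(file) {
  // each chunk is awaited in turn; once we are suspended inside an await
  // there is no built-in way for the caller to stop the loop
  for (const chunk of splitIntoChunks(file)) {
    await uploadChunk(chunk);
  }
}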

Cancelling a promise chain

There is no getting around it: there is absolutely nothing in async...await to support cancellation. Below is a simple example of a dependent call chain:

async function updatePersonalCircumstances(token) {
  const data = await fetchData();
  const userData = await updateUserData(data);
  const userAddress = await updateUserAddress(userData);
  const financialStatus = await updateFinancialStatus(userAddress);
  
  return financialStatus;
}

const token = {};
const promise = updatePersonalCircumstances(token);

Here we have a classic promise chain with each call awaiting on the last. What if we want to cancel at updateUserAddress and not call updateFinancialStatus?

Now we have arrived at the point of the piece. Are you sitting comfortably? Then let me spell it out:

Once you go into an await call, you never come out unless the underlying promise either resolves or rejects.
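
To illustrate the point, here is a minimal, contrived example: the await below never completes, and nothing outside the function can force it to give up.

async function stuck() {
  // this promise never settles, so execution never moves past the await
  await new Promise(() => {});
  console.log('unreachable');
}

stuck(); // the caller has no handle with which to abort the suspended function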

A half baked solution

The only way that this chain can get cancelled is to wrap every single async...await call like this:

async function updatePersonalCircumstances(token) {
  let cancelled = false;

  // we can't reject, since we don't have access to
  // the returned promise
  token.cancel = () => {
    cancelled = true;
  };

  const data = await wrapWithCancel(fetchData)();
  const userData = await wrapWithCancel(updateUserData)(data);
  const userAddress = await wrapWithCancel(updateUserAddress)(userData);
  const financialStatus = await wrapWithCancel(updateFinancialStatus)(userAddress);

  // we check to see if a cancellation has happened
  if (cancelled) {
    throw { reason: 'cancelled' };
  }

  return financialStatus;

  function wrapWithCancel(fn) {
    return data => {
      if (!cancelled) {
        return fn(data);
      }
    }
  }
}

const token = {};
const promise = updatePersonalCircumstances(token);

token.cancel(); // abort!!!

Unfortunately, we need to check at every call whether there has been a cancellation, and we have pushed full responsibility onto the consumer of this function to do the right thing.
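
The consumer also has to remember to handle the { reason: 'cancelled' } object that the wrapped function throws; a sketch of what that looks like:

const token = {};

updatePersonalCircumstances(token)
  .then(financialStatus => {
    // the happy path: every step ran to completion
  })
  .catch(err => {
    if (err && err.reason === 'cancelled') {
      // swallow the cancellation we triggered ourselves
      return;
    }
    throw err; // a genuine failure
  });

token.cancel(); // abort!!!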

The generator renaissance

When I first encountered this problem, I was working on an Angular project which has a dependency on RxJS. RxJS observables have first-class support for cancellation. The problem with RxJS is the difficulty of getting up to speed with it; it is vast. I have forgotten most of what I have learned about RxJS observables, but they were a really good fit for cancellation. If only JavaScript had native support for cancellation. Well, it sort of does.
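
For comparison, here is a rough sketch (assuming RxJS 6-style operator imports) of the same dependent chain as an observable pipeline; unsubscribing tears the chain down so the remaining steps never run:

import { from } from 'rxjs';
import { concatMap } from 'rxjs/operators';

const subscription = from(fetchData()).pipe(
  concatMap(data => updateUserData(data)),
  concatMap(userData => updateUserAddress(userData)),
  concatMap(userAddress => updateFinancialStatus(userAddress))
).subscribe(financialStatus => {
  // all four steps completed
});

subscription.unsubscribe(); // cancel: the later steps are never invoked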

I have recently discovered effection.js which came into being to cure this problem but has since pushed the boundaries of what is possible with generators.

With generators, you can return immediately or discard the generator if you want to cancel. With async...await, the function is effectively a black box with no such convenience.
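
A toy example of the difference: a generator hands its caller an object that can be shut down from the outside at any yield point.

function* steps() {
  yield 'one';
  yield 'two';
  yield 'three';
}

const gen = steps();
console.log(gen.next());   // { value: 'one', done: false }
gen.return('stopped');     // finishes the generator early
console.log(gen.next());   // { value: undefined, done: true }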

Below is a better solution to canceling the promise chain:

function runner(fn, ...args) {
  const gen = fn(...args);
  let cancelled, cancel;
  const promise = new Promise((resolve, reject) => {
    cancel = () => {
      cancelled = true;
      reject({ reason: 'cancelled' });
    };

    onFulfilled();

    function onFulfilled(res) {
      if (!cancelled) {
        let result;
        try {
          result = gen.next(res);
        } catch (e) {
          return reject(e);
        }
        next(result);
        return null;
      }
    }

    function onRejected(err) {
      let result;
      try {
        result = gen.throw(err);
      } catch (e) {
        return reject(e);
      }
      next(result);
    }

    function next({ done, value }) {
      if (done) {
        return resolve(value);
      }
      return value.then(onFulfilled, onRejected);
    }
  });
  
  return { promise, cancel };
}

function* updatePersonalCircumstances() {
  const data = yield fetchData();
  const userData = yield updateUserData(data);
  const userAddress = yield updateUserAddress(userData);
  const financialStatus = yield updateFinancialStatus(userAddress);
  
  return financialStatus;
}

const { promise, cancel } = runner(updatePersonalCircumstances);

// cancel baby!!!
cancel();

The above code is a basic implementation of a more thorough example I link to at the end of this post. The key is the cancel function:

cancel = () => {
  cancelled = true;
  reject({ reason: 'cancelled' });
};

Calling cancel rejects the promise, but the key to making this cancelable is the fact that the generator function is always in play. We could use the generator's throw function as an abort signal to indicate a cancellation, or we could even use the generator's return function to stop executing the promise chain.

The point I am making here is that the generator is always in play throughout the calling sequence and there is no such convenience in async...await.
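
As a sketch of that alternative, assuming the same gen, cancelled and reject variables from the runner above, cancel could shut the generator down directly:

cancel = () => {
  cancelled = true;
  // finish the generator early so no further yields are processed
  gen.return(undefined);
  reject({ reason: 'cancelled' });
};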

Generators in the real world

I have created this more involved CodeSandbox which wraps this functionality into a React hook. I have also used xstate to indicate the various state changes in an async request. Using a finite state machine gives the code a better abstraction to cling to and is superior to a pseudo-blocking paradigm with obvious limitations, namely the villain of this article, async...await.
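
As a rough idea of the shape of such a hook (not the CodeSandbox code itself, and without the xstate machine), here is a minimal sketch of a hypothetical useGeneratorRunner hook that wraps the runner above so that unmounting cancels the chain:

import { useEffect, useState } from 'react';

function useGeneratorRunner(generatorFn) {
  const [status, setStatus] = useState('loading');

  useEffect(() => {
    let active = true;
    const { promise, cancel } = runner(generatorFn);

    promise
      .then(() => active && setStatus('success'))
      .catch(err => {
        if (active) {
          setStatus(err && err.reason === 'cancelled' ? 'cancelled' : 'error');
        }
      });

    // unmounting the component cancels the whole chain
    return () => {
      active = false;
      cancel();
    };
  }, [generatorFn]);

  return status;
}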

effection.js

I want to thank the Frontside people for opening my eyes to the unmined gold that is JavaScript generators. The sky is the limit, and they can be used in any conceivable environment, such as build tooling:

import { createConnection, Connection, ConnectionConfig } from 'mysql';
import { spawn, timeout, Operation } from 'effection';
import { main } from '@effection/node';

import { Deferred } from './deferred';

main(function* prepare(): Operation<void> {

  let connection: Connection = yield function* getConnection(): Operation<Connection> {
    // asynchronously wait for 10s and then raise an exception.
    // if a connection is created before the timeout, then this
    // operation will be cancelled automatically because the enclosing
    // operation returned.
    yield spawn(function*(): Operation<void> {
      yield timeout(10000);
      throw new Error('timed out waiting 10s for mysql connection');
    });

    // Loop "forever" trying to repeatedly create a connection. Of
    // course it isn't forever, because this loop is racing against
    // the timeout.
    while (true) {
      try {
        return yield connect({
          user: "root",
          host: "localhost",
          port: 3306
        });
      } catch (error) {
        // if it's a socket error or a MysqlError, we want to try again
        // otherwise, raise the exception
        if (!error.errno) {
          throw error;
        }
      }
    }
  }

  try {
    //now we have the connection and can query, migrate, etc...
  } finally {
    connection.destroy();
  }
});


/**
 * Create a mysql connection as an effection Operation.
 */
function* connect(config: ConnectionConfig): Operation<Connection> {
  let { resolve, reject, promise } = Deferred<Connection>();
  let connection = createConnection(config);

  connection.connect((err?: Error) => {
    if (err) {
      reject(err);
    } else {
      resolve(connection);
    }
  });

  return yield promise;
}

Check out effection to change your perspective.

Epilogue

I think we have settled for convenience over functionality. I still use async...await, and it is excellent for a one-call scenario, but I, and many others, have discovered that it falls short in more complex real-world situations.



