How to deploy Deno applications to production

Prerequisites

  • Docker installed on your server and local machine
  • An Ubuntu server (you can get one from DigitalOcean)
  • Basic understanding of git
  • Basic understanding of the command line interface

In this step, you will create a simple Deno application that displays "Hello World". Unlike Node.js, you do not need to run npm init to create a new application. You can simply create a TypeScript file and start coding away.

To begin, create a new directory named deno-deploy on your local machine by running:

mkdir deno-deploy

Change directory to deno-deploy by running:

cd deno-deploy

Create a new file named server.ts within the deno-deploy directory by running:

touch server.ts

Note: alternatively, you could use your favourite editor to create this file.

Open server.ts with your preferred editor and paste and save the following code snippet:

// Import the HTTP server helper from the Deno standard library.
import { serve } from "https://deno.land/std@0.53.0/http/server.ts";

// Start an HTTP server listening on port 8000.
const server = serve({ port: 8000 });

console.log("Now serving on port 8000 🔥");

// Respond to every incoming request with "Hello World".
for await (const req of server) {
  req.respond({ body: "Hello World" });
}

The snippet above creates a Deno server that serves the content "Hello World" on port 8000.
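
If you would like to test the server before adding Docker, you can start it locally with deno run --allow-net server.ts and request it from a second terminal. The small script below is one way to do that; the file name check.ts is just an example and is not part of the deployment:

// check.ts: a minimal sketch that requests the local server and prints the response.
// Assumes server.ts is already running on port 8000 (deno run --allow-net server.ts).
const res = await fetch("http://localhost:8000");

console.log(res.status); // 200
console.log(await res.text()); // "Hello World"

Run it with deno run --allow-net check.ts and you should see the Hello World body printed in the terminal.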

Create a Docker configuration

Create a Dockerfile

In this step, you will set up the Docker configuration for your Deno application. First, you will create a new file named Dockerfile via the terminal by running:

touch Dockerfile

Open Dockerfile with your preferred text editor then paste and save the following snippet:

FROM hayd/deno:latest

EXPOSE 8000

WORKDIR /app

ADD . /app

RUN deno cache server.ts

CMD ["run", "--allow-net", "server.ts"]

Let’s break down what these lines in our Dockerfile will do when executed:

FROM hayd/deno:latest

Pulls the latest version of the hayd/deno image from Docker Hub.

EXPOSE 8000

Exposes port 8000 on our container when built.

WORKDIR /app

Sets /app as the working directory in our container.

ADD . /app

Copies the content of the current directory into the /app directory in your Docker container.

RUN deno cache server.ts

Downloads and caches the remote dependencies of server.ts and compiles it, so this work does not have to be repeated on every startup.

CMD ["run", "--allow-net", "server.ts"]

This runs server.ts, with the --allow-net flag granting the network access the server needs.
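
The hard-coded port works fine for this tutorial. If you later want the port to be configurable from the container environment, one common pattern is to read it from an environment variable, as in the sketch below. This is an optional variation rather than part of this deployment, and it would also need --allow-env added to the CMD line above:

// Optional variation of server.ts: read the port from a PORT environment variable,
// falling back to 8000. Requires both --allow-net and --allow-env at run time.
import { serve } from "https://deno.land/std@0.53.0/http/server.ts";

const port = Number(Deno.env.get("PORT") ?? 8000);
const server = serve({ port });

console.log(`Now serving on port ${port} 🔥`);

for await (const req of server) {
  req.respond({ body: "Hello World" });
}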

Create docker-compose.yml

In this step, you will be creating a docker-compose.yml file that will be used to piece everything together and serve our application. To create a docker-compose.yml file, run:

touch docker-compose.yml

Open docker-compose.yml with your preferred text editor then paste and save the following snippet:

version: '3'

services:
  web:
    build: .
    container_name: deno-deploy
    ports:
      - "8000:8000"

Let us break down what these lines in our docker-compose.yml will do when executed. version: '3' specifies the version of the Compose file format used in the file:

web:
    build: .
    container_name: deno-deploy
    ports:
      - "8000:8000"

This section contains the web service.

build: .

This indicates that the Dockerfile we intend to build is in the current directory.

container_name: deno-deploy

This will ensure that the container name on the build will be deno-deploy.

ports: - "3000:3000"

Will map the container port 8000 to the host server port 8000.

Build and run the container

To build your Docker container locally, run:

docker-compose up

You can visit your application on http://localhost:8000 via your preferred web browser.

Deploy to production

Push to GitHub

Docker makes it easy to quickly deploy applications anywhere. First, you will need to make your code available via Git version control; a good provider is GitHub. Create a new repository named deno-deploy.

Open the terminal while still in the deno-deploy directory, then run:

git init

This will initialize a new Git repository. Next, stage all files by running:

git add .

Commit the staged files with the commit message "deno deploy":

git commit -m "deno deploy"

Add your GitHub repository as a remote and push to the master branch by running:

git remote add origin https://github.com/{GITHUB_USERNAME}/deno-deploy.git

git push -u origin master

This will push the codebase along with the Docker configuration to the master branch of your GitHub repository.

Deploy on the server

In this step, you will be making your source code available on the server and making it accessible via the internet.

SSH into your server:

ssh {SERVER_USER}@{SERVER_IP}

  • SERVER_USER is the user of the server
  • SERVER_IP is the IP address of the server

Clone the repository:

git clone https://github.com/{GITHUB_USERNAME}/deno-deploy.git

Note: GITHUB_USERNAME is your actual GitHub username

Change directory into the cloned repository:

cd deno-deploy

Execute the docker-compose command:

docker-compose up -d

Unlike when you executed docker-compose on your local machine, this command includes the -d flag, which runs your Docker container in detached mode. In simpler terms, it allows your Docker container to run in the background.

You will be able to visit your application on http://{SERVER_IP}:8000. In my case, you can visit http://104.248.172.220:8000.

Conclusion

In this article, we learned how to create a simple Deno web server, how to create a Docker configuration for a Deno web server, how to push your code to GitHub, and how to make your Deno application available on the internet. Happy coding!
