Angular 7 Upload File to Amazon S3 Bucket

Amazon Simple Storage Service (Amazon S3) is object storage built to store and retrieve any amount of data from web or mobile apps. Amazon S3 is designed to make web-scale computing easier for developers. In this tutorial, we're going to create an Angular 7 application that can upload files to an Amazon S3 bucket.
Set up Amazon S3
Create an IAM user
We need an IAM user with permission to access the bucket. Follow these steps to create the user and get its Access key ID and Secret access key:
Go to https://console.aws.amazon.com/iam/
In the navigation pane, choose Users and then choose Add user.
Input a User name (for example, jsa-user) and choose Programmatic access as the Access type.
Press the Next: Permissions button to go to the Set permissions for jsa-user screen.
Now choose Attach existing policies directly, filter policies by s3, then check AmazonS3FullAccess.
Press Next: Review
Press Create user
Press Download .csv to save the Access key ID and Secret access key.
Create Amazon S3 Bucket
Go to https://console.aws.amazon.com/s3 and click on Create bucket.
Input the information for the new bucket, then click on Create.
Configure CORS for Bucket
Click on the bucket we have just created.
Choose the Permissions tab, then CORS configuration.
Configure CORS for the bucket as shown below, then click on the Save button.
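The exact rules depend on your app, but a permissive configuration like the following is enough for development (this assumes the older S3 console, which takes XML; the newer console expects the equivalent JSON). Tighten AllowedOrigin before going to production.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<CORSConfiguration xmlns="http://s3.amazonaws.com/doc/2006-03-01/">
  <CORSRule>
    <AllowedOrigin>*</AllowedOrigin>
    <AllowedMethod>GET</AllowedMethod>
    <AllowedMethod>POST</AllowedMethod>
    <AllowedMethod>PUT</AllowedMethod>
    <AllowedHeader>*</AllowedHeader>
  </CORSRule>
</CORSConfiguration>
```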
Create Project
Create a new Angular 7 project using the following CLI command:
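The project name angular-s3-upload below is just a placeholder; use any name you like.

```bash
ng new angular-s3-upload
cd angular-s3-upload
```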
Install AWS SDK
Install the AWS SDK using the following npm command:
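Run this from the project root (it installs the aws-sdk v2 package):

```bash
npm install aws-sdk --save
```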
Create a service
Create a service for handling file uploads using the following CLI command:
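The service name upload is an assumption; the CLI generates upload.service.ts containing an UploadService class:

```bash
ng generate service upload
```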
Next, import the following dependencies in the UploadService class file:
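A minimal set of imports, assuming the aws-sdk v2 package installed above; pulling the S3 client from its own sub-module keeps the bundle smaller than importing the whole SDK:

```typescript
import { Injectable } from '@angular/core';
// S3 client only – avoids bundling the entire AWS SDK
import * as S3 from 'aws-sdk/clients/s3';
```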
Create a method for uploading a file. The upload method has three steps; a complete sketch of the service with this method follows these steps.
First, configure your bucket: create an S3 client with your credentials and region.
Next, create the upload parameters: the bucket name, key, file body, and content type.
Finally, upload the file by calling upload() on the S3 client.
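Here is a minimal sketch of the whole UploadService. The credentials, region, and bucket name are placeholders; replace them with the values from your own IAM user and bucket.

```typescript
import { Injectable } from '@angular/core';
import * as S3 from 'aws-sdk/clients/s3';

@Injectable({
  providedIn: 'root'
})
export class UploadService {

  uploadFile(file: File) {
    // 1. Configure your bucket: create an S3 client with the IAM user's credentials.
    //    These values are placeholders – use the Access key ID / Secret access key
    //    you downloaded earlier and the region of your bucket.
    const bucket = new S3({
      accessKeyId: 'YOUR_ACCESS_KEY_ID',
      secretAccessKey: 'YOUR_SECRET_ACCESS_KEY',
      region: 'us-east-1'
    });

    // 2. Create the upload parameters ('your-bucket-name' is a placeholder).
    const params = {
      Bucket: 'your-bucket-name',
      Key: file.name,
      Body: file,
      ContentType: file.type
    };

    // 3. Upload the file and log the result.
    bucket.upload(params, (err, data) => {
      if (err) {
        console.log('There was an error uploading your file: ', err);
        return;
      }
      console.log('Successfully uploaded file.', data);
    });
  }
}
```

Keep in mind that credentials compiled into a browser app are visible to anyone who opens the bundle; for production, prefer temporary credentials (for example via Amazon Cognito) or pre-signed URLs generated on a server.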
Your app.component.html file will look something like this:
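A minimal template sketch; the handler names onFileSelected and upload are assumptions and must match app.component.ts below:

```html
<div class="container">
  <h3>Angular 7 upload file to Amazon S3</h3>
  <!-- Pass the chosen file to the component, then trigger the upload -->
  <input type="file" (change)="onFileSelected($event)" />
  <button (click)="upload()">Upload</button>
</div>
```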
Here is the code of the app.component.ts file:
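A matching component sketch that injects the UploadService created above:

```typescript
import { Component } from '@angular/core';
import { UploadService } from './upload.service';

@Component({
  selector: 'app-root',
  templateUrl: './app.component.html',
  styleUrls: ['./app.component.css']
})
export class AppComponent {
  selectedFile: File;

  constructor(private uploadService: UploadService) { }

  // Keep a reference to the file chosen in the <input type="file"> element
  onFileSelected(event: any) {
    this.selectedFile = event.target.files[0];
  }

  // Hand the selected file to the service, which uploads it to S3
  upload() {
    if (this.selectedFile) {
      this.uploadService.uploadFile(this.selectedFile);
    }
  }
}
```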
Run the project using the following command:
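```bash
ng serve --open
```

By default the app is served at http://localhost:4200; pick a file and press Upload.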
After uploading a file, check the browser console log.
Then open your bucket in the S3 console; the uploaded file should appear there.
