How to write files to AWS S3 from Netlify functions

Attaching a file system to Netlify

Published on 2019-11-04



This article shows how to read and write files to an Amazon S3 bucket from Netlify Functions.

I am using a service which records what the users of one of my web applications are doing. This allows me to keep an eye on usage. The web application is supposed to be used once a week by every user, so tracking the usage allows me to see which users are likely to churn. The service I'm using only keeps the event history for the last three months. As I want to be able to compare the usage over the years, I decided to export the events once a week.

The web application is hosted on Netlify. I've set up Zapier to call the Netlify function responsible for exporting the events once a week. The function then gathers all events of that week and stores them in an Amazon S3 bucket.

This tutorial outlines the steps necessary to upload files to an Amazon S3 bucket from a Netlify function.

Setting up a bucket

First you need to create a bucket which your Netlify function will use. Open the AWS Console and select the S3 service.

Here, you can create a bucket using the UI. Enter a name and select a region, then press Next. In the second step, you can choose whether the bucket's contents should be publicly accessible or private (the default). I set mine to private, as I'm going to use the bucket for private historical event data.

Press Next again to advance to the last step. Double-check the configuration and create the bucket.
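
If you prefer the command line over the UI, the same bucket can be created with the AWS CLI, assuming it is installed and configured (replace the bucket name and region with your own):

aws s3 mb s3://name-of-your-bucket --region eu-central-1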

Creating an IAM user

We need to create a dedicated user whose credentials we'll use to access the bucket. You should not use your own account's credentials, as that would be unsafe.

Open the IAM service in the AWS Console. Select Users and then click Add User to add a new user.

Give the user a name you'll recognize (like "event-data-uploader"). For the Access type, you need to select Programmatic access. This enables you to download the credentials for this user, and to use them from Netlify functions.

Click Next: Permissions to continue. You now need to give the newly created user access to your bucket. In my case, the user must be able to read and write files to Amazon S3, so I chose Attach existing policies directly. There, I selected AmazonS3FullAccess to grant full access to Amazon S3.
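
Note that AmazonS3FullAccess grants the user full access to every bucket in your account. If you want to follow the principle of least privilege instead, you can attach a custom policy scoped to the single bucket. A sketch of such a policy, with the bucket name as a placeholder, could look like this:

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["s3:PutObject", "s3:GetObject"],
      "Resource": "arn:aws:s3:::name-of-your-bucket/*"
    }
  ]
}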

Click Next: Tags to continue. We don't need to set any tags, so we can continue with Next: Review. Now create the user by pressing Create user.

On the next screen, you'll see a table showing the user's credentials. You can also download a .csv file containing the credentials. Either download that file, or copy the user's Access key ID and Secret access key, as we'll need them later on.

Press Close to complete the creation.

Adding the credentials to Netlify

We prepared an Amazon S3 bucket and created a user with programmatic access to it. It's time to prepare Netlify.

We'll first store that user's credentials as environment variables in Netlify. Our Netlify function can then read them at runtime.

Open your site's build settings in Netlify (Your site > Settings > Build & deploy > Environment). This is where you edit environment variables. They are available during the build process of your site and, more importantly, during the execution of your Netlify functions.

Create a new environment variable called MY_AWS_ACCESS_KEY_ID and provide the "Access key ID" from the previous step. Next, add another environment variable called MY_AWS_SECRET_ACCESS_KEY and set its value to the value of "Secret access key" from above.

Note that I've prefixed the environment variables with MY_. Since Netlify seems to use AWS as well, AWS_ACCESS_KEY_ID is not available as an environment variable name. The same applies to AWS_SECRET_ACCESS_KEY.

Save the environment variables using the Save button and we're ready for the final step.
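
If you prefer the command line, recent versions of the Netlify CLI can set the same variables from your terminal (the values below are placeholders):

netlify env:set MY_AWS_ACCESS_KEY_ID "your-access-key-id"
netlify env:set MY_AWS_SECRET_ACCESS_KEY "your-secret-access-key"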

In your Netlify function

Add the aws-sdk npm package to your project using npm install aws-sdk --save or yarn add aws-sdk.

The next step is to import the aws-sdk package into your function.

const AWS = require("aws-sdk");

You can now instantiate an S3 client with the credentials from the environment.

const s3 = new AWS.S3({
  accessKeyId: process.env.MY_AWS_ACCESS_KEY_ID,
  secretAccessKey: process.env.MY_AWS_SECRET_ACCESS_KEY,
});
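
Since these variables only exist in Netlify's build and functions environment, it can help to fail fast with a clear message when they are missing (for example when running the function locally). A minimal guard, placed before the client is created, might look like this:

if (!process.env.MY_AWS_ACCESS_KEY_ID || !process.env.MY_AWS_SECRET_ACCESS_KEY) {
  // Fail early instead of letting the SDK produce a cryptic auth error
  throw new Error("Missing MY_AWS_ACCESS_KEY_ID or MY_AWS_SECRET_ACCESS_KEY");
}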

To upload a file, build a parameter object and pass it to s3.upload:

const params = {
  // The bucket name
  Bucket: "name-of-your-bucket",
  // The key/name of your file
  Key: "name-of-your-file.json",
  // The contents of your file
  Body: JSON.stringify({ hello: "world" }),
  // The access control
  ACL: "private",
  // The encoding and type of the contents
  ContentEncoding: "utf8",
  ContentType: "application/json",
};

// upload returns an AWS.Request; calling .promise() on it
// gives a promise you can await
const result = await s3.upload(params).promise();
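
Since I export events weekly, I include the export date in the key so every week gets its own file. The key format below is just one option, and the events variable is a placeholder for the data gathered earlier:

// e.g. "2019-11-04"
const date = new Date().toISOString().slice(0, 10);

const params = {
  Bucket: "name-of-your-bucket",
  // One file per weekly export
  Key: `events-${date}.json`,
  // `events` stands in for the week's event data
  Body: JSON.stringify(events),
  ACL: "private",
  ContentType: "application/json",
};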

The full API documentation for the S3 client is available in the AWS SDK for JavaScript reference: https://docs.aws.amazon.com/AWSJavaScriptSDK/latest/AWS/S3.html

I've put it all together so you can see the final function:

const AWS = require("aws-sdk");

exports.handler = async (event, context) => {
  const s3 = new AWS.S3({
    accessKeyId: process.env.MY_AWS_ACCESS_KEY_ID,
    secretAccessKey: process.env.MY_AWS_SECRET_ACCESS_KEY,
  });

  try {
    const result = await s3
      .upload({
        // The bucket name
        Bucket: "name-of-your-bucket",
        // The key/name of your file
        Key: "name-of-your-file.json",
        // The contents of your file
        Body: JSON.stringify({ hello: "world" }),
        // The access control
        ACL: "private",
        // The encoding and type of the contents
        ContentEncoding: "utf8",
        ContentType: "application/json",
      })
      .promise();

    return { statusCode: 200, body: JSON.stringify(result) };
  } catch (e) {
    return { statusCode: 500, body: e.message };
  }
};
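
Once deployed, the function is exposed under Netlify's functions path. Assuming the file above lives at functions/upload-file.js (the file name determines the function name), you could trigger it like this, with your own site URL substituted:

curl https://your-site.netlify.app/.netlify/functions/upload-file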

Now that everything is set up, reading a file is convenient as well. Below is an example function which reads the contents of a file from your bucket and returns them through the Netlify function.

const AWS = require("aws-sdk");

// This function reads the contents of `name-of-your-file.json` from an
// AWS S3 bucket and returns the contents.
exports.handler = async (event, context) => {
  const s3 = new AWS.S3({
    accessKeyId: process.env.MY_AWS_ACCESS_KEY_ID,
    secretAccessKey: process.env.MY_AWS_SECRET_ACCESS_KEY,
  });

  try {
    const result = await s3
      .getObject({
        // The bucket name
        Bucket: "name-of-your-bucket",
        // The key/name of your file
        Key: "name-of-your-file.json",
      })
      .promise();

    return {
      statusCode: 200,
      headers: {
        "content-type": result ? result.ContentType : "application/json",
      },
      body: JSON.stringify({ ...result, Body: result.Body.toString("utf-8") }),
    };
  } catch (e) {
    return { statusCode: 500, body: e.message };
  }
};
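
If you store one file per week like I do, you'll eventually want to know which exports exist before comparing them. Below is a sketch using the S3 client's listObjectsV2 call, assuming date-stamped keys like events-2019-11-04.json from the earlier example:

const AWS = require("aws-sdk");

// Returns the keys of all weekly export files in the bucket.
exports.handler = async (event, context) => {
  const s3 = new AWS.S3({
    accessKeyId: process.env.MY_AWS_ACCESS_KEY_ID,
    secretAccessKey: process.env.MY_AWS_SECRET_ACCESS_KEY,
  });

  try {
    const result = await s3
      .listObjectsV2({
        Bucket: "name-of-your-bucket",
        // Only list the date-stamped export files
        Prefix: "events-",
      })
      .promise();

    return {
      statusCode: 200,
      body: JSON.stringify(result.Contents.map((object) => object.Key)),
    };
  } catch (e) {
    return { statusCode: 500, body: e.message };
  }
};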