How to use Google Cloud Platform from Netlify

GCP + Netlify = ❤

Published on 2019-11-04


This article shows how to use Google Cloud Platform products from Netlify Functions, using Google BigQuery as an example.

I was looking for a convenient way to use BigQuery from a Netlify function. I read the Google Cloud Platform documentation, but could not find any hints. I googled and googled and googled and could not find anything. It seemed like I was the first person to ever attempt this. I tried the different suggested approaches, but nothing worked. I promised myself that if I ever figured out how to do this, I would share the result with the world. And then I figured it out!

Here are the steps to use Google Cloud Platform from a Netlify function, with Google BigQuery as the example.

Enabling the APIs

First you need to enable the APIs in Google Cloud Platform (GCP). Open the GCP Console and navigate to Navigation menu > APIs & Services > Dashboard. There should be a button labeled Enable APIs and Services. Click that.

You can search for the product whose API you want to enable. In my case, I searched for "BigQuery" and the "BigQuery API" came right up. Click it and enable the API.
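
If you prefer the command line, the same API can be enabled with the gcloud CLI. A minimal sketch, assuming gcloud is installed and pointed at the right project:

gcloud services enable bigquery.googleapis.com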

Setting up a Service Account

Now that we have enabled API access, we need to create a user whose credentials we'll use to access GCP. You should not use your own user account for this, as that would be unsafe.

The GCP documentation describes Service Accounts as:

A service account is a special account that can be used by services and applications [...] to interact with other Google Cloud Platform APIs. Apps can use service account credentials to authorize themselves to a set of APIs and perform actions within the permissions granted to the service account [...].

Sounds like just what we need. To create a Service Account, open the Navigation menu in the GCP Console and go to IAM & admin > Service accounts. Then click the Create Service Account button. Give it a name and a description and click Create.

The next step is to grant permissions to the newly created Service Account. Make sure the account gets the right permissions for the APIs you want to use. In my case, I gave it the BigQuery > BigQuery Admin role. Click Continue to advance.

In the last step of the service account creation, we need to create a key. We'll later use this key to access GCP from our Netlify function. Click Create Key and select JSON as the Key type. Press Create and the key will be created and downloaded automatically. Keep this file around, as it contains the credentials for the newly created service account.

Press Done to complete the creation of the service account.
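
As an alternative to clicking through the console, the whole Service Account setup can be scripted with the gcloud CLI. A sketch, assuming gcloud is installed and your-project-123456 is your project ID:

# Create the service account
gcloud iam service-accounts create service-account-for-article \
  --display-name "Netlify functions"

# Grant it the BigQuery Admin role
gcloud projects add-iam-policy-binding your-project-123456 \
  --member "serviceAccount:service-account-for-article@your-project-123456.iam.gserviceaccount.com" \
  --role "roles/bigquery.admin"

# Create and download a JSON key for it
gcloud iam service-accounts keys create key.json \
  --iam-account service-account-for-article@your-project-123456.iam.gserviceaccount.com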

Adding the credentials to Netlify

When we created the service account in the previous step, a JSON file with credentials for that account was downloaded automatically. We'll now store that information as environment variables in Netlify. We can later read that information from our Netlify function.

Inspect the contents of the newly downloaded JSON file. It will look something like this:

{
  "type": "service_account",
  "project_id": "your-project-123456",
  "private_key_id": "000011112222333444555666777888999",
  "private_key": "-----BEGIN PRIVATE KEY-----\ngibberish-letters\n-----END PRIVATE KEY-----\n",
  "client_email": "service-account-for-article@your-project-123456.iam.gserviceaccount.com",
  "client_id": "123123123123",
  "auth_uri": "https://accounts.google.com/o/oauth2/auth",
  "token_uri": "https://oauth2.googleapis.com/token",
  "auth_provider_x509_cert_url": "https://www.googleapis.com/oauth2/v1/certs",
  "client_x509_cert_url": "https://www.googleapis.com/robot/v1/metadata/x509/service-account-for-article%40wyour-project-1223456.iam.gserviceaccount.com"
}

We are only interested in the values of client_email and private_key.
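
If you'd rather not pick them out of the file by hand, a quick Node.js one-off prints exactly what we'll paste into Netlify. A sketch, assuming the key file was saved as key.json:

// Print the two values we need from the downloaded key file.
// JSON.stringify re-escapes the real newlines in the private key back
// to \n, which is the form we'll paste (without the surrounding quotes).
const key = require("./key.json");
console.log(key.client_email);
console.log(JSON.stringify(key.private_key));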

Open your site's build settings on Netlify (Your site > Settings > Build & deploy > Environment). This is the place to edit the environment variables that are available during your site's build process and, more importantly, during the execution of your Netlify functions.

Create a new environment variable called GCP_CLIENT_EMAIL and provide the value of client_email from the JSON file above. In my case this would be GCP_CLIENT_EMAIL with a value of service-account-for-article@your-project-123456.iam.gserviceaccount.com.

Next, add another environment variable called GCP_PRIVATE_KEY and set its value to the value of private_key above. In my case it would be -----BEGIN PRIVATE KEY-----\ngibberish-letters\n-----END PRIVATE KEY-----\n.

You may notice the escaped \n newline sequences in the private key. We'll deal with them later. Provide the value as-is for now.

Save the environment variables using the Save button and we're ready for the final step.
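
As a side note, recent versions of the Netlify CLI can set these variables from the terminal, too. A sketch (the exact command may vary by CLI version):

netlify env:set GCP_CLIENT_EMAIL "service-account-for-article@your-project-123456.iam.gserviceaccount.com"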

In your Netlify function

The steps here will depend on which GCP product you're attempting to use. I'll outline the steps for BigQuery, but the same principle should apply to all of them.

First, I added the @google-cloud/bigquery package to my project by running npm install @google-cloud/bigquery.

The next step was to import the package into my function.

const { BigQuery } = require("@google-cloud/bigquery");

And finally, I instantiated a new BigQuery client using the environment variables we set up in the previous step.

const bigQueryClient = new BigQuery({
  projectId: "your-project-id",
  credentials: {
    client_email: process.env.GCP_CLIENT_EMAIL,
    private_key: process.env.GCP_PRIVATE_KEY.split("\\n").join("\n")
  }
});

This is the gotcha that cost me quite some time:

Notice that I'm calling .split("\\n").join("\n") on process.env.GCP_PRIVATE_KEY. This replaces the escaped \n sequences in the service account's private key with actual newline characters.

If you're familiar with RegEx, you could use this instead:

process.env.GCP_PRIVATE_KEY.replace(/\\n/g, "\n");
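
While we're at it, a small guard makes missing variables fail loudly instead of producing a cryptic authentication error. This is my own suggestion, not a required step:

// Fail fast if the credentials were not configured in Netlify.
if (!process.env.GCP_CLIENT_EMAIL || !process.env.GCP_PRIVATE_KEY) {
  throw new Error("GCP_CLIENT_EMAIL and GCP_PRIVATE_KEY must be set");
}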

You are now free to make your wildest GCP dreams come true with Netlify. Use the instantiated bigQueryClient to access BigQuery from your Netlify function!

Here is a full example that inserts rows into BigQuery:

const { BigQuery } = require("@google-cloud/bigquery");

exports.handler = async (event, context) => {
  try {
    // Some example data
    const rows = [
      { name: "John", action: "Signed up" },
      { name: "Fritz", action: "Logged in" },
    ];

    // prepare BigQuery client
    const bigQueryClient = new BigQuery({
      projectId: "your-project-id",
      credentials: {
        client_email: process.env.GCP_CLIENT_EMAIL,
        private_key: process.env.GCP_PRIVATE_KEY.split("\\n").join("\n"),
      },
    });

    // Store the rows in BigQuery
    const datasetId = "events_dataset";
    const tableId = "events_table";
    const [insertResult] = await bigQueryClient
      .dataset(datasetId)
      .table(tableId)
      .insert(rows);

    return { statusCode: 200, body: JSON.stringify(insertResult) };
  } catch (e) {
    return { statusCode: 500, body: e.message };
  }
};
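
To sanity-check that the rows actually landed, you could query them back with the same client. A minimal sketch, to be run inside an async function such as the handler above:

// Read the example rows back out of BigQuery.
const [rows] = await bigQueryClient.query({
  query:
    "SELECT name, action FROM `your-project-id.events_dataset.events_table` LIMIT 10",
});
console.log(rows);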