
Sending data from BigQuery to Intercom using Google Cloud Functions

Clean, well modelled data is useful for more than just analytics. Google Cloud Functions can help you operationalize your data by sending it to other services.


Dan Lee on March 9, 2020

Organizations today collect data on their users from many different places: customer support, marketing, ads, order systems and more. Tools like Stitch and Fivetran make it easy to bring all of this data into a cloud data warehouse, where it can be modelled into rich and detailed datasets useful for analytics purposes.

We've built Dataform to make it quick and easy for data teams to set up that modeling layer using SQL-based transformation logic, whilst also adopting software-engineering-inspired best practices like testing to ensure the datasets created are robust and reliable.

Now that the data pipeline is easy to update, as well as tested and reliable, it can be used for more than just analytics: you can use your data model to power your operational systems.

As an example, let's say your data modeling layer combines marketing data, customer service data and website event data to give a lead a score between 1 and 10. It would be incredibly useful to have this lead score available in your customer support tool, as a means of prioritizing tickets. At Dataform, our data modeling layer in BigQuery generates a score like this, which we then send to our customer service platform (Intercom).

Using Google Cloud Functions, we can send this lead score to Intercom with only a few lines of code. In this post, I'll show you how to:

  • Create event triggers from BigQuery table updates
  • Create a Google Cloud Function triggered by the event you create
  • Send BigQuery data to Intercom's REST API using a few lines of Python

Whilst I'm using Intercom for this example, the code can be easily modified to work with any product that has a REST API (more on that later).

In this guide I won't go into detail on what exactly a Google Cloud Function is; you can read more about them in Google's Cloud Functions documentation.

Creating a BigQuery table update event trigger

Cloud Functions are triggered by events, i.e. something happening at a specific point in time. In our case we'd like to send the results of a data transformation in BigQuery to Intercom, so the most obvious event to use is the moment that data transformation finishes.

Fortunately, BigQuery automatically sends event logs to Stackdriver when a table has been updated. Google Cloud Functions can't be triggered directly from Stackdriver logs but can be triggered by Cloud Pub/Sub, and it's easy to set up an export from Stackdriver to Pub/Sub. To do this:

  • Open the Stackdriver logs export page, making sure you've selected the right GCP project
  • Click CREATE SINK
  • Click the down arrow at the far right of the filter text box, and select Convert to advanced filter
  • Copy the text below into the filter box, changing dataset_name and table_name to whichever BigQuery table you'd like to use as the trigger. You can click Submit Filter to check that the filter correctly matches the table update events you're interested in.
resource.type = "bigquery_resource"
protoPayload.methodName = "tableservice.update"
protoPayload.serviceData.tableUpdateRequest.resource.tableName.datasetId = "dataset_name"
protoPayload.serviceData.tableUpdateRequest.resource.tableName.tableId = "table_name"
  • On the right you can now save this export:

    • Give the sink a name (e.g. lead_score_updates)
    • Select Google Cloud Pub/Sub as the sink service
    • Create a new Cloud Pub/Sub topic as the sink destination, and give it a name (e.g. lead_score_updates_topic)
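
If you'd rather script this step than click through the console, a rough sketch using the google-cloud-logging client library is below. The project, dataset, table and topic names are placeholders, the Pub/Sub topic is assumed to already exist, and the sink's writer identity will need permission to publish to the topic.

from google.cloud import logging

# Placeholder project ID - use your own GCP project.
client = logging.Client(project="my-project")

# Same filter as above: match update events for the table you care about.
log_filter = (
    'resource.type="bigquery_resource" '
    'protoPayload.methodName="tableservice.update" '
    'protoPayload.serviceData.tableUpdateRequest.resource.tableName.datasetId="dataset_name" '
    'protoPayload.serviceData.tableUpdateRequest.resource.tableName.tableId="table_name"'
)

# Export matching log entries to the Pub/Sub topic created for this guide.
sink = client.sink(
    "lead_score_updates",
    filter_=log_filter,
    destination="pubsub.googleapis.com/projects/my-project/topics/lead_score_updates_topic",
)
sink.create()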

You're done! Each time this table gets updated, an event will be sent to the Cloud Pub/Sub topic you just created. Now we need to set up a Google Cloud Function that will be triggered each time this event happens.

Creating a Google Cloud Function

Now we need to create the Cloud Function, which is where the magic happens. Each time an event is sent to the Pub/Sub topic, the Cloud Function will be triggered and the code will be run. Fortunately, Google Cloud Functions are incredibly easy to set up!

  • Open the Cloud Functions page and click CREATE FUNCTION
  • Give the function a name (e.g. lead_score_to_intercom)
  • Choose Cloud Pub/Sub as the Trigger, and select the topic you just created from the dropdown list (e.g. lead_score_updates_topic)
  • Choose Inline editor as the Source, and Python 3.7 as the Runtime. You can also choose Node.js or Go, but for this tutorial I'll use Python
  • Click Create

Done! You have now created a Cloud Function that will be triggered each time your table is updated. At the moment, the function doesn't do anything interesting, so now we'll set the function up to read the contents of your table and send it to Intercom.

Sending BigQuery data to Intercom

Now we need to change the code within the Cloud Function to actually do something interesting!

  • Open up the Cloud Function you created from the Cloud Functions page
  • Click EDIT

The page that opens lets you edit the code directly and then, once you're finished, Deploy a new version of the function. There are two files: main.py and requirements.txt. requirements.txt lists the packages your function will use.

  • Copy the following code into requirements.txt:
# Function dependencies, for example:
# package>=version
requests
google-cloud-bigquery

requests is a package for handling HTTP requests, and google-cloud-bigquery allows us to read data from BigQuery.

main.py is where we define our function.
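
A quick note on the function signature before pasting in the real code: a Pub/Sub-triggered Python function receives two arguments, an event dict whose data field contains the base64-encoded Pub/Sub message, and a context object with metadata such as the event ID and timestamp. We don't need the message contents here (the trigger itself tells us the table has been updated), but a minimal sketch for inspecting it looks like this:

import base64

def send_data(event, context):
    # 'event' holds the Pub/Sub message; 'data' is base64-encoded.
    # 'context' holds metadata about the trigger itself.
    print(f"Triggered by event {context.event_id} at {context.timestamp}")
    if "data" in event:
        # This is the exported Stackdriver log entry that fired the trigger.
        print(base64.b64decode(event["data"]).decode("utf-8"))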

  • Copy the following into main.py:
import requests
from google.cloud import bigquery

# Intercom API access header
headers = {
    'Authorization': 'Bearer access-token',
    'Accept': 'application/json',
    'Content-Type': 'application/json',
}


# prepare for querying BQ
client = bigquery.Client()

def send_data(event, context):
    # query the data you'd like to send to Intercom
    query = (
        "SELECT email, field_to_send FROM `project-name.dataset_name.table_name`"
    )

    query_job = client.query(
        query,
        location="US",
    )

    # for each row in the data, make a http post request
    for row in query_job:
        email = row.email
        field_to_send = row.field_to_send

        values = f"""
        {{
            "email": "{email}",
            "custom_attributes": {{
                "field_to_send": "{field_to_send}"
            }}
        }}
        """

        response = requests.post('https://api.intercom.io/users', headers=headers, data=values)

        print(response.json())
  • You'll need to change a few aspects of the code to make it work for you:

    • Change access-token to your own Intercom access token (Intercom's documentation explains how to generate one)
    • Update the query to collect whatever data you'd like to send to Intercom
    • This example sets a Custom Attribute field_to_send for each email. If you want to do something different, change what's happening in the for loop
  • Finally, you need to set the Function to execute to send_data. This tells the Cloud Function which function to run when it's triggered (in this case it's pretty obvious as there's only one function, but in theory the code in main.py could be much more complex)
  • Click Deploy
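
Before waiting for the next table update, you can check the end-to-end wiring by publishing a test message to the topic yourself. A small sketch using the google-cloud-pubsub client library (the project and topic names are the same placeholders used earlier):

from google.cloud import pubsub_v1

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path("my-project", "lead_score_updates_topic")

# The function ignores the message payload, so any bytes will trigger it.
future = publisher.publish(topic_path, b"manual test")
print(f"Published message {future.result()}")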

You're done! You now have a Google Cloud Function that will be triggered each time your table is updated. The Cloud Function will POST your data to Intercom's REST API.
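
One refinement worth considering: the example builds the request body with an f-string, which will produce invalid JSON if a field value ever contains quotes or other special characters. A safer variant of the loop body (same behaviour otherwise) builds the payload with Python's json module:

import json

for row in query_job:
    payload = {
        "email": row.email,
        "custom_attributes": {"field_to_send": row.field_to_send},
    }
    response = requests.post(
        "https://api.intercom.io/users", headers=headers, data=json.dumps(payload)
    )
    print(response.json())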

Going Beyond Intercom

Whilst this example is specific to Intercom, it's relatively simple to adapt the code for any service that has a REST API (e.g. Salesforce, Autopilot, Zendesk, Stripe), as sketched after the list below. You just need to work out:

  • How to authenticate (i.e. what the headers should look like)
  • What data format the API accepts (i.e. what the values should look like)
  • What route to post to (e.g. https://api.intercom.io/users)
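
For example, a version of the send step pointed at a generic REST API might look like the sketch below. The endpoint URL, header names and payload fields are placeholders rather than any real service's API; check the documentation of whichever tool you're targeting:

import json
import requests

# Placeholders - replace with real values for your target service.
API_URL = "https://api.example.com/v1/contacts"
HEADERS = {
    "Authorization": "Bearer your-api-key",
    "Content-Type": "application/json",
}

def push_row(row):
    # Send a single BigQuery row to the target API.
    payload = {
        "email": row.email,
        "lead_score": row.field_to_send,
    }
    response = requests.post(API_URL, headers=HEADERS, data=json.dumps(payload))
    response.raise_for_status()
    return response.json()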

Using Google Cloud Functions in this way, data teams can increase the impact they have on their organization by finding more ways to leverage their data model.

If you'd like to learn more about Dataform and how it can help you set up a robust, SQL-based data modeling layer to support your analytics, data operations and more, check out our docs or book a demo with one of our team.
