How to import BigQuery data to Funnel

Move data from BigQuery tables to Funnel using a webhook

Written by Ross Barrett

In this technical documentation, we’ll explain how to import data from BigQuery tables by creating files and sending them to Funnel via our webhook. To learn more about our webhook in general, and to discover less technical methods of integration, please see this article.

Prerequisites

Before starting, you need a dataset and at least one table in BigQuery Studio. You don't need special permissions or access at this stage.

Overview

At a high level, you will create a Google Cloud Storage bucket, schedule a query that periodically exports your BigQuery table to the bucket as CSV files, create a webhook data source in Funnel, deploy a Cloud Function that sends signed URLs for new files to that webhook, and finally verify that the data arrives in Funnel.

1. Create a Google Cloud Storage bucket

If you haven’t already created a bucket, create one to store the exported files. You can learn more about creating buckets in Google's Cloud Storage guide.
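
If you prefer code over the Cloud console, here is a minimal sketch that creates a bucket with the @google-cloud/storage client library, the same library the Cloud Function in step 4 uses. The bucket name matches the scheduled query example in step 2; the location and the file name create-bucket.js are assumptions, so substitute your own values.

create-bucket.js

import { Storage } from '@google-cloud/storage';

const storage = new Storage();

// Bucket name matches the scheduled query example in step 2.
// The location is an assumption: pick one close to your BigQuery dataset.
const [bucket] = await storage.createBucket('webhook_example_bucket', {
  location: 'EU',
});
console.log(`Bucket ${bucket.name} created.`);

Top-level await requires an ES module, so run this with "type": "module" in your package.json, as in the example in step 4.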

2. Set up a scheduled query

Create a new scheduled query that periodically exports data from your BigQuery table to your Google Cloud Storage bucket as CSV files. The @run_date parameter is supplied automatically by BigQuery each time the scheduled query runs. Simplified example:

DECLARE backup_date DATE DEFAULT DATE_SUB(@run_date, INTERVAL 3 DAY);

EXPORT DATA
  OPTIONS (
    uri = CONCAT('gs://webhook_example_bucket/example-export', CAST(backup_date AS STRING), '-*.csv'),
    format = 'CSV',
    overwrite = TRUE,
    header = TRUE
  ) AS
SELECT DISTINCT
  *
FROM
  `example-project.example_data.example_table`

3. Create a webhook data source in Funnel

Start by heading to the Data Sources section, click "Connect Data Source", and select File Import or Webhook from the list.

Give the File Import source a name and make sure Webhook is selected under transport types.

If you already have a File Import source, you can click on “Transport types” in the top right corner of the existing source, select the webhook transport type and click save.

We’ll need the Endpoint URL and Token from this section when writing our function in a later step.

4. Create and deploy a Google Cloud Function

Create a Google Cloud Function with a Cloud Storage trigger.

Set up the Cloud Storage trigger for the function and select the bucket where the files you want to send to Funnel are stored. It should trigger when new files are added to the bucket. Do not check “Retry on failure”.

Write the code for your Google Cloud Function. It should create a list of signed URLs for the files you wish to send to Funnel, then send that list as a POST request to the webhook URL.

The list must follow the format below, i.e. it should be sent as a JSON body that looks like this: {"urls": ["https://url-1.csv", "https://url-2.csv"]}. To authenticate, you must send your secret token in the "x-funnel-fileimport-token" header. Note that this is just an example; please add the required security measures when implementing this step.
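
Before writing the function, you can sanity-check the Endpoint URL and Token with a short standalone script. This is just a sketch: the endpoint and token below are the placeholders used throughout this article, and the file URL can be any CSV that Funnel is able to fetch.

test-webhook.js

import fetch from 'node-fetch';

// Replace with the Endpoint URL and Token from your File Import source
const response = await fetch('https://fileimport-webhook.funnel.io/123-123-123-123', {
  method: 'POST',
  headers: {
    'x-funnel-fileimport-token': 'funnel-secret-token-goes-here',
    'Content-Type': 'application/json'
  },
  body: JSON.stringify({ urls: ['https://url-1.csv'] })
});
console.log(response.status, await response.text());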

Google Cloud Function Example Code

index.js

import { cloudEvent } from '@google-cloud/functions-framework';
import fetch from 'node-fetch';
import { Storage } from '@google-cloud/storage';

const storage = new Storage();

cloudEvent('exampleFunction', async (ce) => {
  console.log(`Event ID: ${ce.id}`);
  console.log(`Event Type: ${ce.type}`);

  // The CloudEvent payload describes the file that triggered the function
  const file = ce.data;
  console.log(`Bucket: ${file.bucket}`);
  console.log(`File: ${file.name}`);
  console.log(`Metageneration: ${file.metageneration}`);
  console.log(`Created: ${file.timeCreated}`);
  console.log(`Updated: ${file.updated}`);

  // Endpoint URL of your webhook data source in Funnel
  const url = 'https://fileimport-webhook.funnel.io/123-123-123-123';
  const signedUrl = await generateV4ReadSignedUrl(file.bucket, file.name);

  const response = await fetch(url, {
    method: 'POST',
    body: JSON.stringify({
      urls: [signedUrl]
    }),
    headers: {
      'x-funnel-fileimport-token': 'funnel-secret-token-goes-here',
      'Content-Type': 'application/json'
    }
  });

  // Throwing makes the failure visible in the function's logs
  if (!response.ok) {
    throw new Error(`Funnel webhook responded with status ${response.status}`);
  }
  console.log(`Funnel webhook responded with status ${response.status}`);
});

async function generateV4ReadSignedUrl(bucketName, fileName) {
  // These options allow temporary read access to the file
  const options = {
    version: 'v4',
    action: 'read',
    expires: Date.now() + 15 * 60 * 1000, // 15 minutes
  };

  // Get a v4 signed URL for reading the file
  const [url] = await storage
    .bucket(bucketName)
    .file(fileName)
    .getSignedUrl(options);

  // Avoid logging the URL itself: anyone who has it can read the file
  return url;
}


package.json

{
  "type": "module",
  "dependencies": {
    "@google-cloud/functions-framework": "^3.0.0",
    "@google-cloud/storage": "7.1.0",
    "node-fetch": "^3.3.2"
  }
}

5. Make sure everything works

Start by manually running a backfill in the scheduler. This sends files to the bucket without waiting for the scheduler to trigger.

To verify that your scheduled query was executed, you can look in the scheduler log. You should also verify that the files were created in the bucket.
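
To check the bucket programmatically, here is a minimal sketch that lists the exported files using the same @google-cloud/storage library as the Cloud Function. The bucket name and file prefix come from the scheduled query example in step 2; the file name list-exports.js is an assumption.

list-exports.js

import { Storage } from '@google-cloud/storage';

const storage = new Storage();

// Bucket and prefix from the scheduled query example in step 2
const [files] = await storage
  .bucket('webhook_example_bucket')
  .getFiles({ prefix: 'example-export' });

for (const file of files) {
  console.log(file.name, file.metadata.timeCreated);
}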

You can also look at the function's response in the function logs before verifying that the file is visible in Funnel.

At this stage, you may be redirected to the configuration view in Funnel, where you'll need to define your fields and set other preferences. If you're not redirected automatically, click on "Finish configuration" in the top right corner of the Data Source.

Provided that all future files are formatted in the same way, you'll only have to do the configuration in Funnel once.
