Google BigQuery: Schedule, Trigger, Monitor, and Orchestrate Operations

Disclaimer

Your use of this download is governed by Stonebranch’s Terms of Use, which are available at https://www.stonebranch.com/integration-hub/Terms-and-Privacy/Terms-of-Use/

Overview

This Universal Task allows Stonebranch users to schedule, trigger, monitor, and orchestrate Google BigQuery operations directly from Universal Controller.

  • This task uses the Python modules google-cloud-bigquery and google-auth to make REST API calls to Google BigQuery.

  • The task uses the GCP Project ID, BigQuery SQL or Schema, Dataset ID, Job ID, Location, Table ID, Cloud Storage URI, and Source File Format as parameters of the selected BigQuery function, and a GCP KeyFile (API KEY) of a service account to authenticate the REST API calls to Google BigQuery (see the sketch following this list).

  • Users can perform the following Google BigQuery operations:

    • BigQuery SQL

    • List dataset

    • List tables in dataset

    • View job information

    • Create a dataset

    • Load local file to a table

    • Load cloud storage data to a table

    • Export table data
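To make the parameter and authentication flow above concrete, here is a minimal sketch (not part of the delivered task) that authenticates with a service account key file and runs a BigQuery SQL query using the google-auth and google-cloud-bigquery modules. The project ID, key file path, and query are placeholder values.

    from google.cloud import bigquery
    from google.oauth2 import service_account

    # Placeholder values; substitute your own GCP project ID and key file path.
    PROJECT_ID = "my-gcp-project"
    KEYFILE_PATH = "/path/to/service-account-key.json"

    # Build credentials from the service account key (google-auth)
    credentials = service_account.Credentials.from_service_account_file(KEYFILE_PATH)

    # Create the BigQuery client (google-cloud-bigquery)
    client = bigquery.Client(project=PROJECT_ID, credentials=credentials)

    # Run a SQL query and print each result row
    query_job = client.query("SELECT 1 AS answer")
    for row in query_job.result():
        print(dict(row))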

Version Information

Template Name: CS Google BigQuery
Version: 1.1.4

Refer to the Changelog section below for version history information.

Software Requirements

This Universal Task requires a Universal Agent and a Python runtime to execute the Universal Task against a Google BigQuery data warehouse.

Software Requirements for Universal Template and Universal Task

  • Requires Python 3.4 or higher. Tested with the Universal Agent bundled Python distribution.

  • Python modules required:

    • google-cloud-bigquery

Software Requirements for Universal Agent

Either:

  • Universal Agent for Windows x64 Version 7.0 and later with Python options installed.

  • Universal Agent for Linux Version 7.0 and later with Python options installed.

Software Requirements for Universal Controller

  • Universal Controller Version 7.0.0.0 and later.

Software Requirements for the Application to be Scheduled

This Universal Task has been tested with the Google BigQuery data warehouse.

Technical Considerations

  • This Universal Task uses the Python modules google-auth and google-cloud-bigquery to make REST API calls to Google BigQuery.

  • Create a service account in your GCP Identity and Access Management (IAM); its key is used to authenticate the task.
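The GCP KeyFile (API KEY) field takes the service account key in JSON format. The sketch below shows one way such key text can be turned into credentials with google-auth; the key content is a truncated placeholder that must be replaced with a real key, and the delivered task may construct its client differently.

    import json

    from google.cloud import bigquery
    from google.oauth2 import service_account

    # Truncated placeholder; paste the full JSON key of your service account here.
    keyfile_json = '{"type": "service_account", "project_id": "my-gcp-project", "private_key": "...", "client_email": "task@my-gcp-project.iam.gserviceaccount.com"}'

    info = json.loads(keyfile_json)
    credentials = service_account.Credentials.from_service_account_info(info)
    client = bigquery.Client(project=info["project_id"], credentials=credentials)
    print("Authenticated against project:", client.project)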

Key Features

  • BigQuery SQL: Runs a BigQuery SQL query and returns the query results.

  • List dataset: Lists all existing datasets in a particular project.

  • List tables in dataset: Lists the tables in a particular dataset.

  • View job information: Retrieves the information of a job for a given job ID.

  • Create a dataset: Creates a dataset within a project.

  • Load local file to a table: Loads a local file into a BigQuery table.

  • Load cloud storage data to a table: Loads a file from Cloud Storage into a BigQuery table.

  • Export table data: Exports a table to a Cloud Storage bucket.
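As an illustration only, the read-oriented and dataset functions above map onto google-cloud-bigquery client calls roughly as follows; the project, dataset, and job identifiers are placeholders, and the actual task may assemble these calls differently.

    from google.cloud import bigquery

    # Assumes credentials are supplied as shown earlier (or via application default credentials).
    client = bigquery.Client(project="my-gcp-project")

    # List dataset: all datasets in the project
    for dataset in client.list_datasets():
        print(dataset.dataset_id)

    # List tables in dataset
    for table in client.list_tables("my_dataset"):
        print(table.table_id)

    # View job information for a given job ID and location
    job = client.get_job("my_job_id", location="US")
    print(job.job_type, job.state, job.created)

    # Create a dataset in a given location
    dataset = bigquery.Dataset("my-gcp-project.my_new_dataset")
    dataset.location = "US"
    client.create_dataset(dataset, exists_ok=True)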

Import Google BigQuery Integration Downloadable Universal Template

To use this downloadable Universal Template, you must first perform the following steps:

  1. This Universal Task requires the Resolvable Credentials feature. Check that the Resolvable Credentials Permitted system property has been set to true.
  2. To import the Universal Template into your Controller, follow the instructions here.
  3. When the files have been imported successfully, refresh the Universal Templates list; the Universal Template will appear on the list.

Configure Google BigQuery Integration Universal Task

For the new Universal Task type, create a new task, and enter the task-specific details that were created in the Universal Template.

Field Descriptions for Google BigQuery Integration Universal Task

  • GCP Project ID: Provide the Google Cloud Platform (GCP) project ID.

  • GCP KeyFile (API KEY) Service account: Provide the Google Cloud Platform service account authentication key in JSON format.

  • Select a BigQuery Function: Select the function to execute in GCP.

  • BigQuery SQL or Schema: Provide the BigQuery SQL, or the schema where applicable for the functions that load data from cloud or local storage.

  • Dataset ID: Provide the GCP dataset ID (name of the dataset); must be alphanumeric.

  • Job ID: Provide the BigQuery job ID.

  • Location: Provide the GCP BigQuery location.

  • Table ID: Provide the table ID (name of the table into which the file is loaded).

  • Cloud Storage URI: Provide the Cloud Storage URI.

  • Source File Format: Provide the source file format for the data load into the table.

  • Local File Path: Provide the local file path for the data load into a table.

  • Header Row to Skip: Provide an integer indicating the number of header rows in the source data to skip.

  • Schema Auto Detect: Option to enable schema auto-detection when loading data; applicable only to JSON and CSV formats.
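The load-related fields (Cloud Storage URI, Table ID, Location, Source File Format, Header Row to Skip, Schema Auto Detect) correspond to standard BigQuery load job options. The sketch below shows a Cloud Storage load using these options with placeholder identifiers; the actual task may assemble the load job differently.

    from google.cloud import bigquery

    client = bigquery.Client(project="my-gcp-project")

    # Assumed field mapping (illustrative only):
    #   Source File Format -> source_format
    #   Header Row to Skip -> skip_leading_rows
    #   Schema Auto Detect -> autodetect
    job_config = bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.CSV,
        skip_leading_rows=1,
        autodetect=True,
    )

    load_job = client.load_table_from_uri(
        "gs://my-bucket/data/input.csv",         # Cloud Storage URI
        "my-gcp-project.my_dataset.my_table",    # fully qualified Table ID
        location="US",                           # Location
        job_config=job_config,
    )
    load_job.result()  # wait for the load job to finish
    print("Loaded rows:", load_job.output_rows)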

Examples for Google BigQuery Integration Universal Tasks

Examples are provided for the following functions:

  • BigQuery SQL

  • List dataset

  • List tables in dataset

  • View job information

  • Create a dataset

  • Load local file to a table

  • Load cloud storage data to a table

  • Export table data
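For reference, the local file load and table export functions correspond to google-cloud-bigquery calls along the following lines; the file path, table, and bucket names are placeholders, and the delivered task wraps equivalent functionality.

    from google.cloud import bigquery

    client = bigquery.Client(project="my-gcp-project")

    # Load local file to a table (newline-delimited JSON in this example)
    job_config = bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.NEWLINE_DELIMITED_JSON,
        autodetect=True,
    )
    with open("/tmp/rows.json", "rb") as source_file:
        load_job = client.load_table_from_file(
            source_file,
            "my-gcp-project.my_dataset.my_table",
            job_config=job_config,
        )
    load_job.result()

    # Export table data to a Cloud Storage bucket (CSV by default)
    extract_job = client.extract_table(
        "my-gcp-project.my_dataset.my_table",
        "gs://my-bucket/exports/my_table-*.csv",
        location="US",
    )
    extract_job.result()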


Document References

This document references the following documents:

  • Universal Templates: User documentation for creating Universal Templates in the Universal Controller user interface.

  • Universal Tasks: User documentation for creating Universal Tasks in the Universal Controller user interface.


Changelog

ut-cs-google-bigquery 1.1.4 (2024-08-08)

  • Fix: Usage of double quotes for the _scriptPath

ut-cs-google-bigquery 1.1.3 (2022-03-25)

  • Fix: Icon change

ut-cs-google-bigquery 1.0.0 (2020-11-17)

  • Initial Version