BigQuery
source: documentation
author: supabase
tags:
  - native
  - official
BigQuery is a completely serverless and cost-effective enterprise data warehouse that works across clouds and scales with your data, with BI, machine learning and AI built in.
The BigQuery Wrapper allows you to read and write data from BigQuery within your Postgres database.
Restoring a logical backup of a database with a materialized view using a foreign table can fail. For this reason, either do not use foreign tables in materialized views or use them in databases with physical backups enabled.
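As a sketch of the pattern this warning refers to, assuming a BigQuery foreign table named `people` like the one defined in the examples further down this page:

```sql
-- A materialized view built on top of a foreign table: restoring a logical
-- backup of a database that contains this combination can fail.
create materialized view people_mv as
select id, name, ts
from people;
```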
Supported Data Types
| Postgres Type    | BigQuery Type |
| ---------------- | ------------- |
| boolean          | BOOL          |
| bigint           | INT64         |
| double precision | FLOAT64       |
| numeric          | NUMERIC       |
| text             | STRING        |
| varchar          | STRING        |
| date             | DATE          |
| timestamp        | DATETIME      |
| timestamp        | TIMESTAMP     |
| timestamptz      | TIMESTAMP     |
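As an illustrative sketch of these mappings, a foreign table might declare one column per Postgres type listed above. The BigQuery table name `type_demo` and its columns are hypothetical, and `bigquery_server` is the server created in the sections below:

```sql
-- Hypothetical foreign table whose column types exercise the mappings above.
create foreign table type_demo (
  is_active  boolean,          -- BOOL
  id         bigint,           -- INT64
  score      double precision, -- FLOAT64
  price      numeric,          -- NUMERIC
  name       text,             -- STRING
  birthday   date,             -- DATE
  created_at timestamp,        -- DATETIME or TIMESTAMP
  updated_at timestamptz       -- TIMESTAMP
)
  server bigquery_server
  options (
    table 'type_demo'
  );
```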
Preparation
Before you get started, make sure the `wrappers` extension is installed on your database:
```sql
create extension if not exists wrappers with schema extensions;
```
and then create the foreign data wrapper:
```sql
create foreign data wrapper bigquery_wrapper
  handler big_query_fdw_handler
  validator big_query_fdw_validator;
```
Secure your credentials (optional)
By default, Postgres stores FDW credentials inside `pg_catalog.pg_foreign_server` in plain text. Anyone with access to this table will be able to view these credentials. Wrappers is designed to work with Vault, which provides an additional level of security for storing credentials. We recommend using Vault to store your credentials.
```sql
-- Save your BigQuery service account json in Vault and retrieve the `key_id`
insert into vault.secrets (name, secret)
values (
  'bigquery',
  '
  {
    "type": "service_account",
    "project_id": "your_gcp_project_id",
    "private_key_id": "your_private_key_id",
    "private_key": "-----BEGIN PRIVATE KEY-----\n...\n-----END PRIVATE KEY-----\n",
    ...
  }
  '
)
returning key_id;
```
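If you need to look the Key ID up again later, a query along these lines should work, assuming your Vault version exposes a `key_id` column on `vault.secrets` as the `returning key_id` clause above implies:

```sql
-- Hypothetical lookup of the stored secret's key_id;
-- the exact column names depend on your Vault version.
select key_id
from vault.secrets
where name = 'bigquery';
```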
Connecting to BigQuery
We need to provide Postgres with the credentials to connect to BigQuery, and any additional options. We can do this using the `create server` command:
```sql
create server bigquery_server
  foreign data wrapper bigquery_wrapper
  options (
    sa_key_id '<key_ID>', -- The Key ID from above.
    project_id 'your_gcp_project_id',
    dataset_id 'your_gcp_dataset_id'
  );
```
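If you prefer not to use Vault, the wrapper also accepts the service account key inline through an `sa_key` server option. Treat the snippet below as a sketch and check the Wrappers documentation for your version; storing the key in plain text this way is less secure than Vault:

```sql
-- Sketch: passing the service account key directly instead of a Vault key_id.
create server bigquery_server
  foreign data wrapper bigquery_wrapper
  options (
    sa_key '
    {
      "type": "service_account",
      "project_id": "your_gcp_project_id",
      ...
    }
    ',
    project_id 'your_gcp_project_id',
    dataset_id 'your_gcp_dataset_id'
  );
```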
Creating Foreign Tables
The BigQuery Wrapper supports data reads and writes from BigQuery.
| Integration | Select | Insert | Update | Delete | Truncate |
| ----------- | ------ | ------ | ------ | ------ | -------- |
| BigQuery    | ✅     | ✅     | ✅     | ✅     | ❌       |
For example:
```sql
create foreign table my_bigquery_table (
  id bigint,
  name text,
  ts timestamp
)
  server bigquery_server
  options (
    table 'people',
    location 'EU'
  );
```
Foreign table options
The full list of foreign table options is below; a combined example follows the list.
- `table` - Source table or view name in BigQuery, required. This can also be a subquery enclosed in parentheses, for example:

  ```sql
  table '(select * except(props), to_json_string(props) as props from `my_project.my_dataset.my_table`)'
  ```

  Note: when using a subquery in this option, the fully qualified table name must be used.

- `location` - Source table location, optional. Default is 'US'.

- `timeout` - Query request timeout in milliseconds, optional. Default is '30000' (30 seconds).

- `rowid_column` - Primary key column name, optional for data scan, required for data modify.
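As a combined sketch of the options above, the foreign table below reads from a parenthesized, fully qualified subquery, pins the location and raises the timeout. The table name `people_upper` is hypothetical; the source is the `people` table from the examples further down:

```sql
-- Hypothetical foreign table combining the `table` subquery form,
-- `location` and `timeout` options.
create foreign table people_upper (
  id bigint,
  name text
)
  server bigquery_server
  options (
    table '(select id, upper(name) as name from `your_gcp_project_id.your_gcp_dataset_id.people`)',
    location 'EU',
    timeout '60000'
  );
```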
Query Pushdown Support
This FDW supports `where`, `order by` and `limit` clause pushdown.
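For example, in a query like the one below against the `people` foreign table from the examples further down, the `where`, `order by` and `limit` clauses can be pushed down and evaluated by BigQuery rather than by Postgres:

```sql
-- where, order by and limit are pushed down to BigQuery,
-- so only the matching rows are returned to Postgres.
select id, name
from people
where id > 1
order by id
limit 10;
```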
Inserting Rows & the Streaming Buffer
This foreign data wrapper uses BigQuery's `insertAll` API method to create a `streamingBuffer` with an associated partition time. Within that partition time, the data cannot be updated, deleted, or fully exported. Only after that time has elapsed (up to 90 minutes, according to BigQuery's documentation) can you perform those operations.
If you attempt an `UPDATE` or `DELETE` statement on rows while they are in the streaming buffer, you will get an error of the form `UPDATE or DELETE statement over table datasetName.tableName would affect rows in the streaming buffer, which is not supported`.
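As a sketch of the problem, the sequence below (using the `people` foreign table from the examples further down) inserts a row and then immediately tries to update it; the update may fail with the error above until the streaming buffer has been flushed:

```sql
insert into people (id, name, ts)
values (5, 'Obi-Wan Kenobi', '2023-01-01 12:34:56');

-- May fail while row 5 is still in the streaming buffer (up to ~90 minutes);
-- retry after the buffer has been flushed.
update people
set name = 'Ben Kenobi'
where id = 5;
```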
Examples
Some examples of how to use BigQuery foreign tables.
Let's prepare the source table in BigQuery first:
```sql
-- Run below SQLs on BigQuery to create source table
create table your_project_id.your_dataset_id.people (
  id int64,
  name string,
  ts timestamp
);

-- Add some test data
insert into your_project_id.your_dataset_id.people values
  (1, 'Luke Skywalker', current_timestamp()),
  (2, 'Leia Organa', current_timestamp()),
  (3, 'Han Solo', current_timestamp());
```
Basic example
This example will create a "foreign table" inside your Postgres database called `people` and query its data:
```sql
create foreign table people (
  id bigint,
  name text,
  ts timestamp
)
  server bigquery_server
  options (
    table 'people',
    location 'EU'
  );

select * from people;
```
Data modify example
This example will modify data in a "foreign table" inside your Postgres database called `people`. Note that the `rowid_column` option is mandatory:
```sql
create foreign table people (
  id bigint,
  name text,
  ts timestamp
)
  server bigquery_server
  options (
    table 'people',
    location 'EU',
    rowid_column 'id'
  );

-- insert new data
insert into people(id, name, ts)
values (4, 'Yoda', '2023-01-01 12:34:56');

-- update existing data
update people
set name = 'Anakin Skywalker'
where id = 1;

-- delete data
delete from people
where id = 2;
```