Airtable to BigQuery

Posted: admin on 1/26/2022

Use CData Sync for automated, continuous, customizable Airtable replication to Google BigQuery, consolidating all of your data in a single location for archiving, reporting, and analytics. Before the CData Sync walkthrough, this page also covers airtable-export, a command-line tool for exporting Airtable data to files on disk.

airtable-export

Export Airtable data to files on disk


Installation

Install this tool using pip:
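    pip install airtable-export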

Usage

You will need to know the following information:

  • Your Airtable base ID - this is a string starting with app...
  • Your Airtable API key - this is a string starting with key...
  • The names of each of the tables that you wish to export

You can export all of your data to a folder called export/ by running the following:
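For example (the base ID, API key, and table names below are placeholders; the arguments are the export folder, then the base ID, then the table names):

    airtable-export export appXXXXXXXXXXXXXX table1 table2 --key=keyXXXXXXXXXXXXXX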

This example would create two files: export/table1.yml and export/table2.yml.

Rather than passing the API key using the --key option, you can set it as an environment variable called AIRTABLE_KEY.
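For example:

    export AIRTABLE_KEY=keyXXXXXXXXXXXXXX
    airtable-export export appXXXXXXXXXXXXXX table1 table2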

Export options

By default the tool exports your data as YAML.


You can also export as JSON or as newline delimited JSON using the --json or --ndjson options:
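For example, to export newline-delimited JSON:

    airtable-export export appXXXXXXXXXXXXXX table1 table2 --ndjson --key=keyXXXXXXXXXXXXXX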

You can pass multiple format options at once. This command will create a .json, .yml and .ndjson file for each exported table:
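    airtable-export export appXXXXXXXXXXXXXX table1 table2 --yaml --json --ndjson --key=keyXXXXXXXXXXXXXX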

SQLite database export

You can export tables to a SQLite database file using the --sqlite database.db option:
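    airtable-export export appXXXXXXXXXXXXXX table1 table2 --sqlite database.db --key=keyXXXXXXXXXXXXXX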

This can be combined with other format options. If you only specify --sqlite, the export directory argument will be ignored.

The SQLite database will have a table created for each table you export. Those tables will have a primary key column called airtable_id.

If you run this command against an existing SQLite database, records with matching primary keys will be overwritten by new records from the export.

Request options

By default the tool uses python-httpx's default configurations.

You can override the user-agent using the --user-agent option:
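    airtable-export export appXXXXXXXXXXXXXX table1 --user-agent "some user agent" --key=keyXXXXXXXXXXXXXX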

You can override the timeout during a network read operation using the --http-read-timeout option. If not set, this defaults to 5s.
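For example, to allow ten seconds for each read:

    airtable-export export appXXXXXXXXXXXXXX table1 --http-read-timeout 10 --key=keyXXXXXXXXXXXXXX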

Running this using GitHub Actions

GitHub Actions is GitHub's workflow automation product. You can use it to run airtable-export on a schedule in order to back up your Airtable data to a GitHub repository, giving you a visible commit history of the changes made to that data.

To run this for your own Airtable database you'll first need to add the following secrets to your GitHub repository:

AIRTABLE_BASE_ID
The base ID, a string beginning `app...`
AIRTABLE_KEY
Your Airtable API key
AIRTABLE_TABLES
A space separated list of the Airtable tables that you want to back up. If any of these contain spaces you will need to enclose them in single quotes, e.g. 'My table with spaces in the name' OtherTableWithNoSpaces

Once you have set those secrets, add the following as a file called .github/workflows/backup-airtable.yml:
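A sketch of such a workflow (the backups/ directory name, action versions, and commit details are illustrative, not prescribed by the tool):

    name: Backup Airtable

    on:
      workflow_dispatch:
      schedule:
      # 32 minutes past midnight UTC, matching the schedule described below
      - cron: '32 0 * * *'

    jobs:
      backup:
        runs-on: ubuntu-latest
        steps:
        - name: Check out repo
          uses: actions/checkout@v2
        - name: Set up Python
          uses: actions/setup-python@v2
          with:
            python-version: '3.9'
        - name: Install airtable-export
          run: pip install airtable-export
        - name: Back up Airtable to backups/
          env:
            # The three repository secrets configured above
            AIRTABLE_BASE_ID: ${{ secrets.AIRTABLE_BASE_ID }}
            AIRTABLE_KEY: ${{ secrets.AIRTABLE_KEY }}
            AIRTABLE_TABLES: ${{ secrets.AIRTABLE_TABLES }}
          # AIRTABLE_KEY is read from the environment, so --key is not needed
          run: airtable-export backups $AIRTABLE_BASE_ID $AIRTABLE_TABLES
        - name: Commit and push if anything changed
          run: |
            git config user.name "Automated"
            git config user.email "actions@users.noreply.github.com"
            git add -A
            git commit -m "Latest Airtable data" || exit 0
            git push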

This will run once a day (at 32 minutes past midnight UTC) and will also run if you manually click the 'Run workflow' button; see GitHub Actions: Manual triggers with workflow_dispatch.

Development

To contribute to this tool, first check out the code. Then create a new virtual environment:
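    cd airtable-export
    python -m venv venv
    source venv/bin/activate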

Or if you are using pipenv:
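    pipenv shell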

Now install the dependencies and test dependencies:
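A sketch, assuming the project declares its test dependencies as a test extra:

    pip install -e '.[test]'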

To run the tests:
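    pytest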

Release history

0.7.1

0.7

0.6

0.5

0.4

0.3.1

0.3

0.2

Download files

Files for airtable-export, version 0.7.1:

  • airtable_export-0.7.1-py3-none-any.whl (9.8 kB), wheel, Python version py3
  • airtable-export-0.7.1.tar.gz (5.7 kB), source distribution


Hashes for airtable_export-0.7.1-py3-none-any.whl

  • SHA256: 803f2578c6c689ab07c758d7d4599f77e28630037bc7318471ba688565c347db
  • MD5: 5bc5e1cb712e9c1c6aa364cf4d689711
  • BLAKE2-256: 945e014144e1f70bd7e6d72c179b3fadb3f5e2bda10393eb175bc3e54b036e5d

Hashes for airtable-export-0.7.1.tar.gz

  • SHA256: 1dd3e6434d97c86eac9bd1c95b33ee8c29b1f58c1f1684a4f9ca541533b9c4c1
  • MD5: 3d692601e7abc046231eee107de6b562
  • BLAKE2-256: d44f897f00a5cc50baccd793027554b5e5b094109e8e69167da22a13316fa34a

Airtable to Google BigQuery with CData Sync

Always-on applications rely on automatic failover capabilities and real-time data access. CData Sync integrates live Airtable data into your Google BigQuery instance, allowing you to consolidate all of your data into a single location for archiving, reporting, analytics, machine learning, artificial intelligence and more.

Configure Google BigQuery as a Replication Destination

Using CData Sync, you can replicate Airtable data to Google BigQuery. To add a replication destination, navigate to the Connections tab.

  1. Click Add Connection.
  2. Select Google BigQuery as a destination.
  3. Enter the necessary connection properties. To connect to Google BigQuery, use OAuth authentication:

    Authenticate with a User Account

    In this OAuth flow, you can connect without setting any connection properties for your user credentials.

    1. Click Connect, and CData Sync opens the Google BigQuery OAuth endpoint.
    2. Log in and grant permissions to CData Sync.
    3. CData Sync then completes the OAuth process.

    Authenticate with a Service Account

    Service accounts authenticate silently, without user interaction in the browser. You can also use a service account to delegate enterprise-wide access scopes to CData Sync.

    You need to create an OAuth application in this flow. See Creating a Custom OAuth App in the Getting Started section to create and authorize an app. You can then connect to Google BigQuery data that the service account has permission to access.

    After setting the following connection properties, you are ready to connect (a filled-in sketch follows these steps):

    • OAuthClientId: Set this to the Client ID in your app settings.
    • OAuthClientSecret: Set this to the Client Secret in your app settings.
    • OAuthJWTCertType: Set this to 'PEMKEY_FILE'.
    • OAuthJWTCert: Set this to the path to the .pem file you generated.
    • OAuthJWTCertPassword: Set this to the password of the .pem file.
    • OAuthJWTCertSubject: Set this to '*' to pick the first certificate in the certificate store.
    • OAuthJWTSubject: Set this to the email address of the user for whom the application is requesting delegate access. Note that delegate access must be granted by an administrator.
    • DatasetId: Set this to the ID of the dataset you want to connect to.
    • ProjectId: Set this to the ID of the project you want to connect to.
    When you connect, CData Sync completes the OAuth flow for a service account.
  4. Click Test Connection to ensure that the connection is configured properly.
  5. Click Save Changes.
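For illustration only, the service-account properties from step 3 might look like this once filled in (every value below is a placeholder):

    OAuthClientId=your-client-id.apps.googleusercontent.com
    OAuthClientSecret=your-client-secret
    OAuthJWTCertType=PEMKEY_FILE
    OAuthJWTCert=/path/to/service-account.pem
    OAuthJWTCertPassword=your-pem-password
    OAuthJWTCertSubject=*
    OAuthJWTSubject=admin@yourdomain.com
    DatasetId=your_dataset_id
    ProjectId=your-project-id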

Configure the Airtable Connection

Next, configure the Airtable side. To add a connection to your Airtable account, navigate to the Connections tab.

  1. Click Add Connection.
  2. Select a source (Airtable).
  3. Configure the connection properties.

    The APIKey, BaseId, and TableNames parameters are required to connect to Airtable. ViewNames is an optional parameter for specifying particular views of the tables. A filled-in sketch follows these steps.

    • APIKey : The API key of your account. To obtain this value, log in, go to Account, and in the API section click Generate API key.
    • BaseId : The ID of your base. It appears in the same section as the APIKey: click Airtable API, or navigate to https://airtable.com/api and select a base; the introduction section states 'The ID of this base is appxxN2ftedc0nEG7.'
    • TableNames : A comma separated list of table names for the selected base. These are the same names of tables as found in the UI.
    • ViewNames : A comma separated list of views in the format of (table.view) names. These are the same names of the views as found in the UI.
  4. Click Connect to ensure that the connection is configured properly.
  5. Click Save Changes.
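For illustration, the Airtable connection properties might be filled in as follows (the base ID is the example quoted from the Airtable docs above; the table and view names are placeholders):

    APIKey=keyXXXXXXXXXXXXXX
    BaseId=appxxN2ftedc0nEG7
    TableNames=Products,Orders
    ViewNames=Products.Grid view,Orders.Main view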

Configure Replication Queries

CData Sync enables you to control replication with a point-and-click interface and with SQL queries. For each replication you wish to configure, navigate to the Jobs tab and click Add Job. Select the Source and Destination for your replication.

Replicate Entire Tables

To replicate an entire table, click Add Tables in the Tables section, choose the table(s) you wish to replicate, and click Add Selected Tables.
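In SQL terms, replicating an entire table corresponds to a one-line replication query. The statement below is a sketch in CData Sync's replication query syntax (the table name is a placeholder, and the exact syntax may vary between CData Sync versions):

    REPLICATE [Products];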

Customize Your Replication

You can use the Transform feature to customize your replication. The Transform feature allows you to specify which columns to replicate, rename the columns at the destination, and even perform operations on the source data before replicating. To customize your replication, click the Transform button in the Tables section and customize the replication.

Schedule Your Replication

In the Schedule section, you can schedule a job to run automatically, configuring the job to run after specified intervals ranging from once every 15 minutes to once every month.

Once you have configured the replication job, click Save Changes. You can configure any number of jobs to manage the replication of your Airtable data to Google BigQuery.