September 16, 2024

How to Enable the BigQuery Data Transfer Service?

Learn how to enable the BigQuery Data Transfer Service, update your data transfer to use a service account, create a transfer job, schedule a backfill, and understand when the service is a good fit.
Dexter Chu
Head of Marketing

How to Enable the BigQuery Data Transfer Service?

To enable the BigQuery Data Transfer Service, you need the Owner role on your project. Navigate to the BigQuery Data Transfer API page in the API Library, select your project from the dropdown menu, and click ENABLE. The API can also be enabled from the command line, as sketched after the list below.

  • The BigQuery Data Transfer Service is a tool that allows for automated data movement from SaaS applications to Google BigQuery.
  • Enabling the service is the first step towards utilizing this tool, and it requires the user to have the Owner role for the project.
  • Once enabled, the service can be configured to transfer data from various sources into BigQuery for analysis.
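
If you prefer the command line, the same step can be done with the gcloud CLI. This is a minimal sketch; my-project-id is a placeholder for your own project ID.

# Enable the BigQuery Data Transfer API with the gcloud CLI
# (my-project-id is a placeholder for your own project ID)
gcloud services enable bigquerydatatransfer.googleapis.com \
--project=my-project-id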

How to Update Your Data Transfer to Use the Service Account in BigQuery?

To update your data transfer to use the service account in BigQuery, open the BigQuery Data Transfer Service and click the Manage transfers tab. Select the data transfer you want to update, click Edit, select the Service account option, and choose the service account that you added to Google Ads. Don't forget to click Save. The same change can be made from the command line, as sketched after the list below.

  • Updating the data transfer to use the service account ensures that the transfer is executed with the permissions of the service account.
  • This is a crucial step when setting up automated data transfers, as it allows for seamless data movement without manual intervention.
  • The service account should be added to Google Ads in a previous step before it can be selected for the data transfer.
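
For the command-line route, here is a hedged sketch using bq update; the service account address and the transfer config resource name are placeholders, and flag availability may vary with your bq version.

# Point an existing transfer config at a service account
# (the service account and config resource name are placeholders)
bq update --transfer_config \
--service_account_name=my-sa@my-project-id.iam.gserviceaccount.com \
projects/my-project-id/locations/us/transferConfigs/my-config-id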

How to Create a BigQuery Data Transfer Service Job?

To create a BigQuery Data Transfer Service job, use the bq mk --transfer_config command and specify the data source and the destination BigQuery dataset. The data source can be a SaaS connector such as Google Ads, as in the example below, or Cloud Storage, in which case you would point the transfer at a bucket and a CSV file.


# Example command: create a transfer job that pulls Google Ads data
# into the BigQuery dataset "mydataset". customer_id is the Google Ads
# account ID; refresh_window_days controls how far back each run refreshes.
bq mk --transfer_config \
--target_dataset=mydataset \
--display_name='My Transfer Job' \
--data_source=google_ads \
--params='{"customer_id":"1234567890","refresh_window_days":"30"}'

  • Creating a BigQuery Data Transfer Service job involves specifying the data source and destination in the command.
  • The data source can be a SaaS connector such as Google Ads or a Cloud Storage bucket and CSV file, while the destination is a BigQuery dataset.
  • Once the job is created, it can be scheduled to run at specific intervals (see the sketch after this list), ensuring that your BigQuery dataset is always up-to-date with the latest data.
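
Scheduling is set on the transfer config itself. As a hedged sketch, bq mk --transfer_config accepts a --schedule flag; the interval below is an example value you would adjust to your own needs.

# Example: create the same transfer with a daily schedule
# (the interval is an example value; adjust it to your needs)
bq mk --transfer_config \
--target_dataset=mydataset \
--display_name='My Daily Transfer Job' \
--data_source=google_ads \
--schedule='every 24 hours' \
--params='{"customer_id":"1234567890","refresh_window_days":"30"}'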

How to Schedule a Backfill in BigQuery Data Transfer Service?

If you're missing any data, you can schedule a backfill in the BigQuery Data Transfer Service. A backfill re-runs the transfer for a past date range, filling gaps by moving the missing data from your source into BigQuery. A command-line sketch follows the list below.

  • Scheduling a backfill is a useful feature of the BigQuery Data Transfer Service when there are gaps in the data.
  • Backfilling ensures that your BigQuery dataset is complete and accurate, which is crucial for data analysis and decision-making.
  • A backfill is requested for a specific date range, so you can target exactly the window where data is missing.
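
Here is a minimal command-line sketch using bq mk --transfer_run; the time window and the transfer config resource name are placeholders you would replace with your own values.

# Schedule a backfill over a past date range
# (the time window and config resource name are placeholders)
bq mk --transfer_run \
--start_time='2024-09-01T00:00:00Z' \
--end_time='2024-09-10T00:00:00Z' \
projects/my-project-id/locations/us/transferConfigs/my-config-id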

When does it make sense to use the BigQuery Data Transfer Service?

The BigQuery Data Transfer Service is best used when you need to automate the movement of data from SaaS applications to Google BigQuery. It's especially useful when you have large volumes of data that need to be regularly transferred and analyzed.

  • The BigQuery Data Transfer Service is a powerful tool for automating data transfers, reducing manual effort and potential errors.
  • It is particularly beneficial for businesses that rely heavily on data analysis and need to ensure their data is always up-to-date and readily available in BigQuery.
  • The service supports a wide range of data sources, making it a versatile solution for many different data transfer needs.
