The BigQuery extractor loads data from BigQuery and brings it into Keboola. Running the extractor creates a background job that executes the configured queries in BigQuery and stores their results in Keboola Storage.
Note: Using the Google BigQuery extractor is also described in our Getting Started Tutorial.
To access and extract data from your BigQuery dataset, you need to set up a Google service account. Go
to Google Cloud Platform Console > IAM & admin > Service accounts
and select the project you want the extractor to have access to. Click Create Service Account
and enter a Service account name (e.g.,
Keboola BigQuery extractor).
Then add the roles
BigQuery Data Editor,
BigQuery Job User and
Storage Object Admin.
Finally, click + Create Key to create a new JSON key, and then click Create to download it to your computer.
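Before pasting the key into the extractor, it can be useful to confirm the downloaded file is a well-formed service account key. The sketch below is illustrative only (the field names are the standard ones Google puts in every service account JSON key; the sample values and the `validate_key` helper are hypothetical, not part of the extractor):

```python
import json

# Fields present in every Google service account JSON key;
# the values used below are dummies for illustration only.
REQUIRED_FIELDS = {"type", "project_id", "private_key", "client_email"}

def validate_key(raw: str) -> dict:
    """Parse a service account key and check the fields the extractor relies on."""
    key = json.loads(raw)
    missing = REQUIRED_FIELDS - key.keys()
    if missing:
        raise ValueError(f"key is missing fields: {sorted(missing)}")
    if key["type"] != "service_account":
        raise ValueError("not a service account key")
    return key

sample = json.dumps({
    "type": "service_account",
    "project_id": "my-project",
    "private_key": "-----BEGIN PRIVATE KEY-----\n...dummy...",
    "client_email": "keboola-bq-extractor@my-project.iam.gserviceaccount.com",
})
print(validate_key(sample)["client_email"])
```

If the check fails, re-download the key from the service account's Keys tab rather than editing the file by hand.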
The extractor uses a Google Cloud Storage bucket as temporary storage for off-loading data from BigQuery. Go to the Google Cloud Platform Console > Storage > Cloud Storage > Browser and click Create Bucket. Name the bucket and select its location (it must be the same as the location of your dataset).
Do not set a retention policy on the bucket; it holds only temporary data, so no retention is needed.
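The location requirement above is strict: BigQuery cannot export a dataset into a bucket in a different location. A minimal sketch of that check (the `locations_match` helper is hypothetical; the comparison reflects how the console shows multi-region locations such as "US"/"EU" in upper case and regional ones such as "europe-west1" in lower case):

```python
def locations_match(bucket_location: str, dataset_location: str) -> bool:
    """Compare a bucket location with a dataset location case-insensitively."""
    return bucket_location.strip().lower() == dataset_location.strip().lower()

print(locations_match("US", "us"))            # same multi-region
print(locations_match("EU", "europe-west1"))  # different locations
```

If the locations differ, create a new bucket in the dataset's location; a bucket's location cannot be changed after creation.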
Create a new configuration of the BigQuery extractor. Click Set Service Account Key.
Open the downloaded key in a text editor, copy and paste it into the input field, and click Submit.
Click Save to store the credentials.
Important: The private key is stored in an encrypted form and only the non-sensitive parts are visible in the UI for your verification. The key can be deleted or replaced by a new one at any time.
In the Unload Configuration section, enter the name of the bucket you created earlier as the Cloud Storage Bucket Name, and select the correct Dataset Location. Click Save.
Start by clicking the Add Query button.
Name the query and click Create.
To learn how to modify your configuration, go to the SQL databases section.