This data destination connector sends data to a Google BigQuery dataset.
To access and write to your BigQuery dataset, you need to set up a Google Service Account.
To create one, follow the same steps as for the Keboola BigQuery data source. Then add the BigQuery Data Editor and BigQuery Job User roles.
Finally, create a new JSON key (click + Create key) and download it to your computer (click Create).
You can now close the Google Cloud Platform Console and go back to configuring the connector.
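The downloaded key is a plain JSON file. Before pasting it into the connector, you can sanity-check that it really is a service account key with a short script like the sketch below (the file name `service-account-key.json` is a placeholder for wherever your browser saved the key):

```python
import json

# Placeholder path; use the location where the key was downloaded.
KEY_PATH = "service-account-key.json"

with open(KEY_PATH) as f:
    key = json.load(f)

# Every Google service account JSON key contains these fields.
required = {"type", "project_id", "private_key", "client_email"}
missing = required - key.keys()

if key.get("type") != "service_account" or missing:
    raise SystemExit(f"Not a service account key; missing fields: {sorted(missing)}")

print(f"Key for {key['client_email']} (project {key['project_id']}) looks OK.")
```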
Create a new configuration of the BigQuery connector. Click the Set Service Account Key button. Open the downloaded key in a text editor, copy and paste it into the input field, click Submit, and then Save.
Important: The private key is stored in an encrypted form and only the non-sensitive parts are visible in the UI for your verification. The key can be deleted or replaced by a new one at any time. Don’t forget to Save the credentials.
There is one more step before you can start adding tables: specify the Google BigQuery Dataset and Save the configuration.
All tables in this configuration will be written to this dataset. If the dataset does not exist, the roles assigned to the Google Service Account will allow the connector to create it.
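If you want to verify outside of Keboola that the service account has the permissions the connector relies on, a minimal sketch using the official google-cloud-bigquery Python client could look like this (the key path and dataset name are placeholders):

```python
from google.cloud import bigquery
from google.oauth2 import service_account

KEY_PATH = "service-account-key.json"   # placeholder: the downloaded key
DATASET_ID = "my_keboola_dataset"       # placeholder: the configured dataset

credentials = service_account.Credentials.from_service_account_file(KEY_PATH)
client = bigquery.Client(credentials=credentials, project=credentials.project_id)

# The BigQuery Data Editor role allows creating the dataset if it does
# not exist; exists_ok=True makes this a no-op when it already does.
dataset = client.create_dataset(DATASET_ID, exists_ok=True)
print(f"Dataset ready: {dataset.full_dataset_id}")
```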
To add a new table to the connector, click Add Table and select the table. The table name is used as the destination table name in BigQuery and can be modified.
Configured tables are stored as configuration rows.
You can specify the table name in BigQuery and set the load type to Full Load or Incremental.
Note: The Incremental load type does not use a primary key to update existing records; new records are always appended to the table.
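The connector's internals are not shown here, but the two load types map naturally onto BigQuery's write dispositions. The sketch below (table, dataset, and file names are placeholders) illustrates the semantics: a full load replaces the table's contents on every run, while an incremental load appends rows without matching them against existing records:

```python
from google.cloud import bigquery

client = bigquery.Client()  # assumes credentials set up in the environment
TABLE_ID = "my-project.my_keboola_dataset.my_table"  # placeholder

# Full Load: replace the destination table's contents on every run.
full_load = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,
    autodetect=True,
    write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
)

# Incremental: append rows; nothing is updated by primary key, so
# re-sent records end up as duplicates in the table.
incremental = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,
    autodetect=True,
    write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
)

with open("export.csv", "rb") as f:  # placeholder source file
    job = client.load_table_from_file(f, TABLE_ID, job_config=incremental)
job.result()  # wait for the load to finish; swap in full_load to truncate
```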
You can rename the destination column in BigQuery and specify the data type to use. The little eye icon on the right shows a preview of the values, so you don’t have to guess the data type.
Note: You have to define a data type for at least one column for the configuration to work.
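In BigQuery terms, picking column data types corresponds to supplying an explicit schema for the load instead of relying on autodetection. A minimal sketch, with placeholder column names and types:

```python
from google.cloud import bigquery

# Placeholder columns; in the connector you must type at least one of them.
schema = [
    bigquery.SchemaField("id", "INTEGER"),
    bigquery.SchemaField("name", "STRING"),
    bigquery.SchemaField("created_at", "TIMESTAMP"),
]

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,
    schema=schema,
)
```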