This data destination connector sends data to a PostgreSQL database.
Create a new configuration of the PostgreSQL data destination connector.
The first step is to Set Up Database Credentials. You need to provide a host name, user name, password, database name, and schema.
We highly recommend that you create dedicated credentials for the connector in your database. You can use the following SQL code to get started:
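The referenced SQL snippet appears to be missing here; a minimal sketch of dedicated credentials might look like the following, where the user, password, and schema names are placeholders you should replace with your own:

```sql
-- Create a dedicated user for the connector (placeholder names).
CREATE USER writer_user WITH PASSWORD 'choose-a-strong-password';

-- Give the user its own schema so it cannot touch other data.
CREATE SCHEMA writer_schema AUTHORIZATION writer_user;
```

Owning the schema via `AUTHORIZATION` gives the connector user full rights within it while keeping the rest of the database off-limits.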
It is also possible to secure the connection using an SSH tunnel.
The next step is to configure the tables you want to write. Click Add New Table and select an existing table from Storage.
The next step is to specify table configuration. Use the preview icon to peek at the column contents.
For each column, you can specify its data type. Setting the type to IGNORE means that the column will not be present in the destination table. Enabling the nullable option means that empty values ('') in that column will be converted to NULL; use this for non-string columns with missing data. When done configuring the columns, don't forget to save the settings.
At the top of the page, you can specify the target table name and additional load options. There are two main modes in which the connector can write data to tables: Full Load and Incremental Load.
In Incremental Load mode, the data is bulk inserted into the destination table, and the table structure must match (including the data types); the structure of the target table is never modified. If the target table doesn't exist, it is created. If a primary key is defined on the table, the data is upserted; if no primary key is defined, the data is inserted.
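As a sketch of the upsert behavior (the table, columns, and key here are illustrative, not the connector's actual statements), a write with a primary key defined is equivalent to:

```sql
-- Hypothetical table "account" with primary key "id".
INSERT INTO account (id, name, balance)
VALUES (1, 'Acme', 100.0)
ON CONFLICT (id) DO UPDATE
    SET name    = EXCLUDED.name,
        balance = EXCLUDED.balance;
```

Without a primary key there is no conflict target, so the same rows are inserted again on each run, and repeated loads of identical data produce duplicates.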
In Full Load mode, the table is completely overwritten, including its structure: the table is removed with a `DROP TABLE` command and recreated. The `DROP TABLE` command needs to acquire an exclusive table-level lock. This means that if the database is used by other applications which hold table-level locks, the connector may freeze while waiting for the locks to be released. This is recorded in the connector logs with a message similar to this:

Table "account" is locked by 1 transactions, waiting for them to finish
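Schematically, a Full Load run performs the equivalent of the following (the table and column definitions are illustrative):

```sql
-- The destination table is dropped and recreated on every run,
-- so any structure changes in the source are picked up.
DROP TABLE IF EXISTS account;

CREATE TABLE account (
    id   integer,
    name varchar(255)
);

-- The data is then bulk inserted into the freshly created table.
```

Because `DROP TABLE` requires an exclusive lock, long-running transactions from other applications that touch the table will block this step until they finish.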
Additionally, you can specify a Primary key of the table, a simple column Data filter, and a filter for incremental processing.