After manipulating data in SQL and writing data to Tableau, let’s now write data to GoodData.
You need to have a GoodData account before you start.
Writing data to GoodData is very similar to writing data to Tableau, although there are some changes due to the fundamental differences in both platforms. The GoodData platform uses the concept of Logical Models where multiple tables are loaded into the platform together with their logical connection model (schema). Keboola Connection will assist you in creating the model.
To load individual tables instead of denormalized tables, the transformation created earlier has to be modified. Go to Transformations and create a new transformation bucket and a new transformation. For the sake of practice, let’s create a brand new transformation instead of modifying the existing one.
Apart from creating a new transformation, we also need a new transformation bucket, since the Tableau and GoodData transformations are not really related. If they were more complex, we would factor the shared parts out into a separate transformation. Name the new transformation bucket Opportunity - GoodData and choose the Snowflake backend.
Then add a new transformation and name it.
Now set the transformation input mapping. Include the following tables from the in.c-csv-import storage bucket: opportunity, account, level, and user.
If you loaded data using the Database extractor or Google Drive extractor, feel free to use the tables created by them (e.g., in.c-keboola-ex-db-snowflake-548904898.account or in.c-keboola-ex-google-drive-548902224.level-level). In either case, make sure that the destinations are set to account, opportunity, user and level.
Then create the output mapping for the out_opportunity, out_user, and out_account tables to be stored in the out.c-opportunity-gooddata output bucket.
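For orientation, the mapping configured in the UI boils down to pairs of source tables in Storage and their names inside the transformation (and, for the output, the reverse). The snippet below is only an illustrative sketch of that idea; the exact field names and shape of the transformation configuration may differ:

```json
{
  "input": [
    {"source": "in.c-csv-import.opportunity", "destination": "opportunity"},
    {"source": "in.c-csv-import.account", "destination": "account"},
    {"source": "in.c-csv-import.level", "destination": "level"},
    {"source": "in.c-csv-import.user", "destination": "user"}
  ],
  "output": [
    {"source": "out_opportunity", "destination": "out.c-opportunity-gooddata.out_opportunity"},
    {"source": "out_user", "destination": "out.c-opportunity-gooddata.out_user"},
    {"source": "out_account", "destination": "out.c-opportunity-gooddata.out_account"}
  ]
}
```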
Use the following four SQL queries to create the output tables.
CREATE TABLE "tmp_level" AS
SELECT "Name", CASE "Level"
WHEN 'S' THEN 'Senior'
WHEN 'M' THEN 'Intermediate'
WHEN 'J' THEN 'Junior' END AS "Level"
FROM "level";
CREATE TABLE "out_opportunity" AS
SELECT *, CASE
WHEN "Probability" < 50 THEN 'Poor'
WHEN "Probability" < 70 THEN 'Good'
ELSE 'Excellent' END AS "ProbabilityClass"
FROM "opportunity";
CREATE TABLE "out_user" AS
SELECT "user"."Name" AS "Name", "user"."Sales_Market" AS "UserSalesMarket",
"user"."Global_Market" AS "UserGlobalMarket",
"tmp_level"."Level" AS "UserLevel"
FROM "user" JOIN "tmp_level" ON "user"."Name" = "tmp_level"."Name";
CREATE TABLE "out_account" AS
SELECT * FROM "account";
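If you want to sanity-check the CASE logic in the queries above before running the transformation, the same statements can be replayed locally, for example in SQLite. The rows below are made-up toy data, not the tutorial dataset:

```python
import sqlite3

# Replay the level-code mapping and the probability bucketing from the
# transformation queries on a few invented rows.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE level ("Name" TEXT, "Level" TEXT);
INSERT INTO level VALUES ('Alice', 'S'), ('Bob', 'J');

CREATE TABLE opportunity ("Name" TEXT, "Probability" INTEGER);
INSERT INTO opportunity VALUES ('Deal A', 40), ('Deal B', 65), ('Deal C', 90);

-- Same CASE mapping as the tmp_level query
CREATE TABLE tmp_level AS
SELECT "Name", CASE "Level"
    WHEN 'S' THEN 'Senior'
    WHEN 'M' THEN 'Intermediate'
    WHEN 'J' THEN 'Junior' END AS "Level"
FROM level;

-- Same bucketing as the out_opportunity query
CREATE TABLE out_opportunity AS
SELECT *, CASE
    WHEN "Probability" < 50 THEN 'Poor'
    WHEN "Probability" < 70 THEN 'Good'
    ELSE 'Excellent' END AS "ProbabilityClass"
FROM opportunity;
""")

print(dict(con.execute('SELECT "Name", "Level" FROM tmp_level')))
print(dict(con.execute('SELECT "Name", "ProbabilityClass" FROM out_opportunity')))
```

A probability of exactly 50 falls into Good and exactly 70 into Excellent, since the CASE branches use strict less-than comparisons.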
Run the transformation.
This will create a background job which will create the out.c-opportunity-gooddata output bucket in Storage and fill it with the output tables. To see whether the transformation job has finished, go to Jobs, or click on the little Transformation job has been scheduled window that pops up after a transformation starts running. When finished, or while waiting for the job to end, continue configuring the GoodData writer.
Start by creating a new writer in the Components – Writers section:
The GoodData writer can have multiple configurations (as any other writer or extractor). Each configuration represents a set of data loaded into a single GoodData project. Click New Configuration to continue:
And choose its name:
Click Set Up GoodData Project to continue:
Keboola Connection can create a free Demo GoodData project for you. However, it expires one month after it is created.
Let’s create a Date Dimension.
Name the dimension first_order:
Now let’s configure the tables that are to be loaded to the project.
Add the account table from the out.c-opportunity-gooddata bucket. When adding a table, simplify the table title to just the table name (we have only a few tables).
Configure the type of each column: mark the columns as CONNECTION_POINT, FACT, ATTRIBUTE, and DATE as appropriate. Do not set the Data Type column. Set the previously created first_order date dimension for the FirstOrder column. Save the column settings.
Then go back to the writer configuration, and add the out.c-opportunity-gooddata.user table. Name the table user and set the Name column to CONNECTION_POINT and everything else to ATTRIBUTE.
Save the table configuration, and go back to the writer configuration.
Add four other date dimensions called created_date, close_date, start_date and end_date. In the case of created_date, tick the Include Time checkbox when creating the date dimension. The result should look like this:
Add the third table, called out.c-opportunity-gooddata.out_opportunity. Name it opportunity and set the columns as follows: FACT, REFERENCE (connected to the account and user tables), DATE (connected to the previously created date dimensions), IGNORE (we won’t be needing it any more), and ATTRIBUTE. You should obtain the following result:
Save the table configuration, and go back to configuring the writer.
Now click on Run Component to push the tables to GoodData:
The tables will be written into GoodData by a background job. When a job is running, a small orange circle appears under Last runs, along with RunId and other info on the job. Green is for success, red for failure. Click on the indicator, or the info next to it for more details.
In the meantime, click on GoodData Project to reveal other options, and then click Go To Project.
Then create your report:
First, specify a metric. It can be computed from the columns we marked as FACT when setting up the writer (those are Amount and Probability). Let’s add a metric for the SUM of Amount. Then specify how this metric should be sliced in the How section. Let’s slice it by ProbabilityClass:
Additional slices or filters can be added in the dashboard wizard. To close the wizard, click Done and the result will be shown to you as a table. To change the view, use the appropriate icon in the top right corner.
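For intuition, the report defined in the wizard (SUM of Amount sliced by ProbabilityClass) is equivalent to a plain GROUP BY over the out_opportunity table. Here is a toy SQLite replay with made-up amounts:

```python
import sqlite3

# Sum Amount per ProbabilityClass, mirroring the metric + slice
# defined in the GoodData report wizard.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE out_opportunity ("Amount" REAL, "ProbabilityClass" TEXT);
INSERT INTO out_opportunity VALUES
    (100, 'Poor'), (250, 'Good'), (400, 'Excellent'), (50, 'Good');
""")
for row in con.execute("""
    SELECT "ProbabilityClass", SUM("Amount")
    FROM out_opportunity
    GROUP BY "ProbabilityClass"
    ORDER BY "ProbabilityClass"
"""):
    print(row)
# ('Excellent', 400.0)
# ('Good', 300.0)
# ('Poor', 100.0)
```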
This will give you the same chart we produced in the Tableau Example.
This concludes the tutorial on writing data to the GoodData BI platform using Keboola Connection. Continue to Setting up Automation.