With this end-to-end flow you can extract your updated data from an eCommerce platform and bring it into Keboola. After all the necessary tasks are performed on the data, you can transform the results into visualizations in any BI tool of your choice.
The flow, in a nutshell:
First, the eCommerce source component (extractor) will collect data from your account (data about orders, products, inventory, and customers). You can also bring in marketing data from your marketing channels like Facebook Ads, Google Ads, and/or Sklik accounts.
We then create the output tables, filling any missing columns with NULL values, check the data, and perform an RFM analysis.
The data is then written into your selected destination, for example, to a Snowflake database via the Snowflake destination component (writer).
Finally, you will run the entire flow (i.e., the sequence of all the prepared steps described above, in the correct order). The eCommerce source component, all data manipulations and analyses, and the destination component of your choice will be processed.
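The RFM step groups customers by Recency, Frequency, and Monetary value. The sketch below illustrates the idea only; the order fields and the 1–3 scoring thresholds are assumptions for illustration, not the template's actual transformation logic.

```python
from datetime import date

# Minimal RFM sketch: score each customer by Recency, Frequency, and
# Monetary value. Field names and thresholds are illustrative assumptions.
orders = [
    {"customer": "alice", "date": date(2024, 5, 1), "total": 120.0},
    {"customer": "alice", "date": date(2024, 6, 20), "total": 80.0},
    {"customer": "bob",   "date": date(2023, 11, 3), "total": 40.0},
]
today = date(2024, 7, 1)

def rfm(orders, today):
    per_customer = {}
    for o in orders:
        c = per_customer.setdefault(
            o["customer"], {"last": o["date"], "count": 0, "spend": 0.0}
        )
        c["last"] = max(c["last"], o["date"])
        c["count"] += 1
        c["spend"] += o["total"]
    scores = {}
    for name, c in per_customer.items():
        recency_days = (today - c["last"]).days
        r = 3 if recency_days <= 30 else 2 if recency_days <= 90 else 1
        f = 3 if c["count"] >= 5 else 2 if c["count"] >= 2 else 1
        m = 3 if c["spend"] >= 500 else 2 if c["spend"] >= 100 else 1
        scores[name] = (r, f, m)
    return scores

print(rfm(orders, today))  # alice scores high on recency; bob is "frozen"
```

A high-recency, high-frequency customer is "loyal"; one whose scores decay over time drifts toward "frozen", which is exactly the movement the analysis tracks.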
An entity-relationship diagram is a specialized graphic that illustrates the relationships between entities in a data destination.
| Table | Description |
|-------|-------------|
| | contains a list of products, including product type, manufacturer, and price |
| | contains a list of customer orders, including order date, purchase, price, and taxes |
| | contains a list of customers, including email, billing and shipping information, and order count |
| | contains information about your shop, such as URL and name |
| | enumeration of billing types |
| | enumeration of shipping types |
| | contains the individual items of each order, including order date, number of items bought, and item prices |
| PRODUCT MARGIN OVER TIME | contains product margin over time, used to check how the margin on each product changes over time |
| | contains daily marketing campaign costs and clicks |
| CAMPAIGN COSTS MONTHLY | contains monthly marketing campaign costs and clicks |
| | describes the RFM analysis of customer value, showing the current customer groups as well as predictions for moving from loyal to frozen customers |
| ANALYZE CLV BY TIME FROM PREVIOUS ORDER | contains an analysis of customer lifetime value by time since the previous order |
| ANALYZE CLV BY ORDER COUNT | contains an analysis of customer lifetime value by number of orders |
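The product-margin table above can be pictured with a small sketch. The column names (price, cost, month) are assumptions for illustration; the template's real output may use different fields.

```python
# Sketch of a "product margin over time" table: one margin value per
# product and month. Field names are illustrative assumptions.
sales = [
    {"product": "mug", "month": "2024-05", "price": 10.0, "cost": 4.0},
    {"product": "mug", "month": "2024-06", "price": 10.0, "cost": 6.0},
]

def margin_over_time(sales):
    out = {}
    for s in sales:
        # Margin as a fraction of the selling price.
        out[(s["product"], s["month"])] = round(
            (s["price"] - s["cost"]) / s["price"], 2
        )
    return out

print(margin_over_time(sales))
# {('mug', '2024-05'): 0.6, ('mug', '2024-06'): 0.4}
```

Plotting these per-month values in a BI tool shows how the margin on each product changes over time.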
These are the data sources that are available in Public Beta:
These data destinations are available in Public Beta:
The process is simple. We will guide you through it, and, when needed, ask you to provide your credentials and authorize the destination component.
First, decide which data source and which data destination you want to use. Then select the corresponding template from the Templates tab in your Keboola project. When you are done, click + Use Template.
This page contains information about the template. Click + Use Template again.
Now enter a name for the template instance you are about to create. You can use the same template as many times as you want, so a descriptive name helps keep things organized.
After clicking Next Step, you will see the template builder. Fill in all needed credentials and perform the required OAuth authorizations.
Important: Make sure to follow all the steps very carefully to prevent the newly created flow from failing because of any user authorization problems. If you are struggling with this part, go to the section Authorizing Data Destinations below.
Follow the steps one by one and authorize your data sources. An eCommerce data source is required; in this case, it is Shopify.
Note: Using additional marketing sources is optional. You can (but you do not have to) enrich your eCommerce data with marketing statistics like costs, impressions, and clicks.
Finally, the destination must be authorized as well.
When you are finished, click Save in the top right corner. The template builder will create your new configuration, and when it is done, you will be redirected to the Template Catalogue, where you can see the newly created flow.
Click Run Template, and you can start building your visualizations a few minutes later.
To use a selected data source component, you must first authorize the data source.
At least one data source must be used in order to create a working flow.
To enable this application, you must grant Read access under ADMIN API PERMISSIONS for the following objects:
Additional documentation is available here.
This extractor allows you to download data from Shoptet permalinks.
Find all links in your Shoptet account.
Example: Go to Customers and click Export. At the bottom you will see a link that looks like this: https://www.yourshopaddress.domain/export/customers.xml?ip=220.127.116.111&hash=somehash
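Before pasting a permalink into the configuration, it can help to confirm it has the expected shape. This is a minimal sketch; the `check_permalink` helper is hypothetical, and the URL is the placeholder from the example above, not a real shop address.

```python
from urllib.parse import urlsplit, parse_qs

# Check that a Shoptet export permalink points at an .xml export and
# carries the ip and hash query parameters shown in the example.
# The URL is a placeholder, not a real export link.
permalink = (
    "https://www.yourshopaddress.domain/export/customers.xml"
    "?ip=220.127.116.111&hash=somehash"
)

def check_permalink(url):
    parts = urlsplit(url)
    params = parse_qs(parts.query)
    return parts.path.endswith(".xml") and "ip" in params and "hash" in params

print(check_permalink(permalink))  # True
```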
Using the following marketing data sources is optional. Select the ones you wish to use.
This step redirects you to Facebook. Log in and allow Keboola to access the data.
From the list of accounts, select the ones from which you want to download data.
Authorize your Google Account and then select the account from which you want to obtain data. The list will automatically appear after authorization.
To configure this source component, you need a working Sklik account.
The current listing limit supported by the Sklik API is 100.
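Because listings are capped at 100 items per call, larger result sets have to be fetched page by page. The sketch below shows the paging pattern only; `fetch_page` is a stand-in for a real Sklik API call, not part of its actual interface.

```python
# Paging pattern for an API whose listing limit is 100 items per call.
# fetch_page is a fake stand-in for the real API request.
LIMIT = 100

def fetch_page(offset, limit, total=250):
    # Fake API: returns item ids [offset, offset + limit), up to `total`.
    return list(range(offset, min(offset + limit, total)))

def fetch_all():
    items, offset = [], 0
    while True:
        page = fetch_page(offset, LIMIT)
        items.extend(page)
        if len(page) < LIMIT:  # a short page means we reached the end
            return items
        offset += LIMIT

print(len(fetch_all()))  # 250
```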
To create a working flow, you must select at least one data destination.
To configure the destination component, you need to set up a Google Service Account and create a new JSON key.
A detailed guide is available here.
Authorize your Google account.
Duplicate the sheet into your Google Drive and paste the file ID back into Keboola. This is needed to achieve correct mapping in your duplicated Google sheet.
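The file ID sits in the sheet's URL, between `/spreadsheets/d/` and the next slash. A small sketch of pulling it out (the URL below is a placeholder, not a real document):

```python
import re

# Extract the file ID from a Google Sheets URL so it can be pasted
# back into Keboola. The URL is a placeholder.
url = "https://docs.google.com/spreadsheets/d/FILE_ID_PLACEHOLDER/edit#gid=0"

def sheet_file_id(url):
    m = re.search(r"/spreadsheets/d/([^/]+)", url)
    return m.group(1) if m else None

print(sheet_file_id(url))  # FILE_ID_PLACEHOLDER
```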
If you do not have your own data warehouse, follow the instructions and we will create a database for you:
Everything is set up.
If you want to use your own Snowflake database, you must provide the host name (account name), user name, password, database name, schema, and a warehouse.
We highly recommend creating a dedicated user for the destination component in your Snowflake database and granting that user access to the Snowflake warehouse.
Warning: Keep in mind that Snowflake is case-sensitive, and unquoted identifiers are converted to upper case. So if you run, for example, the query CREATE SCHEMA john.doe;, you must enter the schema name as DOE in the destination component configuration.
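The folding rule can be mimicked in a few lines, which makes it easy to see why `doe` becomes `DOE` while a quoted `"doe"` keeps its case:

```python
# Mimic Snowflake's identifier resolution: unquoted identifiers fold to
# upper case, quoted identifiers keep their exact case. This is why the
# schema from CREATE SCHEMA john.doe; must be entered as DOE.
def resolve_identifier(ident):
    if ident.startswith('"') and ident.endswith('"') and len(ident) >= 2:
        return ident[1:-1]   # quoted: case preserved
    return ident.upper()     # unquoted: folded to upper case

print(resolve_identifier("doe"))    # DOE
print(resolve_identifier('"doe"'))  # doe
```

If you want lower-case names in the configuration, create the schema with quoted identifiers instead.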
More info here.
Before turning to the Keboola support team for help, make sure your error is not a common problem that can be solved without our help.
If you see the error pictured below, you have probably forgotten to set up the Snowflake database.
Click on the highlighted text under Configuration in the top left corner. This will redirect you to the Snowflake Database component. Now follow the instructions for the Snowflake database provided by Keboola in the Authorizing Data Destinations section.
Then go to the Jobs tab and run the flow again.