Keboola’s Data Streams feature allows users to receive streaming event data directly into Keboola Storage without needing additional steps, such as setting up a data source, using middleware, or developing a custom component. It simplifies the process and enables ad hoc messaging from your system to Keboola Storage.
The most important benefits of the Data Streams feature include:
The Data Streams service receives messages over HTTP and writes them to Storage once one of the predefined import conditions (record count, total size, or elapsed time) is met. Under the hood, the service uses Keboola’s Buffer API for smooth data management.
Follow these steps to create a data stream:
For every data stream, a unique “Data Stream URL” is generated. You can use it in your application to send events. This URL cannot be changed.
The dashboard shows how much data is waiting for import versus how much has already been imported.
In your table settings, you can:
For easier use, we’ve prepared a few examples of how to send data to a stream using Python, JavaScript, and Bash.
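In the same spirit, here is a minimal Python sketch of sending one JSON event to a stream using only the standard library. The URL and payload fields below are placeholders, not real values; substitute the “Data Stream URL” generated for your stream.

```python
import json
import urllib.request

# Placeholder: replace with the Data Stream URL generated in the Keboola UI.
STREAM_URL = "https://example.com/your-data-stream-url"

def build_event_request(url: str, payload: dict) -> urllib.request.Request:
    """Build an HTTP POST request carrying one JSON event."""
    body = json.dumps(payload).encode("utf-8")
    return urllib.request.Request(
        url,
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_event_request(STREAM_URL, {"event": "signup", "user_id": 42})
# Uncomment to actually send the event:
# with urllib.request.urlopen(req) as resp:
#     print(resp.status)
```

Sending each event is then a single `urlopen` call; the events are buffered by the service and imported into the destination table once an import condition is met.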
In this section, you can set three conditions for importing data: the import time frequency, the size of the imported data, and the number of imported records. As soon as any one of these conditions is met, buffered events are imported into the destination table.
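The trigger logic can be sketched as follows. This is an illustrative model of the three conditions, not Keboola’s implementation, and the threshold values are arbitrary examples rather than product defaults.

```python
import time

class FlushConditions:
    """Sketch of the three import triggers: record count, total size, elapsed time."""

    def __init__(self, max_records=100, max_bytes=1_000_000, max_seconds=60):
        self.max_records = max_records   # number of imported records
        self.max_bytes = max_bytes       # size of the imported data
        self.max_seconds = max_seconds   # import time frequency
        self.records = 0
        self.bytes = 0
        self.started = time.monotonic()

    def add(self, record_size: int) -> bool:
        """Register one buffered record; return True when any condition triggers an import."""
        self.records += 1
        self.bytes += record_size
        return (
            self.records >= self.max_records
            or self.bytes >= self.max_bytes
            or time.monotonic() - self.started >= self.max_seconds
        )
```

Because the conditions are OR-ed together, a burst of traffic flushes early on count or size, while a quiet stream still imports on the time condition.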
Here, you can simulate your payload and test it instantly with a table preview, so you can see how the data will be imported before going to production.
Data Streams pricing details vary based on the number of streams and the volume of data ingested.
For further details and API integration steps, refer to our comprehensive documentation.