Building data pipelines with PostgreSQL and Kafka
I'm the CEO and one of the founders of Aiven, a next-generation managed cloud services company offering the best Open Source database and messaging services to businesses around the world.
I've been using PostgreSQL professionally since version 6.4, back in 1999.
Apache Kafka is a high-performance open-source stream processing platform for collecting and processing large volumes of messages in real time. It's used in an increasingly large number of data pipelines to handle events such as website click streams, transactions and other telemetry in real time and at scale.
This session focuses on connecting Kafka and PostgreSQL to automatically update a relational database with incoming Kafka events, allowing you to use PostgreSQL's powerful data aggregation and reporting features on the live data stream.
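One common way to wire Kafka events into PostgreSQL (not necessarily the exact approach shown in the session) is Kafka Connect's JDBC sink connector. Below is a minimal configuration sketch; the topic name `device_events`, database `iot`, and key field `device_id` are illustrative assumptions:

```
# Hypothetical Kafka Connect JDBC sink config; names are illustrative.
name=pg-device-sink
connector.class=io.confluent.connect.jdbc.JdbcSinkConnector
topics=device_events
connection.url=jdbc:postgresql://localhost:5432/iot
connection.user=kafka_sink
# Create the target table from the record schema if it doesn't exist.
auto.create=true
# Upsert on the record key so each device row reflects its latest state.
insert.mode=upsert
pk.mode=record_key
pk.fields=device_id
```

With upsert mode, each incoming event either inserts a new row or updates the existing row for that device, keeping the table in sync with the stream.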
We'll demonstrate a production IoT setup using Kafka and PostgreSQL to monitor device states.
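Once the events land in a relational table, standard PostgreSQL aggregation applies directly to the live stream. A sketch of the kind of query this enables, assuming a hypothetical `device_events` table with `device_id`, `state`, and `event_time` columns:

```sql
-- Per-device health over the last hour; table and column names are illustrative.
SELECT device_id,
       max(event_time) AS last_seen,
       count(*) FILTER (WHERE state = 'error') AS error_events
FROM device_events
WHERE event_time > now() - interval '1 hour'
GROUP BY device_id
ORDER BY error_events DESC;
```

The `FILTER` clause (PostgreSQL 9.4+) makes it easy to count only the error events within the same aggregation pass.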
- April 18, 2018, 11:30
- 50 min
- Liberty I
- PostgresConf US 2018