A presentation at Berlin Buzzwords by Robin Moffatt
Companies new and old are recognising the importance of a low-latency, scalable, fault-tolerant data backbone in the form of the Apache Kafka streaming platform. With Kafka, developers can integrate multiple sources and systems, enabling low-latency analytics, event-driven architectures, and the population of multiple downstream systems.
In this talk, we’ll look at one of the most common integration requirements - connecting databases to Kafka. We’ll consider the concept that all data is a stream of events, including data residing within a database. We’ll look at why we’d want to stream data from a database, including driving applications in Kafka from events upstream. We’ll discuss the different methods for connecting databases to Kafka and the pros and cons of each. Techniques including change data capture (CDC) and Kafka Connect will be covered, and we’ll explore the power of ksqlDB for performing transformations such as joins on the inbound data.
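To make that concrete, here is a minimal sketch of the two building blocks the talk touches on: registering a CDC source connector through the Kafka Connect REST API, and then joining the resulting data in ksqlDB. It is written in Python against the standard Connect and ksqlDB REST endpoints; the hostnames, connector properties, topic names, and the `orders`/`customers` stream and table are all illustrative assumptions (exact Debezium property names vary by version, and the ksqlDB objects are assumed to have already been declared over the inbound topics).

```python
import requests

# Assumed endpoints for a local development stack.
CONNECT_URL = "http://localhost:8083"  # Kafka Connect REST API
KSQLDB_URL = "http://localhost:8088"   # ksqlDB server REST API

# 1. Register a Debezium MySQL source connector to stream changes (CDC)
#    from the database into Kafka. Property names follow recent Debezium
#    releases; older versions differ (e.g. `database.server.name`
#    instead of `topic.prefix`). All values below are illustrative.
connector = {
    "name": "inventory-cdc",
    "config": {
        "connector.class": "io.debezium.connector.mysql.MySqlConnector",
        "database.hostname": "mysql",
        "database.port": "3306",
        "database.user": "debezium",
        "database.password": "dbz",
        "database.server.id": "184054",
        "topic.prefix": "dbserver1",
        "table.include.list": "inventory.customers,inventory.orders",
        "schema.history.internal.kafka.bootstrap.servers": "kafka:9092",
        "schema.history.internal.kafka.topic": "schema-changes.inventory",
    },
}
resp = requests.post(f"{CONNECT_URL}/connectors", json=connector)
resp.raise_for_status()

# 2. Enrich each order event with customer details in ksqlDB, joining the
#    orders stream against a table built from the CDC-sourced customers
#    topic. Assumes `orders` and `customers` already exist in ksqlDB.
ksql = """
  CREATE STREAM orders_enriched AS
    SELECT o.order_id, o.total, c.name, c.email
    FROM orders o
    JOIN customers c ON o.customer_id = c.customer_id
    EMIT CHANGES;
"""
resp = requests.post(f"{KSQLDB_URL}/ksql",
                     json={"ksql": ksql, "streamsProperties": {}})
resp.raise_for_status()
print(resp.json())
```

The sketch uses log-based CDC; a query-based approach (e.g. polling with the JDBC source connector) is the main alternative, and the trade-offs between the two are part of what the talk compares.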
Attendees of this talk will learn:
The following resources were mentioned during the presentation or provide useful additional information.
Confluent Cloud: fully managed Apache Kafka, Schema Registry, ksqlDB, and connectors. Use the promo code RMOFF200 to get an additional $200 of free Confluent Cloud usage (T&C).

Here’s what was said about this presentation on social media.