šŸ¤–Building a Telegram bot with Apache Kafka and ksqlDB @rmoff #NDCSydney Robin Moffatt

Where’s my nearest carpark with available spaces?

How many spaces are available in this car park?

šŸ’”Tell me when a car park with spaces is available

šŸ“ˆHow does occupancy vary over time?

$ whoami > Robin Moffatt (@rmoff) > Senior Developer Advocate at Confluent (Apache Kafka, not Wikis šŸ˜‰) > Working in data & analytics since 2001 > Oracle ACE Director (Alumnus) http://rmoff.dev/talks Ā· http://rmoff.dev/blog Ā· http://rmoff.dev/youtube @rmoff | #NDCSydney

Telegram @rmoff | #NDCSydney

Don’t just tell me… show me! Demo code: https://rmoff.dev/carparks

carparks HTTP Kafka @rmoff | #NDCSydney

What are the key pieces of the design? @rmoff | #NDCSydney

Event Driven Alerts carparks HTTP Kafka @rmoff | #NDCSydney
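A sketch of the event-driven alerts piece (the stream and column names CARPARK_EVENTS and EMPTY_PLACES are assumptions for illustration, not taken verbatim from the demo): a ksqlDB stream filters the car park events down to just those with free spaces, writing them to a topic the Telegram bot can consume.

-- Assumed source stream of car park updates; names are illustrative
CREATE STREAM CARPARK_ALERTS WITH (KAFKA_TOPIC='carpark_alerts') AS
  SELECT NAME,
         EMPTY_PLACES,
         CAPACITY
    FROM CARPARK_EVENTS
   WHERE EMPTY_PLACES > 0
  EMIT CHANGES;

The bot then subscribes to the carpark_alerts topic and reacts to each matching event as it arrives.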

K/V Lookups (materialised views): ā€œHow many spaces are free at Westgate carpark right now?ā€ → SELECT SPACES_AVAILABLE FROM CARPARK WHERE NAME='WESTGATE'; → ā€œThere are 42 spaces freeā€. ksqlDB builds the CARPARK table from the Kafka topic CARPARK_EVENTS with CREATE TABLE CARPARK AS SELECT LATEST(… GROUP BY NAME. @rmoff | #NDCSydney
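Filling in the elided parts of that slide as a sketch (the column names are assumptions), the materialised view is a ksqlDB table holding the latest reading per car park, which applications can hit directly with a pull query:

-- Build state from the event stream: latest value per car park (illustrative columns)
CREATE TABLE CARPARK AS
  SELECT NAME,
         LATEST_BY_OFFSET(EMPTY_PLACES) AS SPACES_AVAILABLE
    FROM CARPARK_EVENTS
   GROUP BY NAME;

-- Key/value lookup against the materialised view (a pull query)
SELECT SPACES_AVAILABLE FROM CARPARK WHERE NAME='WESTGATE';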

A schema… carparks HTTP @rmoff | #NDCSydney

A schema… 2020-10-14,12:28,Broadway,1132,921 2020-10-14,12:28,Kirkgate Centre,611,474 2020-10-14,12:28,Sharpe Street,98,63 ?! @rmoff | #NDCSydney

My kingdom for a schema! 2020-10-14,12:28,Broadway,1132,921 2020-10-14,12:28,Kirkgate Centre,611,474 2020-10-14,12:28,Sharpe Street,98,63 šŸ˜ { "ts": "2020-10-14T12:28 UTC+1", "name": "Broadway", "capacity": 1132, "empty": 921 } … @rmoff | #NDCSydney

Applying a schema to streams of data source_topic ksqlDB Kafka CREATE STREAM mySource (date VARCHAR, time VARCHAR, name VARCHAR, capacity INT) WITH (KAFKA_TOPIC='source_topic', VALUE_FORMAT='DELIMITED'); @rmoff | #NDCSydney

Applying a schema to streams of data source_topic derived_topic ksqlDB Kafka CREATE STREAM mySource (date VARCHAR, time VARCHAR, name VARCHAR, capacity INT) WITH (KAFKA_TOPIC='source_topic', VALUE_FORMAT='DELIMITED'); CREATE STREAM myTargetStream WITH (VALUE_FORMAT='PROTOBUF', KAFKA_TOPIC='derived_topic') AS SELECT * FROM mySource; @rmoff | #NDCSydney
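Once the derived stream exists, a quick way to sanity-check the reserialised data is a push query (a sketch; LIMIT simply stops the query after a few rows):

-- Stream a few rows of the Protobuf-serialised stream as they arrive
SELECT name, capacity FROM myTargetStream EMIT CHANGES LIMIT 5;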

Integration carparks HTTP Kafka @rmoff | #NDCSydney

Streaming Integration with Kafka Connect syslog Sources Tasks Workers Kafka Connect Kafka Brokers @rmoff | #NDCSydney

Streaming Integration with Kafka Connect Amazon S3 Google BigQuery Sinks Tasks Workers Kafka Connect Kafka Brokers @rmoff | #NDCSydney

Streaming Integration with Kafka Connect Amazon S3 syslog Google BigQuery Tasks Workers Kafka Connect Kafka Brokers @rmoff | #NDCSydney
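Connectors can also be managed from ksqlDB itself. As a sketch only (the syslog connector class and its properties are used purely as an illustration and should be checked against the connector's documentation):

-- Sketch: define a Kafka Connect source connector from ksqlDB
CREATE SOURCE CONNECTOR SYSLOG_SOURCE WITH (
  'connector.class' = 'io.confluent.connect.syslog.SyslogSourceConnector',
  'syslog.port'     = '42514',   -- illustrative property values
  'syslog.listener' = 'UDP'
);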

Streaming Analytics @rmoff | #NDCSydney

Why build it this way? @rmoff | #NDCSydney

Events @rmoff | #NDCSydney

Streams of Events @rmoff | #NDCSydney

We want to react to them as they happen @rmoff | #NDCSydney

We want to build state from a stream of events @rmoff | #NDCSydney

We want to provide the latest data in our analytics @rmoff | #NDCSydney

Apache Kafka - an Event Streaming Platform Producer Connectors Consumer The Log Connectors Streaming Engine @rmoff | #NDCSydney

Why Kafka? @rmoff | #NDCSydney

Distributed, Immutable, Event Log New Old Events are added at the end of the log @rmoff | #NDCSydney

Consumers can seek to any point Read to offset & scan New Old @rmoff | #NDCSydney

Data is not deleted once read New Old Sally is here Scan @rmoff | #NDCSydney

Consumers are independent of each other New Old Fred is here Scan Sally is here Scan @rmoff | #NDCSydney

Consumers can be added later Rick is here Scan New Old Fred is here Scan Sally is here Scan @rmoff | #NDCSydney
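Because the log retains the data, a consumer added later can still read everything from the start. In ksqlDB that is just a consumer property (a sketch, assuming a CARPARK_EVENTS stream has been declared over the topic):

-- Start from the oldest retained event rather than only new ones
SET 'auto.offset.reset' = 'earliest';
SELECT * FROM CARPARK_EVENTS EMIT CHANGES;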

Stream Processing with ksqlDB Source stream @rmoff | #NDCSydney

Stream Processing with ksqlDB Source stream Analytics @rmoff | #NDCSydney

Stream Processing with ksqlDB Source stream Applications / Microservices @rmoff | #NDCSydney

Stream Processing with ksqlDB …SUM(TXN_AMT) GROUP BY AC_ID → AC_ID=42 BALANCE=94.00 Source stream Applications / Microservices @rmoff | #NDCSydney
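As a sketch of the aggregation shown above (the stream and column names are assumptions), the running balance per account is a table that ksqlDB keeps up to date as new transactions arrive, and that applications and microservices can query directly:

-- Continuously maintained aggregate: one row per account, updated as events arrive
CREATE TABLE ACCOUNT_BALANCE AS
  SELECT AC_ID,
         SUM(TXN_AMT) AS BALANCE
    FROM TXNS
   GROUP BY AC_ID;

-- Pull query from an application / microservice
SELECT BALANCE FROM ACCOUNT_BALANCE WHERE AC_ID = 42;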

Under the covers of ksqlDB @rmoff | #NDCSydney

Kafka cluster consume produce ksqlDB @rmoff | #NDCSydney

JVM Kafka cluster consume produce ksqlDB Kafka Streams RocksDB @rmoff | #NDCSydney

ksqlDB & Kafka Fully Managed as a Service

Running ksqlDB - self-managed DEB, RPM, ZIP, TAR downloads http://confluent.io/download Docker images ksqlDB Server confluentinc/ksqldb-server (JVM process) …and many more… @rmoff | #NDCSydney

Why Kafka? @rmoff | #NDCSydney

Stream Store Process Integrate @rmoff | #NDCSydney

Flexible, event-driven applications Event-driven alerts Key/Value lookups Streaming ETL @rmoff | #NDCSydney

Want to learn more? CTAs, not CATs (sorry, not sorry) @rmoff | #NDCSydney

Try it out for yourself https://rmoff.dev/carparks

Promo code 60DEVADV: $200 USD off your bill each calendar month for the first three months when you sign up https://rmoff.dev/ccloud Free money! (additional $60 towards your bill šŸ˜„) Fully Managed Kafka as a Service * T&C: https://www.confluent.io/confluent-cloud-promo-disclaimer

Learn Kafka. Start building with Apache Kafka at Confluent Developer. developer.confluent.io

Confluent Community Slack group cnfl.io/slack @rmoff | #NDCSydney

Further reading / watching https://rmoff.dev/kafka-talks @rmoff | #NDCSydney

#EOF https://talks.rmoff.net @rmoff