Part 2: Steps to Building a Streaming ETL Pipeline with Apache Kafka® and KSQL

Presented by Robin Moffatt, Developer Advocate, Confluent

About this talk

In this talk, we'll build a streaming data pipeline using nothing but our bare hands, the Kafka Connect API, and KSQL. We'll stream data in from MySQL, transform it with KSQL, and stream it out to Elasticsearch. We'll also cover options for integrating databases with Kafka using CDC and Kafka Connect. This is part 2 of 3 in the Streaming ETL - The New Data Integration series.
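
As a rough sketch of what such a pipeline looks like, the KSQL below registers a Kafka topic fed from MySQL (for example, by a Debezium or JDBC source connector) as a stream, transforms it continuously, and writes the result to a new topic that an Elasticsearch sink connector could then index. The topic, stream, and column names are illustrative assumptions, not the exact ones used in the talk.

-- Register the topic populated from MySQL as a KSQL stream.
-- Topic and column names here are assumed for illustration only.
CREATE STREAM ORDERS_RAW (
    ORDER_ID     INT,
    CUSTOMER_ID  INT,
    ORDER_TOTAL  DOUBLE,
    ORDER_TS     VARCHAR
  ) WITH (KAFKA_TOPIC='mysql-orders', VALUE_FORMAT='JSON');

-- Continuously filter and reshape the stream, writing the result to a
-- new topic that an Elasticsearch sink connector can pick up and index.
CREATE STREAM ORDERS_ENRICHED WITH (KAFKA_TOPIC='orders-enriched') AS
  SELECT ORDER_ID,
         CUSTOMER_ID,
         ORDER_TOTAL
  FROM ORDERS_RAW
  WHERE ORDER_TOTAL > 0;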
