Data pipelines do much of the heavy lifting in organisations, integrating, transforming, and preparing data for use in downstream operational systems. Despite being critical to the data value stream, data pipelines have fundamentally not evolved in decades, and as real-time streaming becomes essential, these legacy pipelines hold organisations back from getting value from their data.
This webinar will walk through the story of a bank that uses an Oracle database to store sensitive customer information and RabbitMQ as the message broker for credit card transaction events. Their goal is to analyse credit card transactions in real time, flag fraudulent ones, and push suspicious-activity flags to MongoDB Atlas, the modern cloud-native database that powers their in-app mobile notifications.
To illustrate this use case, you’ll see a live demo of:
• Confluent’s fully managed connectors for Oracle CDC Source and RabbitMQ Source to stream the data in real time
• ksqlDB to merge the two data sources, generating a unified view of customers and their credit card activity and flagging fraudulent transactions (see the ksqlDB sketch after this list)
• The fully managed MongoDB Atlas sink connector to load the aggregated and transformed data into MongoDB Atlas (a connector configuration sketch also follows the list)
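To make the ksqlDB step more concrete, here is a minimal sketch of how the merge and fraud flagging might be expressed. It assumes the source connectors have landed customer change events and transaction events in two Kafka topics; the topic names, column names, and the flagging rule are illustrative assumptions, not details from the webinar demo.

```sql
-- Streams over the topics written by the Oracle CDC and RabbitMQ source
-- connectors. Topic and column names are illustrative placeholders.
CREATE STREAM customers_cdc (
  customer_id   VARCHAR KEY,
  full_name     VARCHAR,
  home_country  VARCHAR
) WITH (
  KAFKA_TOPIC = 'oracle.BANK.CUSTOMERS',
  VALUE_FORMAT = 'JSON'
);

CREATE STREAM card_transactions (
  transaction_id   VARCHAR KEY,
  customer_id      VARCHAR,
  amount           DOUBLE,
  merchant_country VARCHAR
) WITH (
  KAFKA_TOPIC = 'rabbitmq.card_transactions',
  VALUE_FORMAT = 'JSON'
);

-- Materialise the latest customer record per key so each transaction can be
-- enriched with up-to-date customer details.
CREATE TABLE customers AS
  SELECT customer_id,
         LATEST_BY_OFFSET(full_name)    AS full_name,
         LATEST_BY_OFFSET(home_country) AS home_country
  FROM customers_cdc
  GROUP BY customer_id;

-- Unified view: join transactions to customers and apply a simple,
-- purely illustrative fraud rule (large amount or foreign merchant).
CREATE STREAM flagged_transactions AS
  SELECT t.customer_id,
         t.transaction_id,
         c.full_name,
         t.amount,
         t.merchant_country,
         (t.amount > 5000 OR t.merchant_country <> c.home_country) AS suspicious
  FROM card_transactions t
  LEFT JOIN customers c
    ON t.customer_id = c.customer_id;
```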
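The final hop could then be created from ksqlDB as well. The sketch below uses the CREATE SINK CONNECTOR syntax; the connector class and property names follow the fully managed MongoDB Atlas Sink connector's documented naming, but the exact property set (including Kafka API credentials, which are omitted here) and all host, user, database, and collection values are placeholders that should be checked against the connector documentation.

```sql
-- Illustrative sketch: property names and values should be verified against
-- the fully managed MongoDB Atlas Sink connector documentation.
CREATE SINK CONNECTOR mongodb_atlas_sink WITH (
  'connector.class'     = 'MongoDbAtlasSink',
  'input.data.format'   = 'JSON',
  'topics'              = 'FLAGGED_TRANSACTIONS',  -- topic backing the flagged_transactions stream
  'connection.host'     = '<atlas-host>',
  'connection.user'     = '<atlas-user>',
  'connection.password' = '<atlas-password>',
  'database'            = 'bank',
  'collection'          = 'flagged_transactions',
  'tasks.max'           = '1'
);
```

With something like this in place, the suspicious-activity flags land in the Atlas collection that powers the bank's in-app mobile notifications.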
Along with the live demo and customer use case, you’ll learn about the challenges with batch-based data pipelines and the benefits of streaming data pipelines to power modern data flows.