Agencies face a growing challenge in distributing data across a geographically diverse set of locations, both across the US and globally. To ensure mission success, data needs to flow rapidly to all of these locations, yet the latency, bandwidth, and reliability of communication links can make that difficult. A global data fabric is an emerging approach to connecting mission to data across multiple locations while delivering uniformity and consistency at scale.
This webinar will cover:
An overview of Apache Kafka and how an event streaming platform can support your agency's mission
Considerations for handling communication links of varying quality
Synchronous vs. asynchronous data replication (see the sketch after this list)
New multi-region capabilities in Confluent Platform for Global Data Fabric
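To make the synchronous vs. asynchronous trade-off concrete, below is a minimal sketch of a Kafka producer configuration. It assumes a hypothetical broker address (broker-east-1:9092) and topic name (mission-data) purely for illustration; the webinar itself covers Confluent Platform's multi-region capabilities in more depth. With acks=all the producer waits for the full in-sync replica set, favoring durability over latency; acks=1 would acknowledge after the leader alone, which behaves more like asynchronous replication over slow or unreliable links.

```java
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class MissionDataProducer {
    public static void main(String[] args) {
        Properties props = new Properties();
        // Hypothetical bootstrap address; replace with your cluster's brokers.
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "broker-east-1:9092");
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());

        // acks=all waits for the full in-sync replica set: stronger durability,
        // but higher latency over constrained links (synchronous-style replication).
        // acks=1 would acknowledge after the leader alone persists the record,
        // trading durability for lower latency (asynchronous replication to followers).
        props.put(ProducerConfig.ACKS_CONFIG, "all");

        // Generous retries and delivery timeout help ride out brief link outages.
        props.put(ProducerConfig.RETRIES_CONFIG, Integer.MAX_VALUE);
        props.put(ProducerConfig.DELIVERY_TIMEOUT_MS_CONFIG, 120_000);

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            producer.send(new ProducerRecord<>("mission-data", "site-42", "status=nominal"));
        }
    }
}
```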