Data lakes and data warehouses have been popular storage choices for years, often working in tandem to better serve enterprises. While data lakes store raw data in its original form, data warehouses hold structured, organized data sets. Yet the constant exchange of data between the two silos has added complexity and increased the risk of duplication and data inconsistency. As workloads become increasingly data-heavy, organizations need a more comprehensive and cost-effective solution.
Is that solution the data lakehouse? According to a recent Dremio survey of enterprise IT professionals, data lakehouses are now the primary architecture for delivering analytics, with 65% of respondents running a majority of their analytics on lakehouses. Additionally, 81% are using a data lakehouse to support work on AI models and applications, a growing area of investment for many organizations.
Tune into this episode of Cloud Cover to hear host Jo Peterson and her expert guests dig into the real value of the data lakehouse. As we move into the era of AI, we’ll explore whether data lakehouses are critical to:
— Improving AI-driven data management, governance and compliance.
— Expanding data self-service capabilities.
— Building and improving AI models and applications.
— Providing more automation and AI-assisted data management.