Five Factors to Consider When Migrating from Hadoop to the Data Lakehouse

Presented by

Donald Farmer, Principal, TreeHive Strategy; Tony Truong, Sr. Product Marketing Manager, Dremio

About this talk

Most users of the Hadoop platform are fed up with its high cost, operational overhead, and poor performance. With innovations built on open source standards such as Apache Iceberg and Apache Arrow, the data lakehouse has emerged as the destination for companies migrating off Hadoop. In this episode of Gnarly Data Waves, you will learn about:

- 5 key factors to consider as you migrate off Hadoop to the data lakehouse
- Why Apache Iceberg replaced the Hive metastore
- Creating a unified access layer on your data lakehouse with Dremio
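
As context for the Iceberg and Hive metastore point above, the sketch below shows what a minimal Apache Iceberg table looks like from PySpark: the table carries its own metadata (schema, partitioning, snapshot history) rather than relying on a central Hive metastore. This is not code from the talk; the catalog name "lakehouse", the local warehouse path, and the table name are illustrative assumptions, and package versions vary by environment.

```python
# A minimal sketch (not from the talk) of creating and querying an Apache Iceberg
# table from PySpark. The catalog name "lakehouse", the local warehouse path, and
# the table name are illustrative assumptions; package versions vary by environment.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("iceberg-migration-sketch")
    # Pull in the Iceberg runtime and register an Iceberg catalog instead of a
    # central Hive metastore-backed catalog.
    .config("spark.jars.packages",
            "org.apache.iceberg:iceberg-spark-runtime-3.5_2.12:1.5.0")
    .config("spark.sql.extensions",
            "org.apache.iceberg.spark.extensions.IcebergSparkSessionExtensions")
    .config("spark.sql.catalog.lakehouse", "org.apache.iceberg.spark.SparkCatalog")
    .config("spark.sql.catalog.lakehouse.type", "hadoop")
    .config("spark.sql.catalog.lakehouse.warehouse", "/tmp/lakehouse-warehouse")
    .getOrCreate()
)

spark.sql("CREATE NAMESPACE IF NOT EXISTS lakehouse.db")

# Schema, partitioning, and snapshot history live in the table's own metadata
# files, not in an external metastore service.
spark.sql("""
    CREATE TABLE IF NOT EXISTS lakehouse.db.events (
        event_id BIGINT,
        event_ts TIMESTAMP,
        payload  STRING
    ) USING iceberg
    PARTITIONED BY (days(event_ts))
""")

spark.sql("INSERT INTO lakehouse.db.events VALUES (1, current_timestamp(), 'hello')")
spark.sql("SELECT * FROM lakehouse.db.events").show()
```

Because this uses Iceberg's simple filesystem ("hadoop") catalog, the warehouse path could just as easily point at HDFS or object storage, which is part of what makes the table format a practical bridge between a Hadoop cluster and a lakehouse.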

About Dremio

Dremio is the easy and open data lakehouse, providing self-service analytics with data warehouse functionality and data lake flexibility across all of your data. Dremio increases agility with a revolutionary data-as-code approach that enables Git-like data experimentation, version control, and governance.
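
The "Git-like" data-as-code approach mentioned above is easiest to picture as branch-and-merge semantics on the data catalog. The sketch below is a hypothetical illustration, assuming an Iceberg catalog backed by a Nessie server (a common open source way to get data branching); the catalog name, server URI, table names, and package versions are assumptions rather than details from the talk, and the exact SQL syntax differs across Nessie versions.

```python
# Hypothetical sketch of Git-like "data as code": branch, change, and merge data
# with a Nessie-backed Iceberg catalog. Names, URIs, and versions are assumptions.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("data-as-code-sketch")
    .config("spark.jars.packages",
            "org.apache.iceberg:iceberg-spark-runtime-3.5_2.12:1.5.0,"
            "org.projectnessie.nessie-integrations:nessie-spark-extensions-3.5_2.12:0.77.1")
    .config("spark.sql.extensions",
            "org.apache.iceberg.spark.extensions.IcebergSparkSessionExtensions,"
            "org.projectnessie.spark.extensions.NessieSparkSessionExtensions")
    .config("spark.sql.catalog.nessie", "org.apache.iceberg.spark.SparkCatalog")
    .config("spark.sql.catalog.nessie.catalog-impl", "org.apache.iceberg.nessie.NessieCatalog")
    .config("spark.sql.catalog.nessie.uri", "http://localhost:19120/api/v1")
    .config("spark.sql.catalog.nessie.ref", "main")
    .config("spark.sql.catalog.nessie.warehouse", "/tmp/nessie-warehouse")
    .getOrCreate()
)

# Work on an isolated branch, like a Git feature branch.
spark.sql("CREATE BRANCH IF NOT EXISTS etl_experiment IN nessie FROM main")
spark.sql("USE REFERENCE etl_experiment IN nessie")

spark.sql("CREATE NAMESPACE IF NOT EXISTS nessie.db")
spark.sql("""
    CREATE TABLE IF NOT EXISTS nessie.db.events (
        event_id BIGINT, event_ts TIMESTAMP, payload STRING
    ) USING iceberg
""")

# Changes are visible only on the branch until they are merged.
spark.sql("INSERT INTO nessie.db.events VALUES (1, current_timestamp(), 'experiment')")

# Publish the validated change set back to main as one atomic commit.
spark.sql("MERGE BRANCH etl_experiment INTO main IN nessie")
```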