Cirata Expands Data Interoperability Support for Apache Iceberg and Databricks Delta Lake with Data Migrator 3.0


Major new version delivers multicloud data interoperability, automation, scalability and security.
London, December 3, 2024 – Cirata, the company that automates Hadoop data transfer and integration to modern cloud analytics and AI platforms, today announced the launch of Data Migrator 3.0, with production-ready support for Apache Iceberg, expanded capabilities for Databricks Delta Lake, significant enterprise security enhancements and comprehensive extensibility. This major new version of Cirata’s technology empowers organizations to adopt open data architectures while improving scalability, automation, and security for their multicloud data management and analytics needs.
“Organizations understand the need to avoid the lock-in, technology constraints and costs that come with isolated data storage and formats. They need solutions that make it possible to use their largest data assets wherever they are created, in a variety of formats, and with whatever future technologies emerge. They want data interoperability,” said Paul Scott-Murphy, Chief Technology Officer at Cirata. “Our production-ready, direct support for open table formats like Apache Iceberg and Delta Lake eliminates the constraints of closed data architectures, even if you have petabytes of data held in formats or locations that previously required lengthy, complex and risky efforts to modernize. Data Migrator 3.0 is a significant advancement for organizations that want to future-proof their data management, analytics and AI strategies.”
Cirata Data Migrator 3.0 introduces a range of new features that streamline data transfer between on-premises and cloud environments, reduce operational complexity, and increase data interoperability. Optimized for handling even the largest datasets, Data Migrator 3.0 reduces computational overhead, offering improved performance and flexibility for enterprise-grade data transfers at scale. The improvements are particularly significant for enterprises adopting Apache Iceberg and Databricks Delta Lake within their data ecosystems. Key new features include production-ready Apache Iceberg support, expanded Databricks Delta Lake capabilities, enterprise security enhancements and comprehensive extensibility.
Cirata Data Migrator is a fully automated solution for Hadoop data transfer and integration that moves on-premises HDFS data, Hive metadata, local filesystem data, or cloud data sources to any cloud or on-premises environment, even while those datasets are under active change. Cirata Data Migrator requires zero changes to applications or business operations and moves data of any scale without production system downtime or business disruption, and with zero risk of data loss. Supported migration targets include the Hadoop Distributed File System, Alibaba Cloud Object Storage Service, Amazon S3, Azure Data Lake Storage Gen 2, Google Cloud Storage, IBM Cloud Object Storage and Oracle Object Storage.
Cirata Data Migrator 3.0, including its new features, is now generally available. For more information, visit https://cirata.com/data-management/products/data-migrator.
About Cirata
Cirata accelerates data-driven revenue growth by automating Hadoop data transfer and integration to modern cloud analytics and AI platforms without downtime or disruption. With Cirata, data leaders can leverage the power of AI and analytics across their entire enterprise data estate, freely choose analytics technologies, and avoid vendor, platform, or cloud lock-in while making AI and analytics faster, cheaper, and more flexible. Cirata’s portfolio of products and technology solutions makes strategic adoption of modern data analytics efficient and automated. For more information about Cirata, visit www.cirata.com.
Data interoperability refers to the ability of different systems and organizations to exchange and use data seamlessly, ensuring that information can be shared and understood across various platforms.
Apache Iceberg is an open table format for large analytic datasets, designed to improve the performance and manageability of data in cloud environments.
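As an illustrative aside, the following minimal PySpark sketch shows what working with an Iceberg table can look like in practice. It assumes a Spark installation with the Apache Iceberg runtime available; the catalog name, warehouse path and table used here are hypothetical and not part of any Cirata product.

from pyspark.sql import SparkSession

# Register a local Iceberg catalog backed by the Hadoop filesystem
# (catalog name and warehouse path are illustrative).
spark = (
    SparkSession.builder
    .appName("iceberg-example")
    .config("spark.sql.catalog.local", "org.apache.iceberg.spark.SparkCatalog")
    .config("spark.sql.catalog.local.type", "hadoop")
    .config("spark.sql.catalog.local.warehouse", "/tmp/iceberg-warehouse")
    .getOrCreate()
)

# Create an Iceberg table and append rows; the table format tracks schema
# and snapshots in metadata files rather than relying on directory layout.
spark.sql("CREATE TABLE IF NOT EXISTS local.db.events (id BIGINT, name STRING) USING iceberg")
spark.sql("INSERT INTO local.db.events VALUES (1, 'created'), (2, 'migrated')")
spark.sql("SELECT * FROM local.db.events").show()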
Databricks Delta Lake is an open-source storage layer that brings reliability to data lakes, allowing for ACID transactions and scalable metadata handling.
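For illustration only, a minimal PySpark sketch of a Delta Lake write follows. It assumes the delta-spark package is installed; the table path and sample data are hypothetical.

from pyspark.sql import SparkSession
from delta import configure_spark_with_delta_pip

# Configure Spark with the Delta Lake extension and catalog
# (requires the delta-spark package; path below is illustrative).
builder = (
    SparkSession.builder
    .appName("delta-example")
    .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
    .config("spark.sql.catalog.spark_catalog", "org.apache.spark.sql.delta.catalog.DeltaCatalog")
)
spark = configure_spark_with_delta_pip(builder).getOrCreate()

# Each write is an ACID transaction recorded in the Delta transaction log:
# readers see either the previous or the new table version, never a partial write.
df = spark.createDataFrame([(1, "created"), (2, "migrated")], ["id", "name"])
df.write.format("delta").mode("overwrite").save("/tmp/delta/events")

spark.read.format("delta").load("/tmp/delta/events").show()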
Enterprise security encompasses the strategies and technologies used to protect an organization's data, networks, and systems from unauthorized access or attacks.
Data automation involves using technology to perform data-related tasks automatically, reducing the need for manual intervention and increasing efficiency.