Accelerating Enterprise Data Migration with AI

Nov. 25, 2025 /Mpelembe Media/ — The document, titled “Accelerating enterprise data migration,” is a Google Cloud resource promoting the migration of corporate data platforms to its cloud environment, primarily using BigQuery and Vertex AI. The central argument is that legacy systems are inadequate for modern AI-driven success, while cloud migration provides substantial benefits such as faster insights, cost efficiency, and AI readiness. Multiple case studies from diverse industries, including financial services (PayPal, DBS Bank, Intesa Sanpaolo), logistics (J.B. Hunt), and healthcare (Quest Diagnostics), illustrate how major companies modernized their infrastructure from systems like Teradata, Oracle Exadata, Hadoop, and Snowflake. These examples highlight key migration lessons, such as the importance of FinOps for cost control, strategic data cleanup, and full organizational alignment, demonstrating that Google Cloud offers an AI-enabled migration solution that simplifies and accelerates the transformation process.

Cloud migration directly addresses the limitations of legacy infrastructure for AI initiatives by providing the necessary scale, unification, performance, and integrated tools that traditional systems lack.

Here is a breakdown of how cloud migration solves specific limitations that hinder AI success, drawing from the sources:

Overcoming Limitations of Scale, Speed, and Performance

Legacy infrastructure is often criticized for failing to keep pace with the scale, speed, or flexibility needed for AI initiatives. Cloud migration resolves this through:

Scalability and Performance: Cloud platforms enable significant performance improvements with demonstrated linear scalability, eliminating the performance bottlenecks and scalability limitations of aging on-premises systems built on technologies like Teradata and Hadoop. One company reported a 30% performance improvement on peak workloads along with guaranteed linear scalability.

Real-time Insights: Migration accelerates time-to-insights by dramatically reducing data latency. For example, data latency was reduced from 24 hours to approximately 2 hours for one logistics company, enabling fresher insights. Another financial services firm achieved 16 times fresher data after migrating, allowing them to launch personalized customer experiences with greater agility. In some cases, time-to-insight shifted from months to minutes.

Rapid Prototyping and Development: Cloud migration enables adaptable, lean teams to rapidly prototype for scale. This modern foundation simplifies AI/ML model building and deployment, allowing teams to go from prototype to production quickly. This led to a reduction in development time from months to weeks for one AI-driven platform.

Eliminating Data Silos and Fragmentation

A major obstacle for AI initiatives in legacy environments is data trapped in silos, or analytical data fragmented across multiple data warehouses and data lakes.

Data Unification: Cloud migration provides a path to a truly unified platform which eliminates the complexity and latency caused by stitching together separate data warehouses, data lakes, and AI tools. This helps organizations create a single source of truth for critical data. For instance, PayPal was able to unify 400PB of analytical data that was fragmented across various on-premises platforms.

Improved Access and Agility: Migration removes bottlenecks and breaks down silos, creating a pathway to real-time access that supports business agility and empowers users with intuitive, self-service data capabilities.

Enabling AI Readiness and Streamlined Workflows

Legacy systems often lack the AI readiness needed to leverage modern advancements. Cloud platforms are strategically chosen because they are future-ready and built for AI.

Integrated AI Ecosystem: Cloud migration facilitates seamless integration with generative AI (Gen AI) and machine learning (ML) workflows. Choosing a tightly integrated AI and data ecosystem significantly reduces friction and engineering work, accelerating AI feature development and deployment. Products such as BigQuery ML, Vertex AI, and Gemini are integrated, which simplifies building out AI/ML infrastructure (a minimal code sketch follows this list).

Accelerated Model Deployment: The unified and modern data architecture accelerates ML model development and deployment. For one logistics company, ML model retraining time was reduced from over two days to approximately one hour after migration.

Enabling New Use Cases: Migration unlocks new AI use cases and cloud-native project development that were previously limited by the outdated infrastructure.
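To make the “integrated ecosystem” point concrete, here is a minimal sketch of training and scoring a model entirely inside BigQuery using BigQuery ML from the Python client. The dataset, table, and column names are hypothetical placeholders, not examples taken from the case studies.

```python
# Minimal sketch: training and scoring a model with BigQuery ML from Python.
# Dataset/table/column names (analytics.transactions, churned, ...) are hypothetical.
from google.cloud import bigquery  # pip install google-cloud-bigquery

client = bigquery.Client()  # uses Application Default Credentials

# BigQuery ML trains a model with plain SQL, no separate ML stack to operate.
train_sql = """
CREATE OR REPLACE MODEL `analytics.churn_model`
OPTIONS (model_type = 'logistic_reg', input_label_cols = ['churned']) AS
SELECT
  customer_tenure_days,
  monthly_spend,
  support_tickets_90d,
  churned
FROM `analytics.transactions`
WHERE churned IS NOT NULL
"""
client.query(train_sql).result()  # blocks until the training job finishes

# Batch prediction is just another query against the trained model.
predict_sql = """
SELECT customer_id, predicted_churned
FROM ML.PREDICT(MODEL `analytics.churn_model`,
                (SELECT * FROM `analytics.transactions` WHERE churned IS NULL))
"""
for row in client.query(predict_sql).result():
    print(row.customer_id, row.predicted_churned)
```

The point of the sketch is that model training and batch prediction are SQL statements against the same warehouse that holds the data, so for this class of use case no separate training or serving infrastructure needs to be stitched in.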

Addressing Prohibitive Costs of Scaling Outdated Technology

The high costs of scaling outdated technology are a critical challenge associated with legacy infrastructure.

Cost Efficiency and Optimization: Migration drives AI innovation while delivering cost efficiency by modernizing legacy enterprise data warehouses (EDWs) and data lakes. It enables organizations to leverage cloud auto-scaling capabilities for cost optimization.

Strategic Debt Elimination: Migration offers an opportunity to strategically eliminate tech debt by decommissioning unused processes and assets (which can be up to 30% of database objects/daily processes), thus reducing scope and unnecessary data transfer costs and performance bottlenecks in the cloud.

Financial Accountability (FinOps): Modern cloud environments allow for granular cost visibility and enable FinOps (Financial Operations) functionality, which was often impossible on-premises. This allows companies to track consumption, hold teams accountable, and easily identify high-cost/low-value processes.
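As an illustration of the granular cost visibility that enables this FinOps-style accountability, the sketch below queries BigQuery’s job metadata views to rank recent query spend by team label and user. The project, the region qualifier (`region-us`), and the “team” label key are assumptions made for the sketch.

```python
# FinOps-style sketch: find the costliest BigQuery query jobs of the last 7 days.
# The "region-us" qualifier and the "team" label key are assumptions.
from google.cloud import bigquery

client = bigquery.Client()

cost_sql = """
SELECT
  COALESCE((SELECT value FROM UNNEST(labels) WHERE key = 'team'), 'unlabelled') AS team,
  user_email,
  COUNT(*) AS jobs,
  ROUND(SUM(total_bytes_billed) / POW(1024, 4), 2) AS tib_billed
FROM `region-us`.INFORMATION_SCHEMA.JOBS_BY_PROJECT
WHERE creation_time > TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 7 DAY)
  AND job_type = 'QUERY'
GROUP BY team, user_email
ORDER BY tib_billed DESC
LIMIT 20
"""

for row in client.query(cost_sql).result():
    print(f"{row.team:<15} {row.user_email:<35} {row.jobs:>5} jobs  {row.tib_billed} TiB billed")
```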

In essence, migrating from legacy infrastructure to the cloud is like moving from a fragmented, expensive, and slow private library built of old physical books to a unified, scalable digital library platform that comes pre-integrated with powerful, modern research assistants (AI tools), allowing teams to instantly access, analyze, and leverage information for fast innovation.

The migration of PayPal’s analytical data to a unified platform yielded several key results, focusing on performance, data freshness, and innovation capabilities:

Data Migration Scale: PayPal migrated 400PB of data across hundreds of thousands of tables and workloads within 18 months through enterprise-wide alignment. This data was previously fragmented across multiple on-premises platforms, including Teradata, Snowflake, Redshift, and Hadoop.

Data Freshness: The data is now over 16 times fresher after the migration.

Query Performance: PayPal saw 2.5x to 10x better query performance.

Data Duplication Reduction: There was a 99%+ reduction in data duplication.

Business Impact: The migration enabled PayPal to launch AI-based personalized customer experiences with greater scale, agility, and lower cost.

Strategic Goal: The core need and strategy were to unify 400PB of analytical data that was fragmented across multiple data warehouses and data lakes.

As Vaishali Walia, Sr. Director Engineering at PayPal, stated: “Migrating to BigQuery has been tremendous. We’ve seen huge gains in query performance, our data is over 16 times fresher, and we can now launch AI-based personalized experiences for our customers with greater scale, agility, and lower cost”.

The DBS Bank migration involved moving their existing data platform from Hadoop to BigQuery.

Key details about this migration type and approach include:

Source and Target Platform: The migration type was from Hadoop to BigQuery.

Need for Migration: DBS Bank needed to scale their costly and complex on-premises Hadoop ecosystem (which used Spark for workloads and had grown to over 6PB of data).

Architecture Type: They adopted a hybrid cloud architecture, combining their existing on-premises infrastructure (in Singapore) with a new Google Cloud platform (in Indonesia).

Strategy: To meet strict regulatory requirements, including data residency in Indonesia, DBS Bank used a parallel run strategy, operating both the on-premises platform and the new Google Cloud platform simultaneously.

Modernization Efforts: During the migration, DBS Bank modernized their data format from legacy Hive schemas to Apache Iceberg, which enables more efficient data handling, partitioning, and streamlined compaction in BigQuery.
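A rough sketch of what that modernization can look like on the BigQuery side is shown below: creating a BigQuery-managed Apache Iceberg table via DDL issued from the Python client. The connection name, bucket, region, and schema are hypothetical, and the exact DDL options may vary by BigQuery release.

```python
# Rough sketch: replacing a legacy Hive table with a BigQuery-managed
# Apache Iceberg table. Connection, bucket, and schema are hypothetical,
# and the DDL options may differ across BigQuery releases.
from google.cloud import bigquery

client = bigquery.Client()

iceberg_ddl = """
CREATE TABLE IF NOT EXISTS `lakehouse.trade_events`
(
  trade_id    STRING,
  booked_at   TIMESTAMP,
  amount_sgd  NUMERIC
)
WITH CONNECTION `my-project.asia-southeast2.lake-connection`
OPTIONS (
  file_format = 'PARQUET',
  table_format = 'ICEBERG',
  storage_uri = 'gs://my-lakehouse-bucket/iceberg/trade_events'
)
"""
client.query(iceberg_ddl).result()

# Data previously managed through Hive schemas can now be written to and queried
# in BigQuery, with Iceberg handling partition metadata and compaction.
```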

The CNA Insurance migration involved moving their data from Oracle Exadata to BigQuery.

CNA Insurance’s strategy was to modernize their aging on-premises Enterprise Data Warehouse (EDW). This legacy system, built on Oracle Exadata, faced issues such as capacity constraints, data quality and performance problems, and difficulties with SLA compliance, with some jobs taking weeks to run.

They chose Google Cloud and BigQuery as the foundation for their data modernization journey, prioritizing the transformation of business processes and user experience rather than a simple lift-and-shift. This move established a future-ready platform optimized for generative AI (Gen AI) advancements. CNA migrated data from over 100 source applications into the new cloud environment.

Quest Diagnostics is categorized within the Healthcare and life sciences industry.

As one of America’s leading diagnostic information services companies, Quest Diagnostics serves one-third of US adults annually and 50% of US hospitals and physicians. The company manages over 80 billion patient data points, including lab, pathology, and genomics data. Their data transformation focused on preparing for the opportunities presented by generative AI and agentic AI in the healthcare sector.

Two companies have migrated data from Teradata:

PayPal migrated its analytical data from multiple sources, including Teradata, Snowflake, Redshift, and Hadoop, to BigQuery. The goal was to unify 400PB of fragmented analytical data.

Intesa Sanpaolo migrated their legacy on-premises data service hub, which was built on Teradata and Hadoop, to BigQuery. The migration type is listed as “Teradata and Hadoop to BigQuery”.

The primary migration tool for moving data platforms to BigQuery is the BigQuery Migration Service.

Key information about this tool includes:

Comprehensive Suite: The BigQuery Migration Service offers a comprehensive suite of AI-powered tools, automated processes, and expert guidance designed to simplify, accelerate, and reduce the risk associated with migrating data warehouses to Google Cloud.

Automated Code Transpilation: The service is used for automated code transpilation, automatically converting legacy code for BigQuery. For example, PayPal utilized the BigQuery Migration Service for automated code transpilation from various source platforms, including Teradata, Snowflake, Redshift, and Hadoop (a programmatic sketch follows this list).

AI-Powered Translation: The service uses generative AI (Gen AI) which has learned from thousands of past projects to understand the intent of the old code and automatically translate it. This automation makes the migration process dramatically faster, more accurate, and less costly, requiring less manual effort and risk.

Optimization Integration: Companies like PayPal integrated optimization efforts directly into the migration process, working to enhance migration tools like the BigQuery Migration Service to generate optimized code upfront, which eliminated the need for later modification and re-testing.

Data Validation: PayPal also partnered with Google to enhance the Data Validation Tool alongside the BigQuery Migration Service.
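For readers who want to see what driving the service programmatically can look like, here is a minimal sketch that starts a batch Teradata-to-BigQuery SQL translation workflow with the BigQuery Migration API Python client. The project ID and Cloud Storage paths are placeholders, and the client surface should be verified against the current library documentation.

```python
# Sketch: starting a batch Teradata -> BigQuery SQL translation workflow with
# the BigQuery Migration API Python client (google-cloud-bigquery-migration).
# Project ID and gs:// paths are placeholders; verify against current docs.
from google.cloud import bigquery_migration_v2

def start_teradata_translation(project_id: str, gcs_input: str, gcs_output: str):
    client = bigquery_migration_v2.MigrationServiceClient()
    parent = f"projects/{project_id}/locations/us"

    # Source dialect: Teradata SQL scripts staged in Cloud Storage.
    source = bigquery_migration_v2.Dialect(
        teradata_dialect=bigquery_migration_v2.TeradataDialect(
            mode=bigquery_migration_v2.TeradataDialect.Mode.SQL
        )
    )
    target = bigquery_migration_v2.Dialect(
        bigquery_dialect=bigquery_migration_v2.BigQueryDialect()
    )

    task = bigquery_migration_v2.MigrationTask(
        type_="Translation_Teradata2BQ",
        translation_config_details=bigquery_migration_v2.TranslationConfigDetails(
            gcs_source_path=gcs_input,
            gcs_target_path=gcs_output,
            source_dialect=source,
            target_dialect=target,
        ),
    )

    workflow = bigquery_migration_v2.MigrationWorkflow(display_name="teradata-to-bq-demo")
    workflow.tasks["translation-task"] = task

    response = client.create_migration_workflow(
        request=bigquery_migration_v2.CreateMigrationWorkflowRequest(
            parent=parent, migration_workflow=workflow
        )
    )
    print("Started workflow:", response.name)

# Example (hypothetical paths):
# start_teradata_translation("my-project", "gs://my-bucket/teradata-sql", "gs://my-bucket/translated-sql")
```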

The main destination and foundational product used for the data migrations is BigQuery.

In all the detailed case studies provided, BigQuery is chosen as the unified platform or central analytics platform to replace the legacy data warehouses and data lakes.

BigQuery (The Destination Product)

BigQuery serves as the primary modernized data platform or unified EDW platform for organizations migrating from complex and fragmented legacy systems.

It is described as a truly unified platform that eliminates the complexity and latency caused by stitching together separate data warehouses, data lakes, and AI tools.

It is specifically chosen by companies like PayPal, DBS Bank, CNA Insurance, Quest Diagnostics, and Intesa Sanpaolo as the core engine to modernize their data operations and enable AI.

BigQuery Migration Service (The Migration Tool)

While BigQuery is the destination product, the specific product/tool used to facilitate and accelerate the migration process itself is the BigQuery Migration Service.

The BigQuery Migration Service is described as a comprehensive suite of AI-powered tools, automated processes, and expert guidance designed to simplify, accelerate, and reduce the risk of moving data warehouses to Google Cloud.

It uses generative AI (Gen AI) to automatically translate complex legacy code, making the process faster, more accurate, and less costly than manual rewriting.

Companies like PayPal adopted the BigQuery Migration Service for automated code transpilation from different sources (Teradata, Snowflake, Redshift, and Hadoop).

Intesa Sanpaolo also relied heavily on the full BigQuery Migration Service suite to automate and accelerate their data ecosystem migration.

The service that automates code for data migration to BigQuery is the BigQuery Migration Service.

Here are the details regarding how this service automates the code translation process:

Automated Process: The BigQuery Migration Service is a comprehensive suite of tools and automated processes designed to simplify and accelerate the migration of data warehouses.

Code Transpilation: It is specifically used for automated code transpilation, converting code from various source platforms.

AI-Powered Translation: The service utilizes generative AI (Gen AI), which has been trained on thousands of past projects to understand the intent of the old code. This allows the service to automatically translate the complex legacy code.

Benefits of Automation: This automation makes the migration process dramatically faster, more accurate, and less costly compared to the traditional method where experts had to manually rewrite complex code by hand. It significantly reduces manual effort and risk.

Companies like PayPal adopted the BigQuery Migration Service for automated code transpilation from platforms such as Teradata, Snowflake, Redshift, and Hadoop. Intesa Sanpaolo also relied heavily on the full BigQuery Migration Service suite to automate and accelerate their data ecosystem migration.

The service that simplifies data warehouse migration is the BigQuery Migration Service.

This service is specifically designed to make the often complex process of moving data warehouses to Google Cloud simpler, faster, and less risky.

Key characteristics showing how it simplifies the migration:

Comprehensive Suite: The BigQuery Migration Service provides a comprehensive suite of AI-powered tools, automated processes, and expert guidance.

Reduced Risk and Acceleration: It is designed to simplify, accelerate, and reduce the risk of moving data warehouses to Google Cloud.

Automated Code Translation: Historically, migrations required experts to slowly and manually rewrite complex code. The BigQuery Migration Service simplifies this by using generative AI (Gen AI). This AI learns from thousands of past projects to understand the intent of the old code and automatically translate it.

Efficiency and Cost Reduction: This automated process makes the entire migration dramatically faster, more accurate, and less costly, requiring far less manual effort and risk.

Predictability and De-risking: The service offers a proven framework that aims to eliminate surprises, making the project predictable and on-budget. It provides a de-risked and future-proof migration path.

Companies like Intesa Sanpaolo relied heavily on the full BigQuery Migration Service suite to automate and accelerate their data ecosystem migration.

The company that migrated data from Oracle Exadata to BigQuery is CNA Insurance.

CNA Insurance migrated their aging on-premises Enterprise Data Warehouse (EDW), which was built on Oracle Exadata. The company undertook this modernization effort because the legacy system faced issues such as capacity constraints, data quality and performance problems, and difficulties meeting SLA compliance.

Several features and integrated products linked to BigQuery aid in data discovery and organization. The most explicit are data cataloging and the use of Gemini (generative AI) for natural language interaction:

Data Cataloging (Dataplex Universal Catalog): Intesa Sanpaolo, when modernizing their platform, specifically utilized cloud technologies like Data Catalog. The products listed for their migration include Dataplex Universal Catalog alongside BigQuery.

Improved Cataloging: CNA Insurance’s migration results included improved data organization, structure, and cataloging for enhanced user experience.

Discoverability through Governance: Choreograph prioritized strong data governance, which included setting up predefined structures for discoverability.

Natural Language Interaction via Catalog and AI: Quest Diagnostics established a “Quest Data Product Catalog”. This catalog was used to enable natural language interaction with Gemini in BigQuery for simplified data access, analysis, and extraction, which dramatically accelerates the ability of business users to derive insights.
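As a small illustration of programmatic discoverability, the sketch below searches catalog metadata for BigQuery tables by keyword using the Data Catalog API client. The project ID and search terms are assumptions, and newer deployments may use the Dataplex Universal Catalog surface instead.

```python
# Minimal sketch: searching catalog metadata for BigQuery tables by keyword.
# Project ID and search terms are assumptions; newer environments may use the
# Dataplex Universal Catalog surface rather than this Data Catalog API client.
from google.cloud import datacatalog_v1

client = datacatalog_v1.DataCatalogClient()

scope = datacatalog_v1.SearchCatalogRequest.Scope()
scope.include_project_ids.append("my-project")

# Find BigQuery tables whose name or description mentions "patient".
results = client.search_catalog(scope=scope, query="system=bigquery type=table patient")

for result in results:
    print(result.relative_resource_name, "->", result.linked_resource)
```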

Data preparation is streamlined primarily by BigQuery itself, thanks to its ability to handle data flexibly and natively, which eliminates the need for extensive pre-processing outside the platform.

Key ways BigQuery and related integrated tools streamline data preparation include:

Native Handling of Data Formats: BigQuery’s architecture eliminates a major pain point where previous platforms required data to be “massaged” before use. By choosing BigQuery, organizations prioritize a platform that can natively ingest, aggregate, and query diverse data formats, allowing teams to start working with the data immediately and eliminating time-consuming manual processes (see the load sketch after this list).

Platform Unification: BigQuery functions as a truly unified platform that avoids the complexity and latency associated with stitching together separate data warehouses, data lakes, and AI tools. This unification inherently streamlines the overall Extract, Transform, Load (ETL) and data preparation pipeline.

Modernization Opportunities: During migration, organizations streamline data preparation by adopting modern standards on BigQuery, such as enhancing data management by moving from legacy Hive schemas to Apache Iceberg on BigQuery, which enables more efficient data handling, improved partitioning, and streamlined compaction.

Usage of External Processing Tools: Companies also pair BigQuery with specific data processing components, such as PySpark, as part of their modernization.

Data Rationalization: Streamlining data preparation is also achieved through efforts like strategic data rationalization, where organizations reduce the scope of preparation by eliminating unused data items and decommissioning redundant daily data processes and database objects.
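To make the “ingest natively, skip the massaging” point concrete, the sketch below loads Parquet files straight from Cloud Storage into a BigQuery table in a single step; the bucket and table names are placeholders.

```python
# Sketch: loading Parquet from Cloud Storage directly into BigQuery,
# with no external pre-processing step. Bucket/table names are placeholders.
from google.cloud import bigquery

client = bigquery.Client()
table_id = "my-project.staging.shipments"

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.PARQUET,  # CSV, JSON, Avro, ORC also supported
    write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
)

load_job = client.load_table_from_uri(
    "gs://my-landing-bucket/shipments/*.parquet", table_id, job_config=job_config
)
load_job.result()  # wait for the load job to complete

table = client.get_table(table_id)
print(f"Loaded {table.num_rows} rows into {table_id}")
```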

One major benefit of BigQuery is that it provides a truly unified data platform, which eliminates complexity and latency.

This unification brings several related benefits:

Simplifies Architecture: BigQuery achieves this unification by acting as a single platform, eliminating the need to stitch together separate data warehouses, data lakes, and AI tools.

Accelerates AI: This unified platform gives enterprises a direct path to scale data science and multimodal AI across all data types without having to manage disparate systems.

Performance and Efficiency: Migrating to BigQuery and Vertex AI can deliver up to four times faster performance and up to three times more cost efficiency compared to other data and AI platforms.

Other key benefits realized by companies using BigQuery include:

Improved Query Performance: PayPal saw 2.5x to 10x better query performance after migrating to BigQuery.

Faster Time-to-Insight: BigQuery migration can accelerate time-to-insights by improving data access. Quest Diagnostics, for example, achieved a reduction in time-to-insight from months to minutes.

Cost Efficiency and Modernization: BigQuery helps organizations drive AI innovation and cost efficiency by modernizing legacy Enterprise Data Warehouses (EDWs) and data lakes. MarketMind achieved significant cost-effectiveness through BigQuery’s scalable, pay-as-you-go model.

Increased Data Freshness: PayPal reported their data being over 16 times fresher after migrating to BigQuery.

Enhanced Integration: BigQuery offers seamless integration with the broader Google Cloud platform ecosystem, including generative AI and ML workflows, which is often difficult to replicate efficiently on-premises.

Download the ebook here