Capgemini’s Data Transformation offering accelerates data migration to AWS
Capgemini’s Data Transformation offering provides a suite of tools and accelerators covering every phase of migrating data and data-tier logic from on-premises technologies such as Teradata, Netezza, Hadoop, and relational databases (RDBMS) to AWS. Target destinations include AWS services such as Amazon Redshift, Amazon Elastic MapReduce (Amazon EMR), Amazon Simple Storage Service (Amazon S3), and Amazon Relational Database Service (Amazon RDS), as well as third-party solutions like Snowflake on AWS.
The offering uses Capgemini's Leap Data Transformation Framework, which includes solution accelerators for the data discovery, data migration, and management phases. Within the discovery phase, there are accelerators for discovering data assets, categorizing objects, analyzing access patterns, and applying statistical algorithms. For the migration phase, there are pre-built accelerators that automate schema and script conversion, transform ETL pipelines, migrate SQL-based source objects to cloud DBMS objects, and automate the testing of database objects, scripts, and data. Additionally, the management phase provides integrated DevOps solution accelerators that build, test, and deploy code faster, as well as provide overall code pipeline monitoring assistance.
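To illustrate what automated schema conversion involves, the following is a minimal sketch of translating Teradata column types to Amazon Redshift equivalents. The type mapping table and function names here are illustrative assumptions for demonstration, not Capgemini's actual accelerator logic.

```python
# Sketch of automated schema conversion: mapping Teradata column types
# to Amazon Redshift equivalents. The mapping is simplified and
# hypothetical, not the actual accelerator implementation.

# Hypothetical Teradata -> Redshift type mapping (simplified).
TYPE_MAP = {
    "BYTEINT": "SMALLINT",     # Redshift has no 1-byte integer type
    "INTEGER": "INTEGER",
    "DECIMAL": "DECIMAL",
    "VARCHAR": "VARCHAR",
    "CLOB": "VARCHAR(65535)",  # Redshift has no CLOB; use max-length VARCHAR
    "TIMESTAMP": "TIMESTAMP",
}

def convert_column(name: str, td_type: str) -> str:
    """Translate one Teradata column definition to Redshift DDL."""
    base = td_type.split("(")[0].upper()
    suffix = td_type[len(base):]  # keep precision/length, e.g. "(10,2)"
    rs_type = TYPE_MAP.get(base, td_type)
    # If the mapped type already carries a length, drop the original suffix.
    if "(" in rs_type:
        suffix = ""
    return f"{name} {rs_type}{suffix}"

def convert_table(table: str, columns: list[tuple[str, str]]) -> str:
    """Emit a Redshift CREATE TABLE statement for the converted columns."""
    cols = ",\n  ".join(convert_column(n, t) for n, t in columns)
    return f"CREATE TABLE {table} (\n  {cols}\n);"

ddl = convert_table("sales", [
    ("id", "INTEGER"),
    ("amount", "DECIMAL(10,2)"),
    ("notes", "CLOB"),
])
print(ddl)
```

A production conversion tool would also handle indexes, distribution and sort keys, and unsupported features, which is where the bulk of the automation value lies.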
Capgemini is an APN Premier Consulting Partner and has achieved AWS Financial Services Competency. Competency Partners have demonstrated technical proficiency and proven customer success.
Capgemini’s Data Transformation offering provides a suite of tools for data architects and developers to automate data migration from on-premises environments to AWS. The automated tools remove the risk of errors introduced by manual intervention and reduce the effort, time, and cost of migrating data to AWS.
The offering allows customers to:
- Discover data assets, categorize objects, analyze access patterns, and apply statistical algorithms
- Convert schema and scripts from Teradata and other sources to target models
- Transform ETL pipelines by converting ETL source-target mappings to Spark code, using template-based automated transformation of ETL scripts
- Convert SQL-based source objects to cloud DBMS objects
- Automate testing of database objects, scripts, and data
- Build, test, and deploy code faster and monitor overall code pipeline using integrated DevOps solution accelerators
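The template-based ETL transformation described above can be sketched as a small code generator that turns a declarative source-to-target mapping into a PySpark script. The mapping format, template, and S3 paths below are illustrative assumptions, not the offering's actual formats.

```python
# Sketch of template-based ETL transformation: generating PySpark code
# from a declarative source-to-target mapping. Mapping schema, template,
# and paths are hypothetical examples, not the actual accelerator.

SPARK_TEMPLATE = """\
df = spark.read.format("{fmt}").load("{source}")
df = df.selectExpr({exprs})
df.write.mode("overwrite").parquet("{target}")
"""

def generate_spark_job(mapping: dict) -> str:
    """Render a PySpark script from one source-to-target mapping."""
    # Each entry maps a target column name to a SQL expression
    # evaluated over the source columns.
    exprs = ", ".join(
        f'"{expr} AS {col}"' for col, expr in mapping["columns"].items()
    )
    return SPARK_TEMPLATE.format(
        fmt=mapping.get("format", "csv"),
        source=mapping["source"],
        target=mapping["target"],
        exprs=exprs,
    )

job = generate_spark_job({
    "source": "s3://example-bucket/raw/orders.csv",
    "target": "s3://example-bucket/curated/orders",
    "columns": {
        "order_id": "CAST(id AS BIGINT)",
        "total": "price * qty",
    },
})
print(job)
```

Generating jobs from mappings rather than hand-writing them is what makes the transformation repeatable across hundreds of ETL pipelines.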