AWS Cloud Enterprise Strategy Blog

Database Freedom Is Real: The Autodesk Story

As a former AWS customer and current enterprise strategist for Amazon Web Services (AWS), I’ve seen that although no two companies are the same, IT challenges are remarkably similar across industries.

One of the challenges I faced in my past roles was the sheer amount of time, cost, and frustration involved in managing database software licensing, which, despite near-heroic efforts to keep it correct, would often result in the dreaded “audit notification letter” from legacy database software vendors. These letters added zero value to my organization. Sadly, they are a classic supplier-first strategy, one that the customers I speak with today are rightly tired of. Thankfully, in 2020, database choices and pay-as-you-go models have become customer first and customer obsessed.

Nowadays, we can choose from relational, key-value, document, in-memory, graph, time-series, and ledger databases, all of which are designed to scale massively, be secure from the ground up, and, perhaps most important to customers, have a “pay for what you use” model. How things have changed in the last ten years. The concept of “database freedom” that has emerged is a recurring topic in my customer meetings. In particular, for the many incumbent on-premises workloads running relational databases on Oracle or Microsoft SQL Server, the ability to migrate to Amazon Aurora, with its open-source-compatible PostgreSQL and MySQL engines, is extremely attractive.

My teammate Joe Chung and I had the privilege of sharing our personal migration experiences on stage with Nike at re:Invent in 2018. We covered a number of hot areas, including the use of AWS Database Migration Service and the AWS Schema Conversion Tool. Migration and conversion questions take up a significant part of the time I spend with customers. And one of the most interesting, zero-downtime migration of data warehouses, is now a reality, as the blog post “How to migrate a large data warehouse from IBM Netezza to Amazon Redshift with no downtime” shows.

However, I would like to pause here and hand the proverbial reins over to Reddy Srinivas Chilveru, who is the senior manager of cloud management and database services at Autodesk. Autodesk is headquartered in California and has over 10,000 employees worldwide, over $3.2B in revenue, and a subscription base of 4.9 million users globally.

Jonathan
Twitter | LinkedIn | Blogs

Reception area at Autodesk Singapore

In the Words of Reddy Srinivas Chilveru,
Senior Manager of Cloud Management and Database Services, Autodesk

I am a senior manager of cloud management and database services at Autodesk, located in our Singapore office. My team’s goal is to work with developers, architects, and application owners to build a lasting database strategy. My team is also responsible for Autodesk’s cloud management services, including multi-cloud account provisioning, advising the cloud community on AWS services, and ensuring that cloud services are used within compliance guidelines. Our team enables subscribers to use Autodesk software with the confidence of a strong database infrastructure.

Just as AWS is for builders, Autodesk makes software for people who make things, helping people turn ideas into new realities that shape the future. If you’ve ever driven a high-performance car, admired a towering skyscraper, used a smartphone, or watched a great film, chances are you’ve experienced what Autodesk customers create using Autodesk’s design, make, and build software. Over thirty-five years ago, Autodesk started making desktop software aimed at the architecture, construction, engineering, manufacturing, and media and entertainment industries. Autodesk then evolved to provide software services hosted in data centers. To streamline development and time to market, about seven years ago we embarked on our journey to the cloud, expanding our use of Amazon Web Services to new workloads, and in 2018 we started to decrease our data center footprint.

Autodesk has had a long-term goal to divest from data centers and embrace the cloud for its scalability, flexibility, reliability, and security to support its new subscription business model. As part of that long-term goal, our CIO, Prakash Kota, asked my team (Data Services) to analyze what that would mean for our data center–hosted commercial databases, in terms of both scalability and cost.

To give our customers the power to work more quickly, effectively, and sustainably throughout their entire project lifecycles, we knew we needed to reimagine our infrastructure to support this complex, exponentially growing data. We had noticed that the traditional database landscape and our data center setup were no longer meeting our needs. My team and I determined that we needed to explore open-source, cloud-agnostic solutions.

The traditional commercial databases we were provisioning until 2017 were no longer easy to manage, given the exponential growth expected from new business models and cloud-unfriendly database licensing practices. Running commercial databases would have doubled our support and licensing costs. A decade ago, commercial databases like Oracle and Microsoft SQL Server were the default choice for any enterprise or non-enterprise application. Cloud and open-source databases were not mature enough, or did not exist at all, back then; now the game has changed, and there are many options available.

With Autodesk’s transition to a subscription model five years ago, we’ve seen a more than 12x increase in data and a similar increase in the compute capacity needed to process it. Our analyses quickly showed that moving to the cloud was the only option to support Autodesk’s new business model. The cloud has allowed us to store more data for customer insights and product improvements. But, even more, it gave us the opportunity to think about a new, fit-for-purpose database strategy.

As the leader of the team, I looked at my team members’ skill sets and immediately knew that we needed external help with our initial assessment to ensure that our thought process was sound. I try to actively assess and stress test my ideas to confirm that my planned approach is the right one. In parallel, we planned to upskill the team through AWS training. Before we started this journey, none of my team members were trained or certified, and 95% of our database footprint was in data centers. We partnered with AWS for an initial assessment and feasibility study, which gave us the confidence that we were on the right path.

To move the migration project forward, we followed a five-step process to methodically approach the challenge at hand:

Step 1: We first went through a Discovery process to understand our database inventory and usage. This was done through a combination of automated and manual efforts.
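To give a flavor of the automated side of discovery, here is a minimal sketch in Python. It is an illustration of the idea, not the tooling we actually used, and the host names are placeholders: it simply probes candidate hosts for the default listener ports of the engines discussed in this post.

import socket

# Default listener ports for common database engines.
DB_PORTS = {1521: "Oracle", 1433: "SQL Server", 3306: "MySQL", 5432: "PostgreSQL"}

def scan_host(host, timeout=1.0):
    """Return the database engines apparently listening on a host."""
    found = []
    for port, engine in DB_PORTS.items():
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
            sock.settimeout(timeout)
            if sock.connect_ex((host, port)) == 0:  # 0 means the port accepted a connection
                found.append(engine)
    return found

for host in ["db-host-01.example.internal", "db-host-02.example.internal"]:
    print(host, scan_host(host))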

Step 2: Once we had a full understanding of the scope of our databases, we tracked database and application usage to identify Dormant databases. This was critical, given that dormant databases could become potential security and GDPR risks. This was done through automation, and we used a few third-party tools, including SolarWinds DPA and Denodo.
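The dormancy checks themselves lived in those third-party tools, but the underlying idea can be sketched simply. On PostgreSQL, for example, pg_stat_database keeps cumulative transaction counters per database; if the counters do not move between two snapshots taken days apart, nothing is using the database. The snippet below is an illustration of that idea, not the DPA/Denodo workflow, and assumes the psycopg2 driver.

import psycopg2  # assumed driver; any PostgreSQL client would do

QUERY = """
    SELECT datname, xact_commit + xact_rollback AS transactions
    FROM pg_stat_database
    WHERE datname NOT IN ('template0', 'template1', 'postgres');
"""

def snapshot(dsn):
    """Capture per-database cumulative transaction counts."""
    with psycopg2.connect(dsn) as conn, conn.cursor() as cur:
        cur.execute(QUERY)
        return dict(cur.fetchall())

def dormant(before, after):
    """Databases whose counters did not move between the two snapshots."""
    return [db for db, count in after.items() if before.get(db) == count]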

Step 3: We collaborated with application owners to Divest the dormant applications and databases. My team worked closely with customers and assured them that if they needed applications or databases reinstated, they could be provisioned in less than twenty-four hours.
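What makes a twenty-four-hour reinstatement promise cheap to keep is a snapshot-then-delete pattern. As a hedged sketch, assuming the database in question had already landed in Amazon RDS and using illustrative identifiers, the boto3 calls look roughly like this:

import boto3

rds = boto3.client("rds")

def divest(instance_id):
    """Take a final snapshot, then delete the dormant instance."""
    rds.delete_db_instance(
        DBInstanceIdentifier=instance_id,
        SkipFinalSnapshot=False,
        FinalDBSnapshotIdentifier=f"{instance_id}-final",
    )

def reinstate(instance_id):
    """Restore the instance from its final snapshot on request."""
    rds.restore_db_instance_from_db_snapshot(
        DBInstanceIdentifier=instance_id,
        DBSnapshotIdentifier=f"{instance_id}-final",
    )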

Step 4: For the data that was still current and needed, we moved on-premises database platforms to cloud-agnostic databases like MySQL and PostgreSQL through coordinated Data Migrations. Typical migrations took about two to three weeks; small databases with a smaller user base took about a week, while others took as long as three to six months. The AWS Schema Conversion Tool (SCT) and AWS Database Migration Service (DMS) were very useful and were sufficient for most migrations.
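As an illustration of the DMS side of such a migration, the sketch below creates a full-load replication task with boto3. The three ARNs are placeholders for resources created beforehand (source and target endpoints and a replication instance), and schema conversion itself happens in SCT before this step.

import json
import boto3

dms = boto3.client("dms")

# Copy every table in every schema; narrower selection rules are common.
table_mappings = {
    "rules": [{
        "rule-type": "selection",
        "rule-id": "1",
        "rule-name": "include-all",
        "object-locator": {"schema-name": "%", "table-name": "%"},
        "rule-action": "include",
    }]
}

task = dms.create_replication_task(
    ReplicationTaskIdentifier="oracle-to-postgresql-full-load",
    SourceEndpointArn="arn:aws:dms:...:endpoint:SOURCE",    # placeholder
    TargetEndpointArn="arn:aws:dms:...:endpoint:TARGET",    # placeholder
    ReplicationInstanceArn="arn:aws:dms:...:rep:INSTANCE",  # placeholder
    MigrationType="full-load",
    TableMappings=json.dumps(table_mappings),
)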

Step 5: We approached the data migrations with Downtime-zero as the goal where feasible. We took downtime only when an application could afford it, such as applications that are neither production nor enterprise. For all others, we kept the source and target databases in sync with AWS DMS and broke the sync gracefully during cutover windows.
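Mechanically, keeping source and target in sync means running the DMS task in full-load-and-cdc mode so it keeps replaying changes after the initial copy; “breaking the sync gracefully” is then just stopping the task once the application has been repointed. A minimal sketch of that cutover, with a placeholder task ARN:

import boto3

dms = boto3.client("dms")
TASK_ARN = "arn:aws:dms:...:task:EXAMPLE"  # placeholder

def cutover():
    # Confirm the task has finished its full load and is applying changes.
    task = dms.describe_replication_tasks(
        Filters=[{"Name": "replication-task-arn", "Values": [TASK_ARN]}]
    )["ReplicationTasks"][0]
    if task["Status"] != "running":
        raise RuntimeError(f"task not in CDC phase: {task['Status']}")
    # Writes are frozen and the application repointed outside this sketch.
    dms.stop_replication_task(ReplicationTaskArn=TASK_ARN)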

It took us three years to complete this project. The first nine months were spent on assessments (steps one through three) and the following two-plus years on the actual migrations (steps four and five). It would not have been possible without my team members Aman Kharbanda and Shabeer Ahmed, experienced database administrators who embraced AWS technologies in a very short period of time. As we moved the project forward, I developed an appreciation of how much top-down leadership alignment and coordination were needed for teams to see how this project would bring value to our customers. Aman and Shabeer helped me communicate that this was not simply a cost-savings effort; it was a modernization effort and a true improvement of our customer experience.

By migrating off commercial databases, Autodesk has saved 75% on support renewal costs. We have modernized our data centers and technology platform by reducing complexity through consolidation, optimization, automation, and the adoption of cloud-agnostic databases. We’ve seen at least a 30% improvement in performance, a 90% reduction in provisioning time, and, above all, a transformed team that enabled this modernization.

Jonathan Allen

Jonathan joined AWS as an Enterprise Strategist & Evangelist in May 2017. In this role, he works with enterprise technology executives around the globe to share experiences and strategies for how the Cloud can help them increase speed and agility while devoting more of their resources to their customers. Prior to joining AWS, Jonathan was Chief Technology Officer and Senior Director in Capital One Bank's UK division. Jonathan was part of the bank's Global Technology Leadership team that selected AWS as its predominant cloud partner in 2014, and he was accountable for the architecture, engineering, and execution of the technical build-out and system migrations of the bank's AWS Cloud strategy, in partnership with the US divisions, until 2017, by which time all development had moved to Cloud First. Jonathan managed a global team and held budgetary responsibility for technology operations and strategy execution, the adoption of agile, technical talent transformation and recruitment, and the creation of the bank's Cloud Governance framework. During Jonathan's 17 years at Capital One, he also led large-scale transformations, including the rollout of regulatory compliance, the move from outsourcing to out-tasking, engagement with AWS Cloud Partners, the adoption of DevOps at scale, and the fostering of an engineering-led culture. In 2012, he was awarded IT Manager of the Year by The Chartered Institute for IT. He holds a Diploma in Computer Studies from Loughborough College and a CIO MBA from Boston University.