AWS Storage Blog

Increasing agility and reducing costs with AWS Storage during the COVID-19 pandemic

The COVID-19 pandemic upended the way of life for countries, organizations, communities, families, and the people that drive the global economy. Much of the world has shifted to remote work and school, and to a different way of life with essential workers at the forefront. We see industries dealing with the challenges of slowed growth as they seek to minimize discretionary expenditures, preserve cash, and pause major projects or investments while optimizing and automating variable spending (such as cloud computing). These organizations must significantly lower storage costs, and in particular they must avoid additional storage hardware refresh purchases. We also see other organizations – in industries like video conferencing, remote learning, and telehealth – dealing with massive increases in demand that they must scale to support almost overnight. These customers need the agility to scale storage up and down as their business dynamics change with the pandemic. AWS Storage has millions of customers across the globe, of all sizes, in industries experiencing large-scale changes to their operations, budgets, and revenue due to the pandemic.

The uncertainty in the current environment can cause hesitation to move forward, but now is not the time to become overly conservative. It is more critical than ever to make the right storage decisions to save on costs, increase agility, and lower your overall TCO. Data drives informed decisions and business insights, and it gives you the agility to pursue your core mission. That doesn't change in today's world; if anything, it is even more true in uncertain times. On AWS, organizations no longer need to predict future storage capacity; they are adapting to the pandemic-induced disruption by leveraging the operational agility and economic advantages of AWS's consumption-based model. According to IDC, enterprises expect 30–40% data growth annually, and by 2023 more than 40% of the world's data will be stored in hyperscale/cloud data centers.[1] Data growth can't be stalled or suppressed; it acts as a compass for what will happen in the future, and the best way to navigate the uncertainty is to shift to cloud storage on AWS. AWS Storage solutions help you optimize storage costs by giving you the agility to select from a portfolio of services and pay for the value you derive from your data usage. You are facing unique challenges during this time, but by optimizing storage costs, you can free up resources from managing storage infrastructure to focus on your current priorities.

Our customers represent many industries and stages of growth, which gives us a unique perspective on common issues, solutions, and cost optimizations in this time of disruption. In this blog post, I want to share insights, lessons, and recommendations directly from these customers. The testimonials presented can help you increase your business's agility by moving storage to AWS, or realize cost savings by undertaking similar cost optimizations. We hope these observations help your company address the data challenges of today and prepare for the future. Here's what we learned…

Increased agility through AWS Storage

Nasdaq is a multinational financial services and technology corporation that owns and operates the Nasdaq Stock Market. Nasdaq built a data lake on Amazon S3, which enables the company to separate compute and storage, and is using Amazon Redshift Spectrum, an Amazon Redshift feature that powers a lake house architecture, to query data both in the data warehouse and in the S3 data lake. Nasdaq can now flex its compute layer to support the volume of transactions, with the data lake built on Amazon S3 easily supporting data that continues to grow in volume and complexity. For example, market volatility spiked in late February 2020, at the beginning of the COVID-19 pandemic, and the solution scaled to support ingestion of 70 billion records daily, with a peak volume of 113 billion.

“We were able to easily support the jump from 30 billion records to 70 billion records a day because of the flexibility and scalability of Amazon S3 and Amazon Redshift.” – Robert Hunt, vice president of software engineering for Nasdaq.

Nasdaq can easily and quickly scale its environment down to ensure that there is no idle capacity when the market adjusts again. Additionally, Nasdaq uses Amazon S3 to store critical financial data and uses S3 Lifecycle policies to transition the data to S3 Glacier, where it can be archived at a lower cost.
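The lifecycle-based archiving Nasdaq describes is configured as an S3 Lifecycle rule. Below is a minimal sketch of building and applying such a rule with boto3; the bucket name, prefix, and 90-day threshold are illustrative assumptions, not Nasdaq's actual configuration.

```python
def glacier_lifecycle_rule(prefix, days):
    """Build an S3 Lifecycle rule that transitions objects under `prefix`
    to the S3 Glacier storage class after `days` days."""
    return {
        "ID": f"archive-{prefix.strip('/') or 'all'}",
        "Filter": {"Prefix": prefix},
        "Status": "Enabled",
        "Transitions": [{"Days": days, "StorageClass": "GLACIER"}],
    }


def apply_lifecycle(bucket, rules):
    """Apply the rules to a bucket (requires AWS credentials)."""
    import boto3

    s3 = boto3.client("s3")
    s3.put_bucket_lifecycle_configuration(
        Bucket=bucket,
        LifecycleConfiguration={"Rules": rules},
    )


# Hypothetical example: archive financial records after 90 days.
rule = glacier_lifecycle_rule("financial-records/", 90)
```

Once applied, S3 transitions matching objects automatically; no application changes or data migration jobs are needed.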

 

Embark is building self-driving truck technology to make roads safer and improve the efficiency of transportation. When the COVID-19 pandemic hit, Embark decided to pause its truck operations in order to align with its social responsibility to public health and ensure the safety of its workforce. This had an immediate effect on Embark's engineering development, with teams no longer able to use on-the-road testing to gather more data and measure performance. Embark's response was to turn to its petabytes of historical data on Amazon S3 and develop systems to mine that data more deeply. Engineers began poring through thousands of hours of driving data from past years to find scenarios of interest, and used this data to build stronger simulations against which they could test their system. With all of Embark's data stored in the S3 Intelligent-Tiering storage class, Embark didn't have to spend time deciding which data should be available or how to move data between storage tiers to optimize costs while still enabling this sudden pattern of random access into its data lake. S3 Intelligent-Tiering did the cost-optimization work for them, so the team could focus all of its engineering effort on building better data pipelines and simulation systems. With the help of AWS, Embark's team quickly adapted to the challenges of the pandemic, and when the pause was lifted, it was able to continue its focus on delivering the safety and efficiency benefits of self-driving trucks.
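Using S3 Intelligent-Tiering, as Embark does, is a per-object choice made at upload time; S3 then moves each object between access tiers automatically based on observed access patterns. A minimal sketch follows, with hypothetical bucket and key names (Embark's actual naming is not public).

```python
def intelligent_tiering_put_args(bucket, key, body):
    """Build the put_object arguments for an S3 Intelligent-Tiering upload."""
    return {
        "Bucket": bucket,
        "Key": key,
        "Body": body,
        "StorageClass": "INTELLIGENT_TIERING",
    }


def upload(args):
    """Perform the upload (requires AWS credentials)."""
    import boto3

    boto3.client("s3").put_object(**args)


# Hypothetical example: a recorded driving log lands directly in
# Intelligent-Tiering; no later re-tiering decisions are needed.
args = intelligent_tiering_put_args("drive-logs", "2020/02/run-0001.bag", b"...")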

 

Experian created an award-winning data analysis tool for the financial industry that grew revenue 540 percent (from $50 million in 2018 to more than $270 million in just the first quarter of 2020) and reduced its production cycle from years to under 90 days, all by using AWS services, especially Amazon EBS. The credit reporting company used a highly sophisticated, interconnected combination of Amazon EBS and other AWS services for big data management, top-level security, and computing power, not only to quickly launch Ascend, an analytics platform that adjusts to customers' needs, but also to quickly deploy new products on the platform. Using the Ascend platform, Experian rolled out a COVID-19 pandemic financial data tool in just 30 days.

“Amazon EBS enabled us to add more data and quickly expand the volume of the data. That is a huge game changer for us in how we react to market conditions, especially during COVID.” – Moied Wahid, vice president of Ascend platform engineering, Experian

 

Splunk turns “data into doing” with the Data-to-Everything Platform. Splunk technology is designed to investigate, monitor, analyze, and act on data at any scale.

“Splunk uses Amazon EBS to deliver high-performance storage for Splunk Cloud service. Amazon EBS keeps our customers’ data secure and highly available, which has grown increasingly important in recent months as we have seen a spike in customer traffic. Splunk has been building hundreds of EBS volumes a day, while running tens of thousands of volumes concurrently. One of our new solutions called Splunk Remote Work Insights – launched to help ease uncertainty and tackle challenges presented by a rapidly growing remote workforce – is backed by EBS. We easily scaled both of these AWS services to meet the increased demand. Splunk also uses EBS as the backbone for our trial offering allowing customers to try Splunk Cloud in a quick and easy manner. We run over 11 PB of EBS and have no issues scaling globally across all AWS Regions. Also, EBS snapshots are a key component of our overall backup and resiliency strategy. EBS meets the scale we need to run our business, especially as we see spikes in our workloads.”

– Sendur Sellakumar, SVP, Cloud and Chief Product Officer, Splunk
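A backup strategy built on EBS snapshots, like the one Splunk mentions, typically automates snapshot creation per volume with tags for tracking and retention. The sketch below shows the shape of such a request; the volume ID and tag values are illustrative assumptions, not Splunk's actual setup.

```python
def snapshot_request(volume_id, purpose):
    """Build the create_snapshot arguments for one EBS volume,
    tagging the snapshot so retention tooling can find it later."""
    return {
        "VolumeId": volume_id,
        "Description": f"backup of {volume_id}",
        "TagSpecifications": [{
            "ResourceType": "snapshot",
            "Tags": [{"Key": "purpose", "Value": purpose}],
        }],
    }


def take_snapshot(req):
    """Create the snapshot (requires AWS credentials)."""
    import boto3

    return boto3.client("ec2").create_snapshot(**req)


# Hypothetical example: a daily backup of one volume.
req = snapshot_request("vol-0123456789abcdef0", "daily-backup")
```

EBS snapshots are incremental, so repeating this daily stores only the blocks that changed since the previous snapshot.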


Druva offers a SaaS platform for data protection across data centers, cloud applications, and endpoints.

“2020 has become an inflection point for many of our customers, who took action and migrated to the cloud in a matter of days. Our customers have seen several spikes in data that needs to be protected, and we were able to scale up our management of EBS to support the surge. Our block storage backup in 2020 grew by 50% as compared to 2019. Druva helps customers use many EBS capabilities, such as snapshots and cross-Region/cross-account encrypted snapshots, to improve backup, restore, and DR. We love the choice of EBS volumes that helps our customers meet their price-performance requirements and support their increasing demand for cloud-based solutions.”

 

BandLab, a collaborative social music service, decreased costs by switching to Amazon S3, while simultaneously improving the durability and accessibility of more than 500 million audio files.

During the COVID-19 pandemic, BandLab’s storage solution on Amazon S3 rose to the challenge when the service saw a sharp increase in users and content creation. BandLab had 12 million users in December 2019; that number grew by 1 million per month from January to May 2020, with content creation nearing 10 million tracks a month, and by November 2020 it had reached 25 million users. BandLab was able to seamlessly scale to meet demand without sacrificing user experience.

At AWS, we were ready to help customers scale their storage to meet overnight demand that no one could have predicted, and we did not have to restrict access or limit services for any customer around the world.

Optimizing storage costs to accelerate innovation

Ontario Telemedicine Network (OTN) is a virtual care provider of products and services to patients and physicians in the province of Ontario, and it provides support for several other provinces. OTN realized its on-premises storage infrastructure would not scale, experienced pain points from overnight deployments, unexpected outages, and downtime, and wanted to focus on creating value, so it moved to Amazon S3 and Amazon EBS. OTN responded to COVID-19 by quickly scaling its virtual healthcare solutions on AWS to handle an exponential increase in demand for services, going from daily averages of 3,800 events and 8,000 participants to 12,000 events and 30,000 participants almost overnight.

“Amazon EBS and Amazon ElastiCache allowed OTN to leverage quick and ever-expanding storage for OTN’s training materials and Learning Management System. During the pandemic, the amount of training material and training increased as new patients and providers sought to use the various telehealth and telemedicine programs. Amazon S3 allows OTN to build a modern real-time analytics platform to allow for the improvement of the service based on usage patterns of the provincial video network, and storage classes like S3 Standard-Infrequent Access, and S3 Glacier provide an infinite and low-cost solution to back up OTN’s on-premises data. This supports a robust business continuity and disaster recovery process. OTN is continuing our full transition to AWS by building a modern real-time analytics platform on Amazon S3 and AWS ElasticSearch to provide real-time analytics of our applications as well as a future data lake to bring a modern approach to business intelligence.” – Alex Reidiboim, Lead Solution Architect, OTN Division of Ontario Health.

 

BallerTV is on a mission to connect families and communities everywhere through the unifying power of live sports. By live streaming youth sporting events across the country, they have brought families closer together and given youth athletes a platform to showcase their skills and help with college recruitment. The magic of BallerTV also means they save every single game ever broadcast on Amazon S3: whether you’re an athlete who wants to review game film or a weekend warrior who wants to relive the glory days, BallerTV’s footage will be there. When COVID-19 hit, youth sports were put on hold, and thousands of hard-working members of the community and millions of youth athletes across the country stayed home to help flatten the curve of the virus. To make sure BallerTV stayed on its feet, the company optimized storage costs using the S3 Intelligent-Tiering and S3 Glacier storage classes, which reduced monthly storage costs by 57%. Even if athletes weren’t allowed to play games, their game footage wasn’t going anywhere. It’s still on BallerTV, thanks in large part to the reliability and proactive cost-saving measures the company was able to implement using Amazon S3.

 

G-Research is Europe’s leading quantitative finance research firm. G-Research is a big data company that uses algorithms, machine learning, artificial intelligence, and some of the most advanced technology in the world to predict movements in financial markets and discover inefficiencies.

“Building machine learning models often requires vast amounts of data. We leveraged AWS Transfer for SFTP to build a solution that makes it simple to receive large amounts of data from a broad array of feeds and sources. The ability to receive data easily brings new data opportunities. G-Research has many datasets that are terabytes in size and are processed and manipulated after they land in Amazon S3. In order to cost-effectively manage these datasets, we monitor data on S3 that is not frequently accessed and use an S3 Lifecycle policy to transition data among S3 storage classes. Data that is accessed only once every month or even once every quarter is transitioned to S3 Glacier, and data that is accessed once or twice per year is transitioned to S3 Glacier Deep Archive to further optimize costs. We then use S3 Inventory and AWS Glue to quickly and easily identify files for retrieval. This allows the business to focus more of our finances and effort on producing efficient machine learning models without additional storage cost overhead.” – Adnan Rashid, Cloud Engineer – G-Research
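The two-stage tiering G-Research describes can be expressed as a single S3 Lifecycle rule with two transitions: one to Glacier for data touched roughly monthly or quarterly, and a later one to Glacier Deep Archive for data touched once or twice a year. The prefix and day thresholds below are illustrative assumptions, not G-Research's actual values.

```python
def tiered_archive_rule(prefix, glacier_days=90, deep_archive_days=365):
    """Build an S3 Lifecycle rule with two successive transitions:
    first to Glacier, then to Glacier Deep Archive."""
    return {
        "ID": "tiered-archive",
        "Filter": {"Prefix": prefix},
        "Status": "Enabled",
        "Transitions": [
            {"Days": glacier_days, "StorageClass": "GLACIER"},
            {"Days": deep_archive_days, "StorageClass": "DEEP_ARCHIVE"},
        ],
    }


# Hypothetical example: datasets cool to Glacier after 90 days and to
# Deep Archive after a year. The rule would be applied with boto3's
# put_bucket_lifecycle_configuration (requires AWS credentials).
rule = tiered_archive_rule("datasets/")
```

Each transition step cuts the per-GB storage price further, at the cost of longer retrieval times, which is why inventory tooling (such as S3 Inventory) matters for finding archived files later.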

 

Direct Supply is an employee-owned company that specializes in providing equipment, eCommerce, and service solutions to the senior living industry.

“Leveraging AWS Storage Gateway’s File Gateway we are able to store and access objects in Amazon S3 from our NFS and SMB file-based applications with local caching, which helped Direct Supply reduce our costs to store large datasets. The consumption-based pricing model for Amazon S3 means we’re not paying for storage that we’re not using. Asset tagging has streamlined our ability to introduce a showback/chargeback model to our business leaders. Direct Supply also has a substantial presence in Amazon WorkSpaces, which allowed us to move more than half of our employees to working from home in a matter of hours without having to invest in any additional hardware or bandwidth. The agility to keep our staff safe and productive during the current health crisis allowed us to stay focused on delivering outrageous support to our customers.”

– Dave Stauffacher, Chief Platform Engineer – Direct Supply, Inc

Take the next step

These are just a few examples of how customers have used AWS Storage during the COVID-19 pandemic to reduce cost and increase agility. We hope hearing from customers similar to you, as we all face this pandemic, will help you do the same. By taking action now, you can stay agile, gain immediate efficiencies by optimizing costs, and build resiliency and scale for the long term. At AWS, we appreciate all of our customers, and the AWS Storage team is always here to help. To help you move your storage to AWS faster, learn about the services, best practices, and tools in the AWS Migration Acceleration Program (MAP) for Storage. If you have any comments or questions, please don’t hesitate to leave them in the comments section.

[1] IDC Market Spotlight, Sponsored by AWS, Cloud Storage Adoption: From Cost Optimization to Agility and Innovation, Doc. #US46772420 September 2020.

Sean White


Sean White is the Amazon S3 product marketing manager at AWS. He enjoys writing about the innovative ways customers use Amazon S3, launching new features and innovations that delight customers, and simplifying the complex. He is based in Boston, loves spending time with his wife and two kids, watching sports, and exploring craft breweries.