AWS Database Blog

Optimize the storage costs of your workloads with Amazon DynamoDB Standard-IA table class

You can use Amazon DynamoDB to build internet-scale applications, such as user-content metadata stores and caches, that require high concurrency and serve millions of users and millions of requests per second using the default DynamoDB Standard table class. For example, Amazon.com uses DynamoDB to deliver consistently low latency for mission-critical, extreme-scale events such as Amazon Prime Day.

As data ages, it might be accessed less often but still need to remain immediately accessible. For example, in use cases such as application logs, old social media posts, e-commerce order history, and past gaming achievements, older data might not be frequently accessed. For workloads that contain infrequently accessed data, you can reduce storage costs by up to 60 percent by using the DynamoDB Standard Infrequent Access (DynamoDB Standard-IA) table class.

In this post, you learn how to analyze the costs of a DynamoDB Standard table to determine whether it would be more cost-effective to use a Standard-IA table. You then review a use case that reduces costs by storing the frequently accessed current month's data in a Standard table and the infrequently accessed data of previous months in individual Standard-IA tables.

Analyze the cost of your table using AWS Cost Explorer

The general guideline is to switch from the Standard to the Standard-IA table class when the storage cost of a Standard table is approximately 50 percent or more of the table's overall monthly cost. To determine whether Standard-IA is the right fit for your workload, you can use AWS Cost Explorer to analyze the storage and throughput costs.

With Cost Explorer, you can visualize, understand, and manage your AWS costs and usage. It helps you estimate your future costs and usage, and you can view your data at a monthly or daily level of granularity. Tagging AWS resources allows you to add metadata, which enables you to manage, identify, organize, search, and filter resources.

Create a DynamoDB table and add a tag

To determine the costs for a DynamoDB table, you need to add a tag to it, which you can then use in Cost Explorer to identify and filter the table. You can add a tag to your DynamoDB table at any time, in the console or programmatically (a scripted sketch follows the steps below).

To create a DynamoDB table and add a tag

  1. On the DynamoDB console, choose Tables in the navigation pane.
  2. Choose Create table.
  3. In the Tags section, choose Add new tag.
    Note: A tag has two parts: a tag key and a tag value.
  4. Enter a Key and Value to create a new tag for the DynamoDB table.
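
If you prefer to create and tag the table programmatically, here's a minimal sketch using the AWS SDK for Python (Boto3). The table name, key schema, capacity values, and tag key are placeholders for illustration; the tag value matches the dev-dynamodb example used later in this post.

```python
import boto3

dynamodb = boto3.client("dynamodb", region_name="us-east-1")

# Create a table with a cost allocation tag attached at creation time.
# The table name, key schema, capacity, and tag key are placeholders.
dynamodb.create_table(
    TableName="orders-2023-01",
    AttributeDefinitions=[{"AttributeName": "pk", "AttributeType": "S"}],
    KeySchema=[{"AttributeName": "pk", "KeyType": "HASH"}],
    BillingMode="PROVISIONED",
    ProvisionedThroughput={"ReadCapacityUnits": 10, "WriteCapacityUnits": 10},
    TableClass="STANDARD",
    Tags=[{"Key": "environment", "Value": "dev-dynamodb"}],
)

# Wait until the table is ACTIVE before using it.
dynamodb.get_waiter("table_exists").wait(TableName="orders-2023-01")
```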

Add a tag to an existing DynamoDB table

You can create and add tags directly to an existing table at any time, in the console or programmatically (a scripted sketch follows the steps below).

To add a tag to an existing DynamoDB table

  1. Go to the DynamoDB console and choose Tables in the navigation pane.
  2. Select the table you want to add a tag to, and then choose the Actions menu and select Add tag to selection.
  3. Select Add a new tag and enter your tag key and value, and then choose Add tag.
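
Here's a minimal Boto3 sketch of the same operation using the TagResource API; the table name and tag key are placeholders.

```python
import boto3

dynamodb = boto3.client("dynamodb", region_name="us-east-1")

# Tags are applied to the table's ARN, not its name.
table_arn = dynamodb.describe_table(TableName="orders-2023-01")["Table"]["TableArn"]

dynamodb.tag_resource(
    ResourceArn=table_arn,
    Tags=[{"Key": "environment", "Value": "dev-dynamodb"}],
)

# Confirm the tag was applied.
print(dynamodb.list_tags_of_resource(ResourceArn=table_arn)["Tags"])
```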

Note: Alternatively, you can use the Eponymous Table Tagger Tool developed by AWS to automate the creation of tags for existing DynamoDB tables. The Eponymous Table Tagger Tool automatically tags each table using the table’s name if it’s not already tagged. This tool is built using Python 3 and the AWS SDK for Python (Boto3).

Activate the tags

To track the costs in Cost Explorer, you need to activate the tags on the AWS Billing console. After you activate them, you can use them in the monthly cost allocation report to track AWS costs.

To activate the tags

  1. On the AWS Billing dashboard, choose Cost allocation tags in the navigation pane.
  2. Select the tag you want to activate and choose Activate.

After you create and apply your tags to the resources, it can take up to 24 hours for the tags to activate and appear on the Cost allocation tags page.
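
If you're automating this step, recent SDK versions also expose the cost allocation tag APIs through Cost Explorer. The following Boto3 sketch activates a tag key and lists your user-defined cost allocation tags; the tag key is a placeholder, and in an organization the call must be made from the management (payer) account.

```python
import boto3

# The Cost Explorer API endpoint is in us-east-1. In an organization,
# cost allocation tags are managed from the management (payer) account.
ce = boto3.client("ce", region_name="us-east-1")

# Activate a user-defined cost allocation tag key (placeholder key).
ce.update_cost_allocation_tags_status(
    CostAllocationTagsStatus=[{"TagKey": "environment", "Status": "Active"}]
)

# List user-defined cost allocation tags and their activation status.
response = ce.list_cost_allocation_tags(Type="UserDefined")
for tag in response["CostAllocationTags"]:
    print(tag["TagKey"], tag["Status"])
```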

Retrieve the storage and throughput costs for the table

With the tags in place and activated, you can retrieve the storage costs for the table.

To retrieve the storage costs for the table

  1. On the AWS Cost Management console, choose Cost Explorer in the navigation pane.
  2. Set up the filters for Service, Region, Usage Type, and Tag. For this example, we selected DynamoDB as the service, US East (N. Virginia) as the Region, DDB: Indexed Data Storage as the usage type, and dev-dynamodb as the tag value.

To retrieve the throughput costs for the table

Let's review the filter criteria to retrieve the throughput costs. The filter criteria are the same as for storage costs, except for the Usage Type Group.

  1. Set the filters for Service, Region, and Tag the same as you did to calculate storage costs in the previous step.
  2. For Usage Type Group, select DDB: Provisioned Throughput Capacity – Read and DDB: Provisioned Throughput Capacity – Write.

Now you can calculate the storage and throughput costs for the DynamoDB table and check whether the storage cost is approximately 50 percent or more of the overall monthly cost. If it is, your Standard table is a good candidate for conversion to the Standard-IA table class.
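
If you'd rather pull these numbers programmatically, the following Boto3 sketch queries the Cost Explorer API with filters equivalent to the console example above. The tag key, tag value, and time period are placeholders; swap the usage type group values for the read and write throughput groups to retrieve throughput costs.

```python
import boto3

ce = boto3.client("ce", region_name="us-east-1")

# Monthly DynamoDB storage cost for resources carrying the cost allocation
# tag used earlier. Tag key/value and the time period are placeholders.
response = ce.get_cost_and_usage(
    TimePeriod={"Start": "2023-01-01", "End": "2023-02-01"},
    Granularity="MONTHLY",
    Metrics=["UnblendedCost"],
    Filter={
        "And": [
            {"Dimensions": {"Key": "SERVICE", "Values": ["Amazon DynamoDB"]}},
            {"Dimensions": {"Key": "REGION", "Values": ["us-east-1"]}},
            {
                "Dimensions": {
                    "Key": "USAGE_TYPE_GROUP",
                    "Values": ["DDB: Indexed Data Storage"],
                }
            },
            {"Tags": {"Key": "environment", "Values": ["dev-dynamodb"]}},
        ]
    },
)

for result in response["ResultsByTime"]:
    amount = float(result["Total"]["UnblendedCost"]["Amount"])
    print(result["TimePeriod"]["Start"], f"${amount:.2f}")
```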

Keep in mind that the throughput cost of the Standard-IA table class is 25 percent higher than that of the Standard table class. Therefore, the Standard-IA table class is a good fit for use cases where storage is the predominant cost and data is accessed less often. At the time of writing, for a table in the us-east-1 Region with 100 GB of data, the storage cost is $25 per month for a Standard table compared to $10 per month for a Standard-IA table, a 60 percent savings on storage.
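
To apply the 50 percent guideline to your own numbers, here's a rough sketch using the us-east-1 prices quoted above at the time of writing (Standard storage at $0.25 per GB-month, Standard-IA at $0.10 per GB-month, and Standard-IA throughput about 25 percent more expensive); verify current pricing before relying on it.

```python
# Rough monthly cost comparison of the two table classes, using the
# us-east-1 prices quoted in this post at the time of writing:
#   Standard storage:    $0.25 per GB-month
#   Standard-IA storage: $0.10 per GB-month
#   Standard-IA throughput costs ~25% more than Standard.
def compare_table_classes(storage_gb: float, standard_throughput_cost: float) -> None:
    standard_total = storage_gb * 0.25 + standard_throughput_cost
    standard_ia_total = storage_gb * 0.10 + standard_throughput_cost * 1.25

    storage_share = (storage_gb * 0.25) / standard_total
    print(f"Storage share of Standard monthly cost: {storage_share:.0%}")
    print(f"Standard monthly cost:    ${standard_total:.2f}")
    print(f"Standard-IA monthly cost: ${standard_ia_total:.2f}")


# Example: 100 GB of data and $10 per month of provisioned throughput.
compare_table_classes(storage_gb=100, standard_throughput_cost=10)
```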

Time series use case

Now let's assume a use case involving time series data that's frequently accessed during the first month after its creation and less frequently accessed over time. For example, IoT devices send time series data continuously, and the current month's data might be accessed more frequently than prior months' data. For these use cases, the current month's data can be stored in the Standard table class, and the infrequently accessed data of prior months can be stored in the Standard-IA table class. The current month's data is frequently read and written, the previous month's data is infrequently accessed, and data older than that is seldom accessed. To reduce costs, at the end of every month you can create a new Standard table and change the previous month's table class to Standard-IA. In addition to changing the table class, you can lower the provisioned write capacity units (WCUs) and read capacity units (RCUs) to fit your use case. You can use the AWS Command Line Interface (AWS CLI), the AWS SDKs, or the console to make all the changes related to table class, RCUs, and WCUs.
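
For example, the following Boto3 sketch illustrates the monthly rollover: it switches last month's table (a placeholder name) to the Standard-IA class and then lowers its provisioned capacity. The two changes are made as separate UpdateTable calls because the table must return to ACTIVE before the next update; the new month's Standard table can be created the same way as in the earlier create_table example.

```python
import boto3

dynamodb = boto3.client("dynamodb", region_name="us-east-1")

# Placeholder name of last month's table in the time series scheme.
previous_table = "iot-data-2022-12"

# Step 1: switch the table to the Standard-IA table class.
dynamodb.update_table(
    TableName=previous_table,
    TableClass="STANDARD_INFREQUENT_ACCESS",
)
# Wait for the table to return to ACTIVE before the next change.
dynamodb.get_waiter("table_exists").wait(TableName=previous_table)

# Step 2: lower the provisioned capacity to match the reduced traffic.
dynamodb.update_table(
    TableName=previous_table,
    ProvisionedThroughput={"ReadCapacityUnits": 5, "WriteCapacityUnits": 1},
)
```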

Cost calculation

Let's review the cost estimates by table class for storage and read/write operations in the US East (N. Virginia) Region. For the following calculations, we assume 1,000 baseline and 10,000 peak reads/writes for the current month. For the previous month, the read rate is reduced to 100 baseline and 1,000 peak. For the remaining 10 previous months, the read rate is reduced to 1 baseline and 100 peak reads per month. The baseline and peak writes for all the previous tables are 1 and 1, respectively, because those tables receive essentially no writes. The storage is assumed to be 80 GB of data every month, so all the tables have the same amount of storage.

The following cost calculation doesn't account for reserved capacity with a Standard table. The calculation would be different if you use reserved capacity, which isn't covered in this post.

The following tables capture the cost estimates using a combination of the Standard table class and the Standard-IA table class.

Read/write cost estimate

| Table | Baseline read rate | Peak read rate | Baseline write rate | Peak write rate | Monthly read cost | Monthly write cost |
| --- | --- | --- | --- | --- | --- | --- |
| DynamoDB Standard (current month) | 1,000 | 10,000 | 1,000 | 10,000 | $61.49 | $614.90 |
| DynamoDB Standard-IA (previous month) | 100 | 1,000 | 1 | 1 | $7.57 | $0.59 |
| DynamoDB Standard-IA (remaining 10 months) | 1 | 100 | 1 | 1 | $0.30 | $0.59 |

Yearly read cost (all tables): $837.65
Yearly write cost (all tables): $7,417.74

Storage cost estimate

| Table | Storage | Monthly storage cost |
| --- | --- | --- |
| DynamoDB Standard | 80 GB per month | $20 |
| DynamoDB Standard-IA | 80 GB per month | $8 |

Yearly storage cost (all tables): $768
Total yearly cost: $9,023.39

Now let's compare this with the cost of using only DynamoDB Standard table class tables, shown in the following tables. For this specific use case, using the DynamoDB Standard-IA table class saves $759.59 per year. You can download the DynamoDB calculations worksheet to review a detailed analysis of these calculations.

Read/write cost estimate

| Table | Baseline read rate | Peak read rate | Baseline write rate | Peak write rate | Monthly read cost | Monthly write cost |
| --- | --- | --- | --- | --- | --- | --- |
| DynamoDB Standard (current month) | 1,000 | 10,000 | 1,000 | 10,000 | $61.49 | $614.90 |
| DynamoDB Standard (previous month) | 100 | 1,000 | 1 | 1 | $6.15 | $0.47 |
| DynamoDB Standard (remaining 10 months) | 1 | 100 | 1 | 1 | $0.25 | $0.47 |

Yearly read cost (all tables): $819.28
Yearly write cost (all tables): $7,415.70

Storage cost estimate

| Table | Storage | Yearly storage cost |
| --- | --- | --- |
| DynamoDB Standard | 80 GB per month | $1,548 |

Total yearly cost: $9,782.98

Conclusion

In this post, you learned how you can use DynamoDB Standard-IA tables to reduce the cost of storing infrequently accessed data, including a use case involving time series data that demonstrates how the Standard-IA table class can help lower costs. You also learned how to use Cost Explorer to estimate the costs of your DynamoDB workloads and identify which tables would be cost-effective to convert from the Standard table class to the Standard-IA table class.

Our colleagues at AWS have also developed tools such as the Table Class Evaluator Tool to determine the suitability of using Standard-IA class. ddb viz is another tool that provides reports on DynamoDB tables in your account with a special focus on cost optimization. You can use these tools to evaluate the suitability of switching a Standard table to Standard-IA table class.


About the authors

Anamika Kesharwani is a Partner Solutions Architect at Amazon Web Services specializing in databases. She works with ISV partners to provide guidance and help in the database adoption journey, helping them to improve the value of database solutions when using AWS. She is a DynamoDB SME and her focus is NoSQL databases. You’ll find her cooking, doing yoga, or trying out a new activity in her spare time.

Prathap Thoguru is an Enterprise Solutions Architect at Amazon Web Services. He’s an AWS certified professional in nine areas and specializes in data and analytics. He helps customers get started on and migrate their on-premises workloads to the AWS Cloud.

Karthiga Priya Chandran is a Data Architect in AWS Professional Services and helps customers build highly scalable applications. She specializes in NoSQL databases and is an expert in Amazon DynamoDB.