AWS News Blog

DynamoDB – Price Reduction and New Reserved Capacity Model

We’re making Amazon DynamoDB an even better value today, with a price reduction and a new reserved capacity pricing model.

Behind the scenes, we’ve worked hard to make this happen. We have fine-tuned our storage and our processing model, optimized our replication pipeline, and taken advantage of our scale to drive down our hardware costs.

You get all of the benefits of DynamoDB – redundant, SSD-backed storage, high availability, and the ability to quickly and easily create tables that can scale to handle trillions of requests per year. To learn more about how we did this, check out Werner’s post.

DynamoDB Price Reduction
We are reducing the prices for Provisioned Throughput Capacity (reads and writes) by 35% and Indexed Storage by 75% in all AWS Regions. Here’s a handy chart:

Region                          Provisioned Throughput       Indexed Storage
                                (per hour per 10 write       (per GB/month)
                                units or 50 read units)
                                Old Price    New Price       Old Price   New Price
US East (Northern Virginia)     $0.0100      $0.00650        $1.00       $0.250
US West (Northern California)   $0.0112      $0.00725        $1.12       $0.280
US West (Oregon)                $0.0100      $0.00650        $1.00       $0.250
Europe (Ireland)                $0.0113      $0.00735        $1.13       $0.283
Asia Pacific (Singapore)        $0.0114      $0.00740        $1.14       $0.285
Asia Pacific (Tokyo)            $0.0120      $0.00780        $1.20       $0.300
Asia Pacific (Sydney)           $0.0114      $0.00740        $1.14       $0.285
South America (São Paulo)       $0.0150      $0.00975        $1.50       $0.375
AWS GovCloud (US)               $0.0120      $0.00780        $1.20       $0.300

This reduction takes effect on March 1, 2013 and will be applied automatically.
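To see what the reduction means in practice, here is a back-of-the-envelope sketch using the US East prices from the chart above. The 720-hour month and the example table sizes are illustrative assumptions, not figures from the announcement:

```python
HOURS_PER_MONTH = 720  # assumption: ~30-day month

# US East (Northern Virginia) prices from the chart above.
OLD_THROUGHPUT, NEW_THROUGHPUT = 0.0100, 0.00650  # per hour per billing block
OLD_STORAGE, NEW_STORAGE = 1.00, 0.250            # per GB-month

def monthly_cost(write_units, read_units, storage_gb, throughput_price, storage_price):
    # Throughput is billed per 10 write units or per 50 read units, per hour.
    blocks = write_units / 10 + read_units / 50
    return blocks * throughput_price * HOURS_PER_MONTH + storage_gb * storage_price

# Hypothetical table: 1,000 write units, 5,000 read units, 100 GB of indexed storage.
old = monthly_cost(1_000, 5_000, 100, OLD_THROUGHPUT, OLD_STORAGE)
new = monthly_cost(1_000, 5_000, 100, NEW_THROUGHPUT, NEW_STORAGE)
print(f"old: ${old:.2f}/month, new: ${new:.2f}/month")  # old: $1540.00, new: $961.00
```

Storage-heavy workloads see the bigger win, since indexed storage dropped 75% while throughput dropped 35%.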

DynamoDB Reserved Capacity
If you are able to predict your need for DynamoDB read and write throughput within a given AWS Region, you can save even more money with our new Reserved Capacity pricing model.

You can now buy DynamoDB Reserved Capacity. If you can commit to at least 5,000 read or write capacity units over a one- or three-year term, you can enjoy savings that range from 54% to 77% when computed using the newly reduced On-Demand pricing described in the previous section. The net reduction with respect to the original pricing works out to be 85%.
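As a quick check of that arithmetic, the best-case reserved discount compounds with the 35% price cut:

```python
# US East old throughput price, from the chart above.
original = 0.0100
reduced = original * (1 - 0.35)        # newly reduced On-Demand price: 0.0065
best_reserved = reduced * (1 - 0.77)   # best-case (77%) Reserved Capacity discount
net_reduction = 1 - best_reserved / original
print(f"{net_reduction:.0%}")  # 85%
```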

If you purchase DynamoDB Reserved Capacity for a particular AWS Region, it will apply to all of your DynamoDB tables in the Region. For example, if you have a read-heavy application, you might purchase 20,000 read capacity units and 10,000 write capacity units. Once you have done this, you can use that capacity to provision tables as desired, for the duration of the reservation.
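One way to picture the region-wide pooling: the reservation covers provisioned capacity across all of your tables in the Region up to the reserved amount, with anything beyond it billed at the regular rate. A minimal sketch, where the table names and per-table numbers are illustrative assumptions:

```python
def split_by_reservation(provisioned_total, reserved):
    """Return (units covered by the reservation, units billed at the regular rate)."""
    covered = min(provisioned_total, reserved)
    return covered, provisioned_total - covered

# Hypothetical per-table provisioning in one Region: (read units, write units).
tables = {"users": (12_000, 4_000), "events": (9_000, 7_000)}
total_reads = sum(r for r, _ in tables.values())   # 21,000
total_writes = sum(w for _, w in tables.values())  # 11,000

# Reservation of 20,000 read and 10,000 write capacity units, as in the example above.
print(split_by_reservation(total_reads, 20_000))   # (20000, 1000)
print(split_by_reservation(total_writes, 10_000))  # (10000, 1000)
```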

To purchase DynamoDB Reserved Capacity, go to the DynamoDB console, fill out the Reserved Capacity form (click on the button labeled “Purchase Reserved Capacity”), and we’ll take care of the rest. Later this year we’ll simplify the purchasing process, add additional Reserved Capacity options, and give you the ability to make purchases using tools and APIs.

DynamoDB Resources
If you would like to learn more about DynamoDB, you can watch the following sessions from AWS re:Invent:

DAT 101: Understanding AWS Database Options

DAT 102: Introduction to Amazon DynamoDB

DAT 302: Under the Covers of Amazon DynamoDB

MBL 301: Data Persistence to Amazon DynamoDB for Mobile Applications

I have been applying updates to my DynamoDB Libraries, Mappers, and Mock Implementations post as I become aware of them. Send updates to me and I’ll take care of the rest.

DynamoDB and Redshift
Amazon Redshift is a fast, fully managed, petabyte-scale data warehouse service. You might want to start thinking about interesting ways to combine the two services. You could use DynamoDB to store incoming data as it arrives at “wire” speed, do some filtering and processing, and then copy it over to Redshift for warehousing and deep analytics.

You can copy an entire DynamoDB table into Redshift with a single command:

copy favoritemovies FROM 'dynamodb://my-favorite-movies-table'
credentials 'aws_access_key_id=;aws_secret_access_key='
readratio 50;

Read the Redshift article Loading Data from an Amazon DynamoDB Table to learn more.

— Jeff;



Jeff Barr

Jeff Barr is Chief Evangelist for AWS. He started this blog in 2004 and has been writing posts just about non-stop ever since.