AWS Developer Blog

Amazon DynamoDB Local Integration with AWS Toolkit for Visual Studio

by Norm Johanson | in .NET

Recently, the Amazon DynamoDB team released DynamoDB Local, a great tool for local testing and for working disconnected from the Internet. Version 1.6.3 of the AWS Toolkit for Visual Studio integrates with DynamoDB Local, making it easy to manage a locally running instance of DynamoDB.

In order to run DynamoDB Local, you need at least a JavaSE-1.6-compatible JRE installed, but we recommend 1.7.

Getting Started

To get started with DynamoDB Local:

  1. In AWS Explorer, select Local (localhost).

  2. Now right-click on the DynamoDB node and select Connect to DynamoDB Local.

    • If you already have DynamoDB Local running, you can clear the Start new DynamoDB Local process check box. In this case, the toolkit attempts to connect to a currently running DynamoDB Local at the configured port.
    • If you haven’t installed DynamoDB Local yet, you can do that here by selecting the version you want (most likely the latest) and clicking Install. This downloads DynamoDB Local to the folder "dynamodb-local" under your home directory.
  3. Ensure that you have a proper path to Java set for the Java Executable Path and click OK to start a new instance of DynamoDB Local. AWS Explorer refreshes and shows any tables that you might have set up previously.

 

Connecting to DynamoDB Local

To connect to DynamoDB Local using the AWS SDK for .NET, you need to set the ServiceURL property on the AmazonDynamoDBConfig object for the client. Here is an example of setting up the DynamoDB client, assuming DynamoDB Local is running on port 8000.

var config = new AmazonDynamoDBConfig
{
   ServiceURL = "http://localhost:8000/"
};

// Access key and secret key are not required
// when connecting to DynamoDB Local and
// are left empty in this sample.
var client = new AmazonDynamoDBClient("", "", config);

 

DynamoDB Local Test Tool Integration for Eclipse

We’re excited to announce that the AWS Toolkit for Eclipse now includes integration with the Amazon DynamoDB Local Test Tool. The DynamoDB Local Test Tool allows you to develop and test your application against a DynamoDB-compatible database running locally — no Internet connectivity or credit card required. When your application is ready for prime time, all you need to do is update the endpoint given to your AmazonDynamoDBClient. Neato!

With the DynamoDB Local Test Tool integrated into the AWS Toolkit for Eclipse, using it is easier than ever. Make sure you have a recent version of the Amazon DynamoDB Management plugin (v201311261154 or later) installed and follow along below!

Installing DynamoDB Local

First, head to the Eclipse preferences and make sure you have a JavaSE-1.7 compatible JRE installed. If not, you’ll need to install one and configure Eclipse to know where it is.

Eclipse Execution Environments Preference Page

Then, head to the new DynamoDB Local Test Tool preference page, where you can specify a directory to install the DynamoDB Local Test Tool and a default TCP port for it to bind to.

DynamoDB Local Test Tool Preference Page

The page also lists versions of the DynamoDB Local Test Tool available for installation. There are currently two: the original version (2013-09-12) and a newer version (2013-12-12) which includes support for Global Secondary Indexes. When the DynamoDB team releases future versions of the test tool, they will also show up in this list. Select the latest version and hit the Install button that’s above the list of versions; the DynamoDB Local Test Tool will be downloaded and installed in the directory you specified.

Starting DynamoDB Local

Once the test tool is installed, pop open the AWS Explorer view and switch it to the Local (localhost) region. This pseudo-region represents test tool services running locally on your machine.

Selecting the Local (localhost) Region

For now, you’ll see a single Amazon DynamoDB node representing the DynamoDB Local Test Tool. Right-click this node and select Start DynamoDB Local.

Starting DynamoDB Local

This will bring up a wizard allowing you to pick which version of the DynamoDB Local Test Tool to launch and the port to which it should bind. Pick the version you just installed, give it a port (if you didn’t specify a default earlier), and hit Finish. A console window will open and print a few lines similar to the following when the DynamoDB Local Test Tool finishes initializing:

DynamoDB Local Console Output

Using DynamoDB Local

You can now use the DynamoDB Management features of the toolkit to create tables in your local DynamoDB instance, load some data into them, and perform test queries against your tables.

The DynamoDB Table Editor

To write code against DynamoDB Local, simply set your client’s endpoint and region appropriately:

// The secret key doesn't need to be valid; DynamoDB Local doesn't care.
AWSCredentials credentials = new BasicAWSCredentials(yourAccessKeyId, "bogus");
AmazonDynamoDBClient client = new AmazonDynamoDBClient(credentials);

// Make sure you use the same port as you configured DynamoDB Local to bind to.
client.setEndpoint("http://localhost:8000");

// Sign requests for the "local" region to read data written by the toolkit.
client.setSignerRegionOverride("local");

And away you go! As mentioned above, DynamoDB Local doesn’t care if your credentials are valid, but it DOES create separate local databases for each unique access key ID sent to it, and for each region you say you’re authenticating to. If you have configured the Toolkit with a real set of AWS credentials, you’ll want to use the same access key ID when programmatically interacting with DynamoDB Local so you read from and write to the same local database. Since the Toolkit uses the "local" region to authenticate to DynamoDB Local, you’ll also want to override your client to authenticate to the "local" region as well.

Conclusion

DynamoDB Local is a great way to play around with the DynamoDB API locally while you’re first learning how to use it, and it’s also a great way to integration-test your code even if you’re working without a reliable Internet connection. Now that the AWS Toolkit for Eclipse makes it easy to install and use, you should definitely check it out!

Already using DynamoDB Local or the new Eclipse integration? Let us know how you like it and how we can make it even better in the comments!

Release: AWS SDK for PHP 2.5.1

by Michael Dowling | in PHP

We would like to announce the release of version 2.5.1 of the AWS SDK for PHP. This release updates the Amazon EC2 and Auto Scaling clients and addresses several issues. Please see the CHANGELOG for a full list of changes.

Install the SDK

AWS SDK ZF2 Module 1.2.1

by Jeremy Lindblom | in PHP

We would like to announce the availability of version 1.2.1 of the AWS SDK ZF2 Module. This is a patch release that fixes an issue in the S3RenameUpload filter. Object keys based on the uploaded filename are now handled better on Windows.

Ruby SDK Version 2 and Memoization

by Trevor Rowe | in Ruby

Version 1 of the AWS SDK for Ruby (aws-sdk gem) provides a higher-level abstraction for working with AWS resources. You can use these resources to get information about AWS objects and to operate on them. For example:

AWS.ec2.instances.each do |i|
  puts i.instance_id + ' => ' + i.status.to_s
end

The problem with the example above is that if you do not enable memoization, it will make n + 1 requests to Amazon EC2. The first request retrieves the list of instances, and then it makes one more request per instance to fetch the status. An experienced AWS developer will know that all of this information can be retrieved with a single DescribeInstances call to Amazon EC2.
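To make the shape of that single call concrete, here is a minimal sketch of pulling id => status pairs out of one DescribeInstances-style response. The response hash below is hand-built sample data, and extract_statuses is our own helper, not part of the SDK:

```ruby
# A DescribeInstances response groups instances into reservations, and
# each instance entry already carries its state, so no per-instance
# follow-up calls are needed. This helper flattens that structure into
# a simple { instance_id => status } hash.
def extract_statuses(response)
  response[:reservation_set].
    flat_map { |reservation| reservation[:instances_set] }.
    each_with_object({}) do |instance, statuses|
      statuses[instance[:instance_id]] = instance[:instance_state][:name]
    end
end

# Hand-built sample in the same shape (not a live API call).
sample = {
  reservation_set: [
    { instances_set: [
      { instance_id: 'i-12345678', instance_state: { name: 'running' } },
      { instance_id: 'i-87654321', instance_state: { name: 'stopped' } }
    ] }
  ]
}

extract_statuses(sample)
# => { 'i-12345678' => 'running', 'i-87654321' => 'stopped' }
```

Everything the loop in the first example fetched one request at a time is already present in this single response.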

To address this issue, the v1 Ruby SDK introduced a feature called memoization, which allows the SDK to use cached values inside a block.

# now only one request is made
AWS.memoize do
  AWS.ec2.instances.each do |i|
    puts i.instance_id + ' => ' + i.status.to_s
  end
end 

For more background information you can read this blog post.

Memoization is Not Obvious

If you are unaware of this feature, your code will still execute and produce correct responses. Unfortunately, it is also likely that a large number of unnecessary requests will be made. These requests can significantly increase execution time and can result in your account getting throttled. The result is usually a frustrating search for why extra requests are being made, which finally leads to #memoize.

Memoization Removed in V2

In version 2 of the Ruby SDK (aws-sdk-core gem), there is no Aws.memoize method. Instead, we chose a design that removes the need for this hook.

  • Response objects have a #data attribute. Accessing response data will never trigger another request.
  • The upcoming higher-level resource abstractions are also going to provide access to the data for a resource and an explicit #refresh! method that will refresh the data on demand.

If you haven’t had a chance to take the version 2 Ruby SDK for a spin, I encourage you to try it out. It installs via a separate gem (aws-sdk-core) and uses a separate namespace (Aws vs. AWS), so you can use both versions in the same process.

Also, please share any feedback or ideas!

Parameter Validation

by Trevor Rowe | in Ruby

One of my favorite features of version 2 of the AWS SDK for Ruby (aws-sdk-core gem) is the new parameter validation system. One of the challenges in using an API library is understanding how to specify request parameters, and during development it is common to make mistakes and generate errors.

Using the version 1 Ruby SDK (aws-sdk gem), I might want to use the ‘2012-08-10’ Amazon DynamoDB API version to create a table. Without knowing all of the required parameters, I might try something like this:

ddb = AWS::DynamoDB::Client.new(api_version: '2012-08-10')
ddb.create_table(
  table: 'name',
  provisioned_throughput: { read_capacity_units: 'Over 9000!' }
)
#=> raises AWS::Core::OptionGrammar::FormatError: expected integer value for key read_capacity_units

Oops! It’s easy enough to correct that error, but there are more validation errors waiting to raise.

Version 2 Validation

Version 2 of the Ruby SDK aggregates all validation errors and raises them in a single pass.

ddb = Aws::DynamoDB.new(api_version: '2012-08-10')
ddb.create_table(
  table: 'name',
  provisioned_throughput: { read_capacity_units: 'Over 9000!' }
)

# raises the following:

ArgumentError: parameter validator found 6 errors:
  - missing required parameter params[:attribute_definitions]
  - missing required parameter params[:table_name]
  - missing required parameter params[:key_schema]
  - unexpected value at params[:table]
  - missing required parameter params[:provisioned_throughput][:write_capacity_units]
  - expected params[:provisioned_throughput][:read_capacity_units] to be an integer

Each error gives the context as well as information on what needs to be corrected before the request can be sent. I can now correct all of the validation errors in a single pass.
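For illustration, here is one set of parameters that clears all six errors at once. The attribute name 'id' and the capacity values are our own choices; the hash shape follows the ‘2012-08-10’ API:

```ruby
# Parameters addressing each validation error above; you would pass
# this hash to ddb.create_table. Names and values are illustrative.
params = {
  table_name: 'name',                              # was params[:table]
  attribute_definitions: [
    { attribute_name: 'id', attribute_type: 'S' }  # was missing
  ],
  key_schema: [
    { attribute_name: 'id', key_type: 'HASH' }     # was missing
  ],
  provisioned_throughput: {
    read_capacity_units: 9001,                     # an integer, not a string
    write_capacity_units: 5                        # was missing
  }
}
```

With these corrections in place, the request makes it past the validator and on to the service.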

Formatting Examples

Even better, the V2 API documentation gives formatting examples of how to call each operation. Here is a link to the create table method docs with an example.

We’ve been getting a lot of great customer feedback on our GitHub project for the version 2 Ruby SDK. Keep it coming! We would love to hear what you think is missing.

Flexible Gem Dependencies

by Trevor Rowe | in Ruby

Version 1 of the AWS SDK for Ruby (aws-sdk gem) depends on Nokogiri and JSON. While these are robust and performant libraries, there are multiple reasons why a developer may not want to be locked into these dependencies:

  • I might want to use pure Ruby solutions that do not require native extensions, for maximum portability.
  • I might want to use Oj/Ox for performance reasons.
  • I might require more flexibility in choosing the versions of dependent gems.

MultiJSON and MultiXML

In version 2 of the Ruby SDK (aws-sdk-core gem), we depend on the multi_json and multi_xml gems. By default, these gems use the fastest JSON and XML libraries available. If you have not explicitly installed any, then those in the Ruby standard library, JSON and REXML, will be used.

I very much enjoy Oj and Ox. They are fast JSON and XML parsers/serializers, written in C but with no external dependencies. You can enable these libraries by adding them to your Gemfile as follows:

gem 'oj'
gem 'ox'

MultiJSON and MultiXML will choose these by default. The same approach works with Nokogiri. Just add gem 'nokogiri' to your Bundler Gemfile and you are set.

We are working hard to ensure the SDK depends on as few gems as possible and to ensure it works on as many platforms as possible. If you have any feedback or suggestions on the version 2 Ruby SDK, please let us know!

A Great 2013 for the AWS SDK for PHP

by Jeremy Lindblom | in PHP

2013 was a fantastic year for the AWS SDK for PHP! We have had a lot of fun working on the SDK and connecting with our users and the PHP community through our blog, our Twitter account, and by attending various conferences. Thanks for your continual support and feedback. We want to take a few moments to reflect on the things we’ve accomplished this year.

The SDK

This year, we brought Version 2 of the AWS SDK for PHP into the spotlight. In mid-March, we achieved full service coverage support for Version 2. Since then, we’ve added several features to the SDK and have also added support for even more services and regions.

Some of the new features added in the SDK this year include:

We now have over 275,000 downloads of the SDK on Composer/Packagist and almost 850 stars on GitHub. We also have a lot of great content in our user guide and API documentation.

Our blog

Since we first announced our blog at the end of May, we’ve been steadily posting a mixture of technical content and announcements. We hope that you are subscribed to the blog and that you find our posts helpful. Here are the most popular posts from 2013:

If you have any ideas for topics for 2014, please let us know in the comments.

Presentations and conferences

It was our pleasure to give presentations about AWS and the SDK in various places around the United States during 2013. Here are some of the places we presented at:

Here are links to the slides used in our presentations, in case you missed them in our other blog posts:

If you are interested in having us present to your conference during 2014, please contact us.

See you next year

Thank you to all of our users and contributors! We plan to bring you more features, improvements, documentation, and tips for the AWS SDK for PHP throughout 2014. Make sure to follow us on Twitter at @awsforphp to stay up-to-date. Have a great new year!

IAM Credential Rotation (Access Key Management for .NET Applications – Part 3)

by Milind Gokarn | in .NET

In the previous post in this series, we talked about using IAM users instead of using the root access keys of your AWS account. In this post, we’ll talk about another security best practice, regularly rotating your credentials.

Instead of rotating credentials only when keys are compromised, you should regularly rotate your credentials. If you follow this approach, you’ll have a process in place that takes care of rotating keys if they are compromised, instead of figuring it out when the event takes place. You’ll also have some degree of protection against keys that are compromised without your knowledge, as those keys will only be valid for a certain period, before they are rotated.

We use the following steps for access key rotation to minimize any disruption to running applications:

  • Generate new access key
  • Securely distribute the access key to your applications
  • Disable the old access key
  • Make sure that your applications work with the new key
  • Delete the old access key

Here is the code that performs some of these steps. How you implement distributing the key to your applications and testing the applications is specific to your solution.

var iamClient = new AmazonIdentityManagementServiceClient(ACCESS_KEY, SECRET_KEY, RegionEndpoint.USWest2);

// Generate new access key for the current account
var accessKey = iamClient.CreateAccessKey().AccessKey;

//
// Store the access key ID (accessKey.AccessKeyId) and 
// secret access key (accessKey.SecretAccessKey)
// securely and distribute it to your applications.
//

// Disable the old access key
iamClient.UpdateAccessKey(new UpdateAccessKeyRequest
{
  AccessKeyId = OLD_ACCESS_KEY_ID,
  Status = StatusType.Inactive
});

// 
// Confirm that your applications pick the new access key
// and work properly using the new key.
//

// Delete the old access key.
iamClient.DeleteAccessKey(new DeleteAccessKeyRequest
{
  AccessKeyId = OLD_ACCESS_KEY_ID
});

If your applications don’t work properly after switching to the new access key, you can always reactivate the old access key (from its inactive state) and switch back to it. Only delete the old access key after testing your applications, as deleted keys cannot be restored.

 

Using RSpec 3

by Trevor Rowe | in Ruby

I have been a long-time user of RSpec, and many of the Ruby projects I work with use it as the primary testing framework. It provides an expressive specification DSL. As you may know, RSpec 3 is currently in the works. I have blogged a few times recently about using MiniTest, so I decided I should also give RSpec some love.

RSpec 3 Changes

You can find an excellent summary of the changes and planned changes for RSpec 3 over here. Some of the primary changes:

  • No support for Ruby 1.8.6. You should strongly consider upgrading any projects using Ruby 1.8 to 2.0+.
  • Less magic by default, and RSpec 3 now provides a zero monkey patching mode.
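As a sketch of what opting into the zero monkey patching mode looks like (disable_monkey_patching! is the configuration option RSpec 3 provides for this):

```ruby
# spec_helper.rb
RSpec.configure do |config|
  # Strips the should/stub monkey patches from every object, leaving
  # only the explicitly namespaced DSL: RSpec.describe, expect, etc.
  config.disable_monkey_patching!
end
```

With this enabled, the should-style assertions in the first example below no longer work, which is one more reason to switch to the expect helpers.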

Upgrading to RSpec 3 required very few changes for me. The primary difference is how I assert expectations. I now use the new expect helpers:

# using should helpers
obj_under_test.attribute.should eq('some-value')
lambda { obj_under_test.something_invalid }.should raise_error(...)

# using expect
expect(obj_under_test.attribute).to eq('some-value')
expect { obj_under_test.something_invalid }.to raise_error(...)

Using RSpec 3 Now

To try out the pre-release version of RSpec 3, get started by adding the following to your project’s Gemfile:

gem 'rspec', github: 'rspec/rspec'
gem 'rspec-core', github: 'rspec/rspec-core'
gem 'rspec-mocks', github: 'rspec/rspec-mocks'
gem 'rspec-expectations', github: 'rspec/rspec-expectations'
gem 'rspec-support', github: 'rspec/rspec-support'

I have been using RSpec 3 for a few months now and have found it to be very stable and enjoyable to work with. Happy Testing!