AWS Developer Blog

Announcing the Amazon DynamoDB Document Client in the AWS SDK for JavaScript

Version 2.2.0 of the AWS SDK for JavaScript introduces support for the document client abstraction in the AWS.DynamoDB namespace. The document client abstraction makes it easier to read and write data in Amazon DynamoDB with the AWS SDK for JavaScript. Now you can use native JavaScript objects without annotating them as AttributeValue types.

This article describes how to use the document client abstraction to make requests to Amazon DynamoDB.

Making Requests with the Document Client

The following example shows a PutItem request to Amazon DynamoDB with the document client. Note that you can use native JavaScript objects without annotating them as AttributeValue types. The document client annotates the JavaScript object that you provide as input with AttributeValue types before making a request to DynamoDB.

For a list of supported API operations, see the API documentation.

Example

var docClient = new AWS.DynamoDB.DocumentClient({region: 'us-west-2'});

var params = {
    Item: {
        hashkey: 'key',
        boolAttr: true,
        listAttr: [1, 'baz', true],
        mapAttr: {
            foo: 'bar'
        }
    },
    TableName: 'table'
};

docClient.put(params, function(err, data){
    if (err) console.log(err);
    else console.log(data);
});

Support for Sets

AWS.DynamoDB.DocumentClient.createSet() is a convenience method for creating a set. It accepts a JavaScript array and an optional map of options. The type of the set is inferred from the type of the first element in the list. Amazon DynamoDB currently supports three types of sets: string sets, number sets, and binary sets.

Example

var docClient = new AWS.DynamoDB.DocumentClient({region: 'us-west-2'});

var params = {
    Item: {
        hashkey: 'key',
        stringSet: docClient.createSet(['a', 'b']),
        numberSet: docClient.createSet([1, 2]),
        binarySet: docClient.createSet([new Buffer(5), new Uint8Array(5)])
    },
    TableName: 'table'
};

docClient.put(params, function(err, data){
    if (err) console.log(err);
    else console.log(data);
});

You can also validate the uniformity of the supplied list by setting validate: true in the options passed in to the createSet() method.

// This is a valid string set
var validSet = docClient.createSet(['a', 'b'], {validate: true});

// This is an invalid number set because the list contains mixed types
var invalidSet = docClient.createSet([1, 'b'], {validate: true});

Using Response Data from the Document Client

The document client also unmarshals response data from DynamoDB, converting data annotated with AttributeValue types into native JavaScript objects that can easily be used with other JavaScript code.

Example

var docClient = new AWS.DynamoDB.DocumentClient({region: 'us-west-2'});

var params = {
    Key: {
        hashkey: 'key',
    },
    TableName: 'table'
};

docClient.get(params, function(err, data){
    if (err) console.log(err);
    else console.log(data); 
    /**
     *  { 
     *      Item: { 
     *          hashkey: 'key',
     *          boolAttr: true,
     *          listAttr: [1, 'baz', true],
     *          mapAttr: {
     *              foo: 'bar'
     *          }
     *      }
     *  }
     **/
});
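
The document client's other operations take the same native-JavaScript form. As one more example, here is a minimal sketch of a Query request, assuming the same hypothetical table and hashkey attribute used above:

var docClient = new AWS.DynamoDB.DocumentClient({region: 'us-west-2'});

var params = {
    TableName: 'table',
    KeyConditionExpression: 'hashkey = :key',
    ExpressionAttributeValues: {
        ':key': 'key'
    }
};

docClient.query(params, function(err, data) {
    if (err) console.log(err);
    else console.log(data.Items);
});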

For more information about the document client and its supported operations, see the API documentation.

We hope this simplifies the development of applications with the AWS SDK for JavaScript and Amazon DynamoDB. We’d love to hear what you think about the document client abstraction, so leave us a comment here, or on GitHub, or tweet about it @awsforjs.

Xamarin Support Out of Preview

by Pavel Safronov

Last month, with the release of version 3 of the AWS SDK for .NET, Xamarin and Portable Class Library (PCL) support was announced as an in-preview feature. We’ve worked hard to stabilize this feature, and with today’s release we are labeling Xamarin and PCL support production-ready. This applies to Windows Phone and Windows Store support, too. If you’ve been waiting for a production-ready version of the SDK for these platforms, you can now upgrade from version 2 to this release of the SDK.

The immediate impact of this release is that the AWSSDK.CognitoSync, AWSSDK.SyncManager, and AWSSDK.MobileAnalytics NuGet packages are no longer marked as preview. The versions of the other AWS SDK NuGet packages have been incremented.

Happy coding!

S3 Transfer Utility Upgrade

by Tyler Moore

Version 3 of the AWS SDK for .NET includes an update to the S3 transfer utility. Before this update, if an S3 download of a large file failed, the entire download would be retried. The retry logic has now been updated so that a retry attempt resumes from the bytes that have already been written to disk. This means better performance for customers: because a retry no longer requests the entire file, there is less data to stream from S3 when a download is interrupted.

If you are already using the S3 transfer utility, no code changes are required to take advantage of this update. It’s available in the AWSSDK.S3 package in version 3.1.2 and later. For more information about the S3 transfer utility, see Amazon S3 Transfer Utility for Windows Store and Windows Phone.
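
For reference, here is a minimal sketch of a large-file download with the transfer utility; the bucket name, key, and file path below are placeholders:

using Amazon.S3;
using Amazon.S3.Transfer;

var s3Client = new AmazonS3Client();
var transferUtility = new TransferUtility(s3Client);

// If the download is interrupted, a retry resumes from the bytes
// already written to the local file instead of restarting from zero.
transferUtility.Download(new TransferUtilityDownloadRequest
{
    BucketName = "my-bucket",
    Key = "large-file.zip",
    FilePath = @"C:\temp\large-file.zip"
});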

The AWS CLI Topic Guide

by Kyle Knapp

Hi everyone! This blog post is about the AWS CLI Topic Guide, a feature that was added in version 1.7.24 of the CLI. The AWS CLI Topic Guide lets users discover and read about a CLI feature or behavior at a level of detail not found in the Help page of any single command.

Discovering Topics

Run the following command to discover the topics available:

$ aws help topics

A Help page with a list of available topics will be displayed. Here is an example list:

AVAILABLE TOPICS
   General
       o config-vars: Configuration Variables for the AWS CLI

       o return-codes: Describes the various return codes of the AWS CLI

   S3
       o s3-config: Advanced configuration for AWS S3 Commands

In this case, the returned topics (config-vars, return-codes, and s3-config) fall into two categories: General and S3. Each topic belongs to a single category only, so you will never see repeated topics in the list.

Accessing Topics

Run the following command to access a topic’s contents:

$ aws help topicname

where topicname is the name of a topic listed in the output of the aws help topics command. For example, if you wanted to access the return-codes topic to learn more about the various return codes in the CLI, all you would have to type is:

$ aws help return-codes

This will display a Help page that describes the various return codes you might receive when running a CLI command and the scenarios that produce particular status codes.
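
Return codes are especially handy in scripts, where you can inspect the exit status of the most recent CLI command through the shell’s $? variable. A quick sketch (the bucket name is a placeholder):

$ aws s3 ls s3://my-example-bucket
$ echo $?
0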

The AWS CLI Topic Guide is also available online.

Conclusion

The AWS CLI Topic Guide is a great source of information about the CLI. If you have topics you would like us to add, submit a request through our GitHub repository.

Follow us on Twitter @AWSCLI and let us know what you’d like to read about next! Stay tuned for our next post.


Managing Dependencies with AWS SDK for Java – Bill of Materials module (BOM)

by Manikandan Subramanian

Every Maven project specifies its required dependencies in the pom.xml file. The AWS SDK for Java provides a Maven module for every service it supports. To use the Java client for a service, all you need to do is specify the group ID, artifact ID, and version of its Maven module in the dependencies section of pom.xml.

The AWS SDK for Java introduces a new Maven bill of materials (BOM) module, aws-java-sdk-bom, to manage all of your SDK dependencies and to make sure Maven picks compatible versions when you depend on multiple SDK modules. You may wonder why this BOM module is required when the dependencies are already specified in the pom.xml file. Let me take you through an example. Here is the dependencies section from a pom.xml file:

  <dependencies>
    <dependency>
      <groupId>com.amazonaws</groupId>
      <artifactId>aws-java-sdk-ec2</artifactId>
      <version>1.10.2</version>
    </dependency>
    <dependency>
      <groupId>com.amazonaws</groupId>
      <artifactId>aws-java-sdk-s3</artifactId>
      <version>1.10.5</version>
    </dependency>
    <dependency>
      <groupId>com.amazonaws</groupId>
      <artifactId>aws-java-sdk-dynamodb</artifactId>
      <version>1.10.10</version>
    </dependency>
  </dependencies>

Here is the Maven’s dependency resolution for the above pom.xml file:

As you see, the aws-java-sdk-ec2 module is pulling in an older version of aws-java-sdk-core. This intermixing of different versions of SDK modules can create unexpected issues. To ensure that Maven pulls in the correct version of the dependencies, import the aws-java-sdk-bom into your dependency management section and specify your project’s dependencies, as shown below.

  <dependencyManagement>
    <dependencies>
      <dependency>
        <groupId>com.amazonaws</groupId>
        <artifactId>aws-java-sdk-bom</artifactId>
        <version>1.10.10</version>
        <type>pom</type>
        <scope>import</scope>
      </dependency>
    </dependencies>
  </dependencyManagement>
  
  <dependencies>
    <dependency>
      <groupId>com.amazonaws</groupId>
      <artifactId>aws-java-sdk-ec2</artifactId>
    </dependency>
    <dependency>
      <groupId>com.amazonaws</groupId>
      <artifactId>aws-java-sdk-s3</artifactId>
    </dependency>
    <dependency>
      <groupId>com.amazonaws</groupId>
      <artifactId>aws-java-sdk-dynamodb</artifactId>
    </dependency>
  </dependencies>

The Maven version for each dependency will be resolved to the version specified in the BOM. Notice that when you import a BOM, you must set the type to pom and the scope to import.

Here is the Maven’s dependency resolution for the above pom.xml file:

As you can see, all the AWS SDK for Java modules are resolved to a single Maven version. Upgrading to a newer version of the AWS SDK for Java requires you to change only the version of the aws-java-sdk-bom module being imported.

Have you been using the SDK’s modularized Maven artifacts in your project? Please leave your feedback in the comments.

AWS Workshop and Hackathon at PNWPHP

by Jeremy Lindblom

In September, the Pacific Northwest PHP Conference (PNWPHP) is happening in Seattle. It’s just down the street from us, so we decided to partner with them to host an AWS Workshop and Hackathon on September 10th, 2015.

The workshop portion will serve as a kind of AWS boot camp for PHP developers and will include a few presentations about AWS services and architecture, the AWS SDK for PHP, and running PHP applications on AWS. You can see a full list of the presentations and speakers on the PNWPHP website.

The hackathon portion will allow people to team up and create something using AWS services and the SDK. Like most hackathons, there will be food and prizes involved. Hackathon participants will also receive AWS credits through the AWS Activate program to cover the costs of the services they will be using during the hackathon.

Tickets for AWS Workshop and Hackathon are sold separately from the main PNWPHP conference, so whether you end up attending the main conference or not, you still have the opportunity to join us at our workshop/hackathon. In fact, you can use the discount code "AWSHACK" to get your AWS Workshop and Hackathon ticket for a 50% discount. Head to the PNWPHP registration page to get your ticket.

Whether you are a Seattle native, or you are in town for PNWPHP, we hope to see you at our special AWS Workshop and Hackathon.

Using AWS CodeCommit from Eclipse

Earlier this month, we launched AWS CodeCommit — a managed revision control service that hosts Git repositories and works with existing Git-based tools.

If you’re an Eclipse user, it’s easy to use the EGit tools in Eclipse to work with AWS CodeCommit. This post shows how to publish a project to AWS CodeCommit so you can start trying out the new service.

Configure SSH Authentication

To use AWS CodeCommit with Eclipse’s Git tooling, you’ll need to configure SSH credentials for accessing CodeCommit. This is an easy process you’ll only need to do once. The AWS CodeCommit User Guide has a great walkthrough describing the exact steps to create a keypair and register it with AWS. Make sure you take the time to test your SSH credentials and configuration as described in the walkthrough.
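
After you’ve uploaded your public key and noted the SSH key ID that IAM assigns to it, access is typically wired up with an entry like the following in ~/.ssh/config (the key ID and file name below are placeholders):

Host git-codecommit.*.amazonaws.com
  User APKAEIBAERJR2EXAMPLE
  IdentityFile ~/.ssh/codecommit_rsa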

Create a Repository

Next, we’ll create a new Git repository using AWS CodeCommit. The AWS CodeCommit User Guide has instructions for creating repositories through the AWS CLI or the AWS CodeCommit console.

Here’s how I used the AWS CLI:

% aws --region us-east-1 codecommit create-repository \
      --repository-name MyFirstRepo \
      --repository-description "My first CodeCommit repository"
{
  "repositoryMetadata": {
    "creationDate": 1437760512.195,
    "cloneUrlHttp": 
       "https://git-codecommit.us-east-1.amazonaws.com/v1/repos/MyFirstRepo",
    "cloneUrlSsh": 
       "ssh://git-codecommit.us-east-1.amazonaws.com/v1/repos/MyFirstRepo",
    "repositoryName": "MyFirstRepo",
    "Arn": "arn:aws:codecommit:us-east-1:963699449919:MyFirstRepo",
    "repositoryId": "c4ed6846-5000-44ce-a808-b1862766d8bc",
    "repositoryDescription": "My first CodeCommit repository",
    "accountId": "963699449919",
    "lastModifiedDate": 1437760512.195
  }
}

Whether you use the CLI or the console to create your CodeCommit repository, make sure to copy the cloneUrlSsh property that’s returned. We’ll use that in the next step when we clone the CodeCommit repository to our local machine.

Create a Clone

Now we’re ready to use our repository locally and push one of our projects into it. The first thing we need to do is clone our repository so that we have a local version. In Eclipse, open the Git Repositories view (Window -> Show View -> Other…) and select the option to clone a Git repository.

In the first page of the Clone Git Repository wizard, paste the Git SSH URL from your CodeCommit repository into the URI field. Eclipse will parse out the connection protocol, host, and repository path.

Click Next. The CodeCommit repository we created is an empty, or bare, repository, so there aren’t any branches to configure yet.

Click Next. On the final page of the wizard, select where on your local machine you’d like to store the cloned repository.

Push to Your Repository

Now that we’ve got a local clone of our repository, we’re ready to start pushing a project into it. Select a project and use Team -> Share to connect that project with the repository we just cloned. In my example, I simply created a new project.

Next use Team -> Commit… to make the initial check-in to your cloned repo.

Finally, use Team -> Push Branch… to push the master branch in your local repository up to your CodeCommit repository. This will create the master branch on the CodeCommit repository and configure your local repo for upstream pushes and pulls.
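
If you prefer the command line, a roughly equivalent workflow from a terminal, using the clone URL returned earlier, would be:

% git clone ssh://git-codecommit.us-east-1.amazonaws.com/v1/repos/MyFirstRepo
% cd MyFirstRepo
# copy your project files into the working tree, then:
% git add .
% git commit -m "Initial commit"
% git push origin master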

Conclusion

Your project is now configured with the EGit tools in Eclipse and set up to push and pull from a remote AWS CodeCommit repository. You can take advantage of all the EGit tooling in Eclipse to work with your repository and easily push and pull changes from your AWS CodeCommit repository. Have you tried using AWS CodeCommit yet?

DynamoDB Table Cache

by Pavel Safronov

Version 3 of the AWS SDK for .NET includes a new feature, the SDK Cache. This is an in-memory cache used by the SDK to store information like DynamoDB table descriptions. Before version 3, the SDK retrieved table information when you constructed a Table or DynamoDBContext object. For example, the following code creates a table and performs several operations on it. The LoadTable method makes a DescribeTable call to DynamoDB, so this sample will make three service calls: DescribeTable, GetItem, and UpdateItem.

var table = Table.LoadTable(ddbClient, "TestTable");
var item = table.GetItem(42);
item["Updated"] = DateTime.Now;
table.UpdateItem(item);

In most cases, your application will use tables that do not change, so constantly retrieving the same table information is wasteful and unnecessary. In fact, to keep the number of service calls to a minimum, the best option is to create a single copy of the Table or DynamoDBContext object and keep it around for the lifetime of your application. This, of course, requires a change to the way your application uses the AWS SDK for .NET.
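
A minimal sketch of that pattern, reusing the ddbClient and table from the earlier example:

// Construct the Table once; this triggers the single DescribeTable call.
var table = Table.LoadTable(ddbClient, "TestTable");

// Reuse the same Table object; the loop makes only GetItem/UpdateItem calls.
for (int i = 0; i < 10; i++)
{
    var item = table.GetItem(i);
    item["Updated"] = DateTime.Now;
    table.UpdateItem(item);
}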

With version 3, the SDK will now attempt to retrieve table information from the SDK Cache first. Even if your code is constructing a new Table or DynamoDBContext object for each call, the SDK will only make a single DescribeTable call per table, and will keep this data around for the lifetime of the process. So if you ran the preceding code twice, only the first invocation of LoadTable would result in a DescribeTable call.

This change will reduce the number of DescribeTable calls your application makes, but in some cases you may need to get the most up-to-date table information from the service (for example, if you are developing a generic DynamoDB table scanner utility). You have two options: periodically clear the table metadata cache or disable the SDK Cache.

The first approach is to call Table.ClearTableCache(), a static method on the Table class. This operation will clear out the entire table metadata cache, so any Table or DynamoDBContext objects you create after this point will result in one new DescribeTable call per table. (Of course, after the data is retrieved once, it will again be stored in the cache. This approach will work only if you know when your table metadata changes and clear the cache intermittently.)
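
Here is a brief sketch of that first approach:

// Clear all cached table metadata. The next LoadTable call for each
// table results in a fresh DescribeTable request.
Table.ClearTableCache();
var table = Table.LoadTable(ddbClient, "TestTable");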

The second approach is to disable the SDK Cache, forcing the SDK to always retrieve the current table configuration. This can be accomplished through code or the app.config/web.config file, as illustrated below. (Disabling the SDK Cache will revert to version 2 behavior, so unless you hold on to the Table or DynamoDBContext objects as you create them, your application will end up making DescribeTable service calls.)

Disabling the cache through code:

// Disable SDK Cache for the entire application
AWSConfigs.UseSdkCache = false;

Disabling the cache through app.config:

<configuration>
  <appSettings>
    <!-- Disables SDK Cache for the entire application -->
    <add key="AWSCache" value="false" />
  </appSettings>
</configuration>

Version 3 of the AWS SDK for .NET Out of Preview

by Norm Johanson

Back in February, we announced our intention to release a new major version of the AWS SDK for .NET. In April, we released a preview on NuGet. After receiving great feedback from users, today we are taking version 3 of the AWS SDK for .NET out of preview. This means the preview flag has been removed from the NuGet packages. The SDK is also now included in the MSI installer available from our website.

Version 3 is a new, modularized SDK. Every service is a separate assembly and distributed as a separate NuGet package. Each service has a dependency on a common runtime, AWSSDK.Core. This has been a major request from our users, especially now that AWS has grown to over 50 services. This design also gives SDK users better control over when to upgrade to the newest service updates.

We wanted to make the transition to version 3 as easy as possible, so there are very few breaking changes to the public API. For the full list of changes, see our API Reference, which contains a migration guide.

Our hope is that most users will just need to remove the old version 2 reference and add references to the services they are using. If you are using NuGet to get the SDK, the reference to our core runtime package will be added automatically. If you are getting the SDK from the installer on our website, you will need to add a reference to AWSSDK.Core.
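
For example, from the NuGet Package Manager Console, installing any service package pulls in the core runtime automatically; the Amazon S3 package is shown here:

PM> Install-Package AWSSDK.S3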

Xamarin Preview

We recently announced a public preview of Xamarin support, which is part of version 3. Even though the SDK is now widely available, Xamarin and the Portable Class Library version of the SDK are still in preview. We encourage you to try the new Xamarin support and give us feedback, but we are not ready for users to publish production applications just yet. Users with an immediate need for Windows Phone and Windows Store support should continue using version 2 until the PCL version of the SDK version 3 is production-ready.

PowerShell

With our move to version 3, we have also switched our AWS Tools for Windows PowerShell to the new SDK. The version numbers for AWS SDK for .NET and our AWS Tools for Windows PowerShell are kept in sync, so AWS Tools for Windows PowerShell is getting a major version bump to 3. There are otherwise no major changes to AWS Tools for Windows PowerShell.

Changes to Our Installer

The installer has been updated to contain version 3 of the SDK, but it also contains version 2 for users who are not ready to move to version 3. The Portable Class Library version of the SDK (which includes Xamarin support) is only distributed through NuGet and will not be available through the installer. The Portable Class Library uses platform-specific dependencies which are automatically resolved when references are added through NuGet. This would be a complex process if done manually or without NuGet.

Packages on NuGet

For an up-to-date list of the version 3 NuGet packages, see the NuGet section of the SDK’s GitHub README.md.

Invoking AWS Lambda Functions from Java

by David Murray

AWS Lambda makes it incredibly easy and cost-effective to run your code at arbitrary scale in the cloud. Simply write the handler code for your function and upload it to Lambda. The service takes care of hosting and scaling the function for you. And in case you somehow missed it, it now supports writing function handlers in Java!

Although many use cases for Lambda involve running code in response to triggers from other AWS services like Amazon S3 or Amazon Cognito, you can also invoke Lambda functions directly, making them an easy and elastically scalable way to decompose an application into reusable microservices. In this post, we’ll assume we’ve got a Lambda function named “CountCats” that accepts an S3 bucket and key for an image, analyzes the image to count the number of cats the image contains, and returns that count to the caller. An example request to this service might look like:

{
  "bucketName": "pictures-of-cats",
  "key": "three-cool-cats.jpg"
}

And an example response might look like:

{
  "count": 3
}

To invoke this function from Java code, we’ll first define POJOs representing the input and output JSON:

public class CountCatsInput {

  private String bucketName;
  private String key;

  public String getBucketName() { return bucketName; }
  public void setBucketName(String value) { bucketName = value; }

  public String getKey() { return key; }
  public void setKey(String value) { key = value; }
}

public class CountCatsOutput {

  private int count;

  public int getCount() { return count; }
  public void setCount(int value) { count = value; }
}

Next we’ll define an interface representing our microservice, and annotate it with the name of the Lambda function to invoke when it’s called:

import com.amazonaws.services.lambda.invoke.LambdaFunction;

public interface CatService {
  @LambdaFunction(functionName="CountCats")
  CountCatsOutput countCats(CountCatsInput input);
}

We can then use the LambdaInvokerFactory to create an implementation of this interface that will make calls to our service running on Lambda. (Note that providing a lambdaClient is optional; if one is not provided, a default client will be used.)

import com.amazonaws.services.lambda.AWSLambdaClientBuilder;
import com.amazonaws.services.lambda.invoke.LambdaInvokerFactory;

final CatService catService = LambdaInvokerFactory.builder()
    .lambdaClient(AWSLambdaClientBuilder.defaultClient())
    .build(CatService.class);

Finally, we invoke our service using this proxy object:

CountCatsInput input = new CountCatsInput();
input.setBucketName("pictures-of-cats");
input.setKey("three-cool-cats.jpg");

int cats = catService.countCats(input).getCount();

When called, the input POJO is serialized to JSON and sent to your Lambda function; the function’s result is transparently deserialized back into your output POJO. Details like authentication, timeouts, and retries in case of transient network issues are handled by the underlying AWSLambdaClient.
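
If you need to tune those details, you can build a customized client and hand it to the invoker factory. A sketch with illustrative region and timeout values:

import com.amazonaws.ClientConfiguration;
import com.amazonaws.regions.Regions;
import com.amazonaws.services.lambda.AWSLambda;
import com.amazonaws.services.lambda.AWSLambdaClientBuilder;
import com.amazonaws.services.lambda.invoke.LambdaInvokerFactory;

// Customize the region and socket timeout, then pass the client to the factory.
AWSLambda lambda = AWSLambdaClientBuilder.standard()
    .withRegion(Regions.US_WEST_2)
    .withClientConfiguration(new ClientConfiguration().withSocketTimeout(60 * 1000))
    .build();

final CatService catService = LambdaInvokerFactory.builder()
    .lambdaClient(lambda)
    .build(CatService.class);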

Are you using Lambda to host a microservice and calling it from Java code? Let us know how it’s going in the comments or over on our GitHub repository!