
Updated Amazon Cognito Credentials Provider

by Pavel Safronov

Amazon Cognito allows you to get temporary AWS credentials, so that you don’t have to distribute your own credentials with your application. Last year we added a Cognito credentials provider to the AWS SDK for .NET to simplify this process.

With the latest update to Cognito, we are now making it even easier to use Cognito with your application. Using the latest version of the SDK, you no longer need to specify IAM roles in your application if you have already associated the correct roles with your identity pool.

Below is an example of how you can construct and use the new credentials provider:

CognitoAWSCredentials credentials = new CognitoAWSCredentials(
    identityPoolId,   // identity pool id
    region);          // identity pool region

using (var s3Client = new AmazonS3Client(credentials))
{
    // Use the client as usual; for example, list your buckets
    var response = s3Client.ListBuckets();
}

Something to note is that even if you have associated roles with an identity pool, you can still specify IAM roles—even ones that are different from the roles configured on the identity pool—when creating these credentials. This gives you finer control over what resources and operations these credentials have access to.
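If you do want to supply roles explicitly, the credentials provider also has a constructor overload that accepts the account ID and the role ARNs directly. A minimal sketch, where the variable values are placeholders you would supply yourself:

```csharp
using Amazon;
using Amazon.CognitoIdentity;

// All values below are placeholders
CognitoAWSCredentials credentials = new CognitoAWSCredentials(
    accountId,        // AWS account id
    identityPoolId,   // identity pool id
    unAuthRoleArn,    // IAM role used for unauthenticated identities
    authRoleArn,      // IAM role used for authenticated identities
    region);          // identity pool region
```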

Cross-Account IAM Roles in Windows PowerShell

by Brian Beach

As a company’s adoption of Amazon Web Services (AWS) grows, most customers adopt a multi-account strategy. Some customers choose to create an account for each application, while others create an account for each business unit or environment (development, testing, production). Whatever the strategy, there is often a use case that requires access to multiple accounts at once. This post examines cross-account access and the AssumeRole API, exposed in Windows PowerShell as the Use-STSRole cmdlet.

A role consists of a set of permissions that grant access to actions or resources in AWS. An application uses a role by calling the AssumeRole API function. The function returns a set of temporary credentials that the application can use in subsequent function calls. Cross-account roles allow an application in one account to assume a role (and act on resources) in another account.
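For reference, the same AssumeRole call can be sketched with the AWS SDK for .NET (the role ARN and session name below are placeholders):

```csharp
using Amazon.SecurityToken;
using Amazon.SecurityToken.Model;

var stsClient = new AmazonSecurityTokenServiceClient();

// Request temporary credentials for a role in another account
var response = stsClient.AssumeRole(new AssumeRoleRequest
{
    RoleArn = "arn:aws:iam::222222222222:role/SomeRole", // placeholder ARN
    RoleSessionName = "example-session"
});

// Temporary credentials for use in subsequent service calls
var temporaryCredentials = response.Credentials;
```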

One common example of cross-account access is maintaining a configuration management database (CMDB). Most large enterprise customers have a requirement that all servers, including EC2 instances, must be tracked in the CMDB. Example Corp., shown in Figure 1, has a Payer account and three linked accounts: Development, Testing, and Production.

Figure 1: Multiple Accounts Owned by a Single Customer

Note that linked accounts are not required to use cross-account roles, but they are often used together. You can use cross-account roles to access accounts that are not part of a consolidated billing relationship or between accounts owned by different companies. See the user guide to learn more about linked accounts and consolidated billing.


Bob, a Windows administrator at Example Corp., is tasked with maintaining an inventory of all the instances in each account. Specifically, he needs to send a list of all EC2 instances in all accounts to the CMDB team each night. He plans to create a Windows PowerShell script to do this.

Bob could create an IAM user in each account and hard-code the credentials in the script. Though this would be simple, hard-coding credentials is not the most secure solution. The AWS best practice is to Use IAM roles. Bob is familiar with IAM roles for Amazon EC2 and wants to learn more about cross-account roles.

Bob plans to script the process shown in Figure 2. The CMDB script will run on an EC2 instance using the CMDBApplication role. For each account, the script will call Use-STSRole to retrieve a set of temporary credentials for the CMDBDiscovery role. The script will then iterate over each region and call Get-EC2Instance using the CMDBDiscovery credentials to access the appropriate account and list all of its instances.

Figure 2: CMDB Application and Supporting IAM Roles

Creating IAM Roles

Bob begins to build his solution by creating the IAM roles shown in Figure 3. The Windows PowerShell script will run on a new EC2 instance in the Payer account. Bob creates a CMDBApplication role in the Payer account. This role is used by the EC2 instance, allowing the script to run without requiring IAM user credentials. In addition, Bob will create a CMDBDiscovery role in every account. The CMDBDiscovery role has permission to list (or “discover”) the instances in that account.

Figure 3: CMDB Application and Supporting IAM Roles

Notice that Bob has created two roles in the Payer account: CMDBApplication and CMDBDiscovery. You may be asking why he needs a cross-account role in the same account as the application. Creating the CMDBDiscovery role in every account makes the code easier to write because all accounts are treated the same. Bob can treat the Payer account just like any of the other accounts without a special code branch.

Bob first creates the Amazon EC2 role, CMDBApplication, in the Payer account. This role will be used by the EC2 instance that runs the Windows PowerShell script. Bob signs in to the AWS Management Console for the Payer account and follows the instructions to create a new IAM Role for Amazon EC2 with the following custom policy.

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["sts:AssumeRole"],
      "Resource": ["*"]
    }
  ]
}
Policy 1: Policy Definition for the CMDBApplication IAM Role

The CMDBApplication role grants a single permission, sts:AssumeRole, which allows the application to call the AssumeRole API to get temporary credentials for another account. Notice that Bob is following the best practice of Least Privilege and has assigned only one permission to the application.

Next, Bob creates a cross-account role called CMDBDiscovery in each of the accounts, including the Payer account. This role will be used to list the EC2 instances in that account. Bob signs in to the console for each account and follows the instructions to create a new IAM role for cross-account access. In the wizard, Bob supplies the account ID of the Payer account (111111111111 in our example) and specifies the following custom policy.

Note that when creating the role, there are two options. One provides access between accounts you own, and the other provides access from a third-party account. Third-party account roles include an external ID, which is outside the scope of this article. Bob chooses the first option because his company owns both accounts.

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["ec2:DescribeInstances"],
      "Resource": ["*"]
    }
  ]
}

Policy 2: Policy Definition for the CMDBDiscovery IAM Role

Again, this policy follows the best practice of least privilege and assigns a single permission, ec2:DescribeInstances, which allows the caller to list the EC2 instances in the account.

Creating the CMDB Script

With the IAM roles created, Bob next launches a new EC2 instance in the Payer account. This instance will use the CMDBApplication role. When the instance is ready, Bob signs in and creates a Windows PowerShell script that will list the instances in each account and region.

The first part of the script, shown in Listing 1, lists the instances in a given region and account. Notice that in addition to the account number and region, the function expects a set of credentials. These credentials represent the CMDBDiscovery role and will be retrieved from the AssumeRole API in the second part of the script.

Function ListInstances {
    Param($Credentials, $Account, $Region)
    #List all instances in the region
    (Get-EC2Instance -Credential $Credentials -Region $Region).Instances | % {
        If($Instance = $_) {
            #If there are instances in this region return the desired attributes
            New-Object PSObject -Property @{
                Account    = $Account
                Region     = $Region
                InstanceId = $Instance.InstanceId
                Name       = ($Instance.Tags | Where-Object {$_.Key -eq 'Name'}).Value
            }
        }
    }
}

Listing 1: Windows PowerShell Function to List EC2 Instances

The magic happens in the second part of the script, shown in Listing 2. We know that the script is running on the new EC2 instance using the CMDBApplication role. Remember that the only thing this role can do is call the AssumeRole API. Therefore, we should expect to see a call to AssumeRole. The Windows PowerShell cmdlet that implements AssumeRole is Use-STSRole.

#Region used for the Use-STSRole (STS) call
$Region = 'us-east-1'
#List of accounts to check
$Accounts = @(111111111111, 222222222222, 333333333333, 444444444444)
#Iterate over each account
$Accounts | % {
    $Account = $_
    $RoleArn = "arn:aws:iam::${Account}:role/CMDBDiscovery"
    #Request temporary credentials for each account and create a credential object
    $Response = (Use-STSRole -Region $Region -RoleArn $RoleArn -RoleSessionName 'CMDB').Credentials
    $Credentials = New-AWSCredentials -AccessKey $Response.AccessKeyId -SecretKey $Response.SecretAccessKey -SessionToken $Response.SessionToken
    #Iterate over all regions and list instances
    Get-AWSRegion | % {
        ListInstances -Credentials $Credentials -Account $Account -Region $_.Region
    }
} | ConvertTo-Csv

Listing 2: Windows PowerShell Script That Calls AssumeRole

Use-STSRole will retrieve temporary credentials for the IAM role specified in the -RoleArn parameter. The ARN uses the following format, where “ROLE_NAME” is the role you created in “TARGET_ACCOUNT_NUMBER” (e.g., CMDBDiscovery):

arn:aws:iam::TARGET_ACCOUNT_NUMBER:role/ROLE_NAME

Use-STSRole will return an AccessKey, a SecretKey, and a SessionToken that can be used to access the account specified in the role ARN. The script uses this information to create a new credential object, which it passes to ListInstances. ListInstances uses the credential object to list EC2 instances in the account specified in the role ARN.

That’s all there is to it. Bob creates a scheduled task that executes this script each night and sends the results to the CMDB team. When the company adds additional accounts, Bob simply adds the CMDBDiscovery role to the new account and updates the account list in his script.


Cross-account roles are a valuable tool for large customers with multiple accounts. Roles provide temporary credentials that a user or application in one account can use to access resources in another account. These temporary credentials do not need to be stored or rotated, resulting in a secure and maintainable architecture.

Alternative Formatting for Metrics Data in .NET SDK Logs

by Jim Flanagan

The AWS SDK for .NET has had response logging and performance metrics logging since before version 2.0. We introduced SDK logging and metrics output in an earlier post. You might want to skim that as a refresher.

The metrics data is included in the logs in a human-readable format, but SDK users who aggregate, analyze, and report on this data have had to implement their own parsers to extract the data from the logs, which takes time and can be error prone. So, we’ve added an alternative format that emits the metrics data into the log as JSON.

If you need a format other than JSON, or if you only need to log a subset of the metrics, the SDK now also has a mechanism to add a custom formatter.

Switching to JSON is done through the App.config or Web.config file. The SDK’s application configuration has changed a little since the aforementioned post, though for the sake of backward compatibility the original configuration settings still work. To use the JSON setting, however, you’ll have to adopt the new style of configuration, at least for the logging section.

The old style of configuration was a set of flat key-value pairs in the <appSettings> section, like this:

<appSettings>
    <add key="AWSLogging" value="SystemDiagnostics" />
    <add key="AWSLogMetrics" value="true" />
    <add key="AWSResponseLogging" value="OnError" />
</appSettings>

The new configuration uses a custom section for the SDK with a structured format, like this:

<configuration>
    <configSections>
        <section name="aws" type="Amazon.AWSSection, AWSSDK" />
    </configSections>
    <aws region="us-west-2">
        <logging logTo="SystemDiagnostics"
                 logMetricsFormat="JSON" />
    </aws>
</configuration>

You can see that this configuration selects the JSON formatting. The rest of the logging configuration, including selection of System.Diagnostics or Log4Net, is the same as specified in the introductory logging post.

Creating a custom formatter is easy too. First, you need to implement the Amazon.Runtime.IMetricsFormatter interface, specifying a single method that takes in an Amazon.Runtime.IRequestMetrics and returns a string. Here’s a trivial example that prints out a single metric for a request:

using System.Linq;
using Amazon.Runtime;

namespace MyLib.Util
{
    public class MyMetricsFormatter : IMetricsFormatter
    {
        public string FormatMetrics(IRequestMetrics metrics)
        {
            var fmt = string.Empty;
            if (metrics.Timings.ContainsKey(Metric.ResponseProcessingTime))
            {
                // Timings values are lists; take the most recent timing
                var timing = metrics.Timings[Metric.ResponseProcessingTime].LastOrDefault();

                if (timing != null)
                    fmt = string.Format("ResponseProcessingTime (ms): {0}",
                        timing.ElapsedTime.TotalMilliseconds);
            }
            return fmt;
        }
    }
}

IRequestMetrics exposes three dictionaries of metrics: Properties, Timings, and Counters. The keys for these dictionaries are defined in the Amazon.Runtime.Metric enum. The Properties and Timings dictionaries have lists as values, and the Counters dictionary has long values.

To use a custom formatter, use the logMetricsCustomFormatter configuration, specifying the type and assembly:

<aws region="us-west-2">
    <logging logTo="SystemDiagnostics"
             logMetricsCustomFormatter="MyLib.Util.MyMetricsFormatter, MyLib" />
</aws>

If you want to collect metrics for a subset of services or method calls, your custom formatter can inspect the Metrics.ServiceName and Metrics.MethodName items in the Properties dictionary. The default behavior can be accessed by calling ToString() on the passed in IRequestMetrics. Similarly, you can get the JSON by calling metrics.ToJSON().
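As a sketch of that idea, the following formatter logs the default output only for Amazon S3 requests (the service name string is an assumption; inspect your own logs to confirm the exact value):

```csharp
using System.Linq;
using Amazon.Runtime;

namespace MyLib.Util
{
    public class S3OnlyMetricsFormatter : IMetricsFormatter
    {
        public string FormatMetrics(IRequestMetrics metrics)
        {
            if (metrics.Properties.ContainsKey(Metric.ServiceName))
            {
                // Properties values are lists; take the first entry
                var serviceName = metrics.Properties[Metric.ServiceName]
                    .FirstOrDefault() as string;
                if (serviceName == "AmazonS3")
                    return metrics.ToString(); // default human-readable format
            }
            return string.Empty;
        }
    }
}
```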

Keep in mind that if you have metrics logging enabled and have specified a custom formatter, your formatter will be called for every request, so keep it as simple as possible.

Support for Amazon SNS in the Preview Release of AWS Resource APIs for .NET

by Milind Gokarn

The latest addition to the AWS Resource APIs for .NET is Amazon Simple Notification Service (SNS). Amazon SNS is a web service that enables applications, end-users, and devices to instantly send and receive notifications. In this post, we’ll see how we can use the resource APIs to work with SNS and to publish messages.


The key concept in SNS is a topic. A topic is something that publishers send messages to and subscribers receive messages from. Let’s take a look at how we can create and use a topic.

using Amazon.SimpleNotificationService.Model;
using Amazon.SimpleNotificationService.Resources; // Namespace for SNS resource APIs

// Create an instance of the SNS service 
// You can also use the overload that accepts an instance of the service client.
var sns = new SimpleNotificationService();

// Create a new topic
var topic = sns.CreateTopic("testTopic");

// Check that the topic is now in the list of all topics
// To do this, we can retrieve a list of all topics and check that.
var exists = sns.GetTopics()
    .Any(t => t.Arn.Equals(topic.Arn));
Console.WriteLine("Topic exists = {0}", exists);

// Modify topic attributes
topic.SetAttributes("DisplayName", "Test Topic");

// Subscribe an email endpoint to the topic
topic.Subscribe("test@example.com", "email");

// Wait until the subscription has been confirmed by the endpoint
// WaitForSubscriptionConfirmation();

// Publish a message to the topic
topic.Publish("Test message");

// Delete the topic
topic.Delete();

// Check that the topic is no longer in the list of all topics
exists = sns.GetTopics()
    .Any(t => t.Arn.Equals(topic.Arn));
Console.WriteLine("Topic exists = {0}", exists);

As you can see, it’s easy to get started with and use the new Amazon SNS Resource APIs to work with the service.

Querying the Public IP Address Ranges for AWS

by Steve Roberts

A post on the AWS Official Blog last November noted that the authoritative public IP address ranges used by AWS could now be obtained from a JSON-format file. The same information can now be accessed easily from AWS Tools for Windows PowerShell with a new cmdlet, Get-AWSPublicIpAddressRange, without the need to parse JSON. This cmdlet was added in a recent version of the AWS Tools for Windows PowerShell.

When run with no parameters, the cmdlet outputs all of the address ranges to the pipeline:

PS C:> Get-AWSPublicIpAddressRange

IpPrefix                    Region             Service
--------                    ------             -------
...                         us-east-1          AMAZON
...                         us-east-1          EC2
...                         GLOBAL             ROUTE53
...                         sa-east-1          ROUTE53_HEALTHCHECKS
...                         GLOBAL             CLOUDFRONT

If you’re comfortable using the pipeline to filter output, this may be all you need, but the cmdlet can also filter output using the -ServiceKey and -Region parameters. For example, you can get the address ranges for EC2 across all regions like this (the parameter value is case insensitive):

PS C:> Get-AWSPublicIpAddressRange -ServiceKey ec2

Similarly, you can get the address ranges used by AWS in a given region:

PS C:> Get-AWSPublicIpAddressRange -Region us-west-2

Both of these parameters accept string arrays and can be supplied together. This example shows how to get the address ranges for Amazon EC2 and Amazon Route53 health checks in both US West regions:

PS C:> Get-AWSPublicIpAddressRange -ServiceKey ec2,route53_healthchecks -Region us-west-1,us-west-2

IpPrefix                    Region              Service
--------                    ------              -------
...                         us-west-1           EC2
...                         us-west-2           EC2
...                         us-west-1           ROUTE53_HEALTHCHECKS
...                         us-west-2           ROUTE53_HEALTHCHECKS

As noted in the original post, this information can change several times per week. You can find the publication date and time of the current information using the -OutputPublicationDate switch. The returned value here is a DateTime object:

PS C:> Get-AWSPublicIpAddressRange -OutputPublicationDate

Monday, December 15, 2014 4:41:01 PM

The set of service keys may change over time (see AWS IP Address Ranges for current documentation on this information). The current set of keys in use in the file can be obtained using the -OutputServiceKeys switch:

PS C:> Get-AWSPublicIpAddressRange -OutputServiceKeys


If you’ve read this far and are thinking that this would also be useful for your C#/.NET applications, then you’ll be glad to know it’s also exposed in the AWS SDK for .NET. See the AWSPublicIpAddressRanges class in the Amazon.Util namespace for more details.
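A minimal C# sketch of the same query (this assumes the Load factory method and AllRanges collection on that class; check the class documentation for the exact member names):

```csharp
using System;
using Amazon.Util;

// Download and parse the current address ranges file
var addressRanges = AWSPublicIpAddressRanges.Load();

// Print the EC2 ranges, analogous to -ServiceKey ec2 in PowerShell
foreach (var range in addressRanges.AllRanges)
{
    if (string.Equals(range.Service, "EC2", StringComparison.OrdinalIgnoreCase))
        Console.WriteLine("{0} {1}", range.IpPrefix, range.Region);
}
```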

We hope you find this new capability useful in your scripts. If you have ideas for other cmdlets that you would find useful, be sure to leave a comment!

Caching Amazon Cognito Identity IDs

by Norm Johanson

Amazon Cognito is a service that you can use to deliver AWS credentials to your mobile and desktop applications without embedding them in your code. A few months ago, we added a credentials provider for Cognito. In version 2.3.14 of the AWS SDK for .NET, we updated the credentials provider to support caching the identity ID that Cognito creates.

Caching IDs is really useful for mobile and desktop applications where you don’t want to require users to authenticate but need to remember the user for each run of the application. For example, if you have a game whose scores you want to store in Amazon S3, you can use the identity ID as the object key in S3. Then, in future runs of the game, you can use the identity ID to get the scores back from S3. To get the current identity ID, call the GetIdentityId method on the credentials provider. You can also use the identity ID in the AWS Identity and Access Management (IAM) role that Cognito is using to restrict access to only the current user’s score. Below is a policy that shows how to use the Cognito identity ID: it references the variable ${cognito-identity.amazonaws.com:sub}, which is replaced with the current user’s identity ID when the policy is evaluated.

{
    "Version" : "2012-10-17",
    "Statement" : [
        {
            "Sid" : "1",
            "Effect" : "Allow",
            "Action" : [ ... ],
            "Resource" : "*"
        },
        {
            "Sid" : "2",
            "Effect" : "Allow",
            "Action" : ["s3:PutObject", "s3:GetObject"],
            "Resource" : "arn:aws:s3:::my-game-scores-bucket/scores/${cognito-identity.amazonaws.com:sub}.json"
        }
    ]
}

In the Windows Phone and Windows Store version of the SDK, caching is controlled by the IdentityIdCacheMode property on Amazon.CognitoIdentity.CognitoAWSCredentials. By default, this property is set to LocalSettings, which means the identity ID will be cached local to just the device. Windows.Storage.ApplicationData.Current.LocalSettings is used to cache the identity ID. It can also be set to RoamingSettings, which means the identity ID will be stored in Windows.Storage.ApplicationData.Current.RoamingSettings, and the Windows Runtime will sync data stored in this collection to other devices where the user is logged in. To turn off caching, set IdentityIdCacheMode to None.

To enable caching for the .NET 3.5 and 4.5 versions of the SDK, you need to extend the Amazon.CognitoIdentity.CognitoAWSCredentials class and implement the GetCachedIdentityId, CacheIdentityId, and ClearIdentityCache methods.
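As a sketch of what such a subclass could look like for a desktop application, the example below caches the identity ID in a local file. The file location and plain-text storage are illustrative choices, not SDK behavior:

```csharp
using System.IO;
using Amazon;
using Amazon.CognitoIdentity;

public class FileCachingCognitoAWSCredentials : CognitoAWSCredentials
{
    // Illustrative cache location; choose a path appropriate for your app
    private static readonly string CacheFile =
        Path.Combine(Path.GetTempPath(), "cognito-identity-id.txt");

    public FileCachingCognitoAWSCredentials(string identityPoolId, RegionEndpoint region)
        : base(identityPoolId, region) { }

    public override string GetCachedIdentityId()
    {
        return File.Exists(CacheFile) ? File.ReadAllText(CacheFile) : null;
    }

    public override void CacheIdentityId(string identityId)
    {
        File.WriteAllText(CacheFile, identityId);
    }

    public override void ClearIdentityCache()
    {
        if (File.Exists(CacheFile))
            File.Delete(CacheFile);
    }
}
```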

New AWS Elastic Beanstalk Deployment Wizard

Today, we released version 1.8 of the AWS Toolkit for Visual Studio. For this release, we revamped our wizard to deploy your ASP.NET Applications. Our goal was to make deployment easier as well as take advantage of some of the new features AWS Elastic Beanstalk has added.

What happened to the AWS CloudFormation deployments?

Unlike the new deployment wizard, the previous wizard had the option to deploy with the Load Balanced and Single Instance templates, which deployed using AWS CloudFormation. This deployment option was added before we had Elastic Beanstalk, which has since added features that make these templates obsolete. If you still need access to this deployment mechanism, you can choose to relaunch the legacy wizard on the first page of the new wizard.

So what’s new?

Rolling deployments

If you are deploying your applications to a load balanced environment, you can configure how new versions of your applications are deployed to the instances in your environment. You can also configure how changes to your environment are made. For example, if you have 4 instances in your environment and you want to change the instance type, you can configure the environment to change 2 instances at a time keeping your application up and running while the change is being made.

AWS Identity and Access Management roles

AWS Identity and Access Management roles are an important way of getting AWS credentials to your deployed application. With the new wizard, you can select an existing role or choose to create a new role based on a number of role templates. It is easy in the new wizard to set up a new role that gives access to Amazon S3 and DynamoDB. After deployment, you can refine the role from the AWS Explorer.

Application options

The application options page has several new features. You can now choose which build configuration to use. You can also set any application settings you want to be pushed into the web.config appSettings section when the application is being deployed.

In the previous deployment wizard, applications were deployed to a subfolder in IIS based on the project name with the suffix "_deploy". The application appeared to be deployed at the root because URL rewrite rules were added at the root. This worked in most cases, but there were edge cases where it caused problems. With the new wizard, applications can be configured to deploy to any folder, and by default an application is deployed at the root folder of IIS. If the application is deployed anywhere other than the root, the URL rewrite rules are added to the root.


We hope that you like the new wizard and that it makes things easier for you. For a full walkthrough of the new wizard, check out the user guide for the AWS Toolkit for Visual Studio. We would love to hear your feedback on the new wizard. We would also love to hear about any interesting deployment issues you have and where you would like help from AWS .NET tooling.


Amazon EC2 ImageUtilities and Get-EC2ImageByName Updates

Versions 2.3.14 of the AWS SDK for .NET and AWS Tools for Windows PowerShell, released today (December 18, 2014), contain updates to the image utilities and the Get-EC2ImageByName cmdlet used to query common Microsoft Windows 64-bit Amazon Machine Images using version-independent names. Briefly, we renamed some of the keys used to identify Microsoft Windows Server 2008 images to address confusion over which versions are actually returned, and we added the ability to retrieve some additional images. In the Get-EC2ImageByName cmdlet, we also made a small behavior change to help when running the cmdlet in a pipeline and more than one image version exists (as happens when Amazon periodically revises the images): by default, the cmdlet now outputs only the latest image. The previous behavior, which output all available versions (latest plus prior), can be enabled using a new switch.

Renamed and New Image Keys

This change affects both the SDK Amazon.EC2.Util.ImageUtilities class and the Get-EC2ImageByName cmdlet. For some time now, the keys prefixed with Windows_2008_* have returned Microsoft Windows Server 2008 R2 images, not the original Windows Server 2008 editions, leading to some confusion. We addressed this by adding a new set of R2-specific keys—these all have the prefix Windows_2008R2_*. To maintain backward compatibility, the SDK retains the old keys, but we have tagged them with the [Obsolete] attribute and a message detailing the corresponding R2-based key you should use. Additionally, these old keys will still return Windows Server 2008 R2 images.
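In SDK code, these keys surface as image descriptors on the ImageUtilities class. A sketch of looking up an image with one of the new R2-specific keys (the descriptor name here is inferred from the Windows_2008R2_* key prefix; check the class reference for the exact member):

```csharp
using System;
using Amazon.EC2;
using Amazon.EC2.Util;

var ec2Client = new AmazonEC2Client();

// Find the latest Windows Server 2008 R2 base image
var image = ImageUtilities.FindImage(ec2Client, ImageUtilities.WINDOWS_2008R2_BASE);
Console.WriteLine(image.ImageId);
```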

Note that the Get-EC2ImageByName cmdlet will not display the obsolete keys (when run with no parameters), but you can still supply them for the -Name parameter so your existing scripts will continue to function.

We also added three new keys enabling you to retrieve 64-bit Windows Server 2008 SP3 editions (base image, plus SQL Server 2008 Standard and SQL Server 2008 Express images). The keys for these images are WINDOWS_2008RTM_BASE, WINDOWS_2008RTM_SQL_SERVER_EXPRESS_2008, and WINDOWS_2008RTM_SQL_SERVER_STANDARD_2008.

The following keys are displayed when you run the cmdlet with no parameters:

PS C:> Get-EC2ImageByName

The following keys are deprecated but still recognized:


Get-EC2ImageByName Enhancements

Amazon periodically revises the set of Microsoft Windows images it makes available to customers, and for a period the Get-EC2ImageByName cmdlet could return the latest image for a key plus one or more prior versions. For example, at the time of writing this post, running the command Get-EC2ImageByName -Name windows_2012r2_base emitted two images as output. If run in a pipeline that then invokes the New-EC2Instance cmdlet, instances of multiple images could be started, which is probably not what was expected. To obtain and start only the latest image, you had to either index the returned collection, which could contain one or several objects, or insert a call to Select-Object in your pipeline to extract the first item before calling New-EC2Instance (the first item in the output from Get-EC2ImageByName is always the latest version).

With the new release, when a single key is supplied to the -Name parameter, the cmdlet emits only the single latest machine image that is available. This makes using the cmdlet in a ‘get | start’ pattern much safer and more convenient:

# guaranteed to only return one image to launch
PS C:> Get-EC2ImageByName -Name windows_2012r2_base | New-EC2Instance -InstanceType t1.micro ...

If you do need to get all versions of a given image, use the new -AllAvailable switch. The following command outputs all available versions of the Windows Server 2012 R2 base image, which may be one or several images:

PS C:> Get-EC2ImageByName -Name windows_2012r2_base -AllAvailable

The cmdlet can also emit all available versions when either more than one value is supplied for the -Name parameter or a custom key value is supplied, as it is assumed in these scenarios you are expecting a collection to work with:

# use of multiple keys (custom or built-in) yields all versions
PS C:> Get-EC2ImageByName -Name windows_2012r2_base,windows_2008r2_base

# use of a custom key, single or multiple, yields all versions
PS C:> Get-EC2ImageByName -Name "Windows_Server-2003*"

These updates to the Get-EC2ImageByName cmdlet were driven in part by feedback from our users. If you have an idea or suggestion for new features that would make your scripting life easier, please get in touch with us, for example via the AWS PowerShell Scripting forum.

Preview release of AWS Resource APIs for .NET

by Milind Gokarn

We have released a preview of the AWS Resource APIs for .NET, a brand new high-level API. The latest version of the preview ships with resource APIs for the following AWS services; support for other services will be added in the near future.

  • Amazon Glacier
  • Amazon Simple Notification Service (SNS)
  • Amazon Simple Queue Service (SQS)
  • AWS CloudFormation
  • AWS Identity and Access Management (IAM)

The goal of this preview is to provide early access to the new API and to get feedback from you that we can incorporate in the GA release. The source code for the preview is available as a new branch of the aws-sdk-net GitHub repository, and the binaries are available for download.

The resource APIs allow you to work more directly with the resources that are managed by AWS services. A resource is a logical object exposed by an AWS service’s API. For example, User, Group, and Role are some of the resources exposed by the IAM service. Here are the benefits of using the resource APIs:

Easy to understand

The low-level APIs are request-response style APIs that correspond to the actions exposed by an AWS service. The resource APIs are higher-level, object-oriented APIs that represent the logical relationships between the resources within a service. When you work with a resource object, only the operations and relationships applicable to it are visible, in contrast to the low-level API, where you can see all the operations for a service on the service client object. This makes it easier to understand and explore the features of a service.

Write less code

The resource APIs reduce the amount of code you need to write to achieve the same results.

  • Operations on resource objects infer identifier parameters from their current context. This allows you to write code in which you don’t have to specify identifiers repeatedly.

    // No need to specify ResyncMFADeviceRequest.UserName
    // as it is inferred from the user object
    user.Resync(new ResyncMFADeviceRequest
    {
        SerialNumber = "",
        AuthenticationCode1 = "",
        AuthenticationCode2 = ""
    });
  • Simplified method overloads eliminate the need to create request objects for commonly used and mandatory request parameters. For more complex usages, you can still use the overload that accepts a request object.

    group.AddUser(user.Name); // Use this simplified overload instead of  
    group.AddUser(new AddUserToGroupRequest { UserName = user.Name});  
  • Auto pagination for operations that support paging – the resource APIs make multiple service calls as you enumerate through the results, so you do not have to write additional code to make repeated calls or to capture and resend pagination tokens.
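
The token loop that the resource APIs run for you can be sketched in plain C#. Here, PagedSource is a hypothetical stand-in for a paging service API (it is not part of the SDK); Enumerate shows the capture-and-resend pattern that methods like GetUsers handle internally:

```csharp
using System;
using System.Collections.Generic;

// Hypothetical stand-in for a paging service API: each call returns one
// page of results plus a token for the next page (null when done).
class PagedSource
{
    private readonly List<string[]> pages = new List<string[]>
    {
        new[] { "a", "b" },
        new[] { "c", "d" },
        new[] { "e" }
    };

    public (string[] Items, int? NextToken) GetPage(int? token)
    {
        int index = token ?? 0;
        int? next = index + 1 < pages.Count ? index + 1 : (int?)null;
        return (pages[index], next);
    }

    // What the resource APIs do for you: keep calling GetPage and
    // resending the token until the service reports no more results.
    public IEnumerable<string> Enumerate()
    {
        int? token = null;
        do
        {
            var (items, next) = GetPage(token);
            foreach (var item in items)
                yield return item;
            token = next;
        } while (token != null);
    }
}

class Program
{
    static void Main()
    {
        // The caller just enumerates; the paging loop is hidden.
        Console.WriteLine(string.Join(",", new PagedSource().Enumerate()));
    }
}
```
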

Using the API

The entry point for using the resource APIs is the service object. It represents an AWS service itself, in this case IAM. Using the service object, you can access top-level resources and operations on a service. Once you get the resource objects, further operations can be performed on them. The following code demonstrates various API usages with IAM and resource objects.

using Amazon.IdentityManagement.Model;
using Amazon.IdentityManagement.Resources; // Namespace for IAM resource APIs


// AWS credentials or profile is picked up from app.config 
var iam = new IdentityManagementService();            

// Get a group by its name
var adminGroup = iam.GetGroupByName("admins");

// List all users in the admins group.          
// GetUsers() calls an API that supports paging and 
// automatically makes multiple service calls if
// more results are available as we enumerate
// through the results.
foreach (var user in adminGroup.GetUsers())
{
    Console.WriteLine(user.Name);
}

// Create a new user and add the user to the admins group
var userA = iam.CreateUser("Alice");
adminGroup.AddUser(userA.Name);

// Create a new access key for a user
var userB = iam.GetUserByName("Bob");
var accessKey = userB.CreateAccessKey();

// Deactivate all MFA devices for a user
var userC = iam.GetUserByName("Charlie");
foreach (var mfaDevice in userC.GetMfaDevices())
{
    mfaDevice.Deactivate();
}

// Get an existing policy for a user
var policy = userC.GetUserPolicyByName("S3AccessPolicy");

The AWS SDK for .NET Developer Guide has code examples and more information about the resource APIs. We would really like to hear your feedback and suggestions about this new API. You can provide your feedback through GitHub and the AWS forums.

DynamoDB JSON Support

by Pavel Safronov | on | in .NET | Permalink | Comments |  Share

The latest Amazon DynamoDB update added support for JSON data, making it easy to store JSON documents in a DynamoDB table while preserving their complex and possibly nested shape. Now, the AWS SDK for .NET has added native JSON support, so you can use raw JSON data when working with DynamoDB. This is especially helpful if your application needs to consume or produce JSON—for instance, if your application is talking to a client-side component that uses JSON to send and receive data—as you no longer need to manually parse or compose this data.

Using the new features

The new JSON functionality is exposed in the AWS SDK for .NET through the Document class:

  • ToJson – This method converts a given Document to its JSON representation
  • FromJson – This method creates a Document for a given JSON string

Here’s a quick example of this feature in action.

// Create a Document from JSON data
var jsonDoc = Document.FromJson(json);

// Use the Document as an attribute
var doc = new Document();
doc["Id"] = 123;
doc["NestedDocument"] = jsonDoc;

// Put the item (table is a Table instance obtained via
// Table.LoadTable(dynamoDbClient, "SampleTable"))
table.PutItem(doc);

// Load the item
doc = table.GetItem(123);

// Convert the Document to JSON
var jsonText = doc.ToJson();
var jsonPrettyText = doc["NestedDocument"].AsDocument().ToJsonPretty();

This example shows how a JSON-based Document can be used as an attribute, but you can also use the converted Document directly, provided that it has the necessary key attributes.
Also note that we have introduced two methods: ToJson and ToJsonPretty. The difference between them is that the latter produces indented JSON that is easier to read.
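
To illustrate the difference, here is a minimal sketch using System.Text.Json as a stand-in (the SDK’s ToJson/ToJsonPretty are not implemented this way; this only shows compact versus indented output):

```csharp
using System;
using System.Text.Json;

class Program
{
    static void Main()
    {
        var doc = new { Id = 123, Name = "Widget" };

        // Compact, single-line JSON, analogous to ToJson
        string compact = JsonSerializer.Serialize(doc);

        // Indented JSON, analogous to ToJsonPretty
        string pretty = JsonSerializer.Serialize(doc,
            new JsonSerializerOptions { WriteIndented = true });

        Console.WriteLine(compact);
        Console.WriteLine(pretty);
    }
}
```
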

JSON types

DynamoDB data types are a superset of JSON data types. This means that all JSON data can be represented as DynamoDB data, while the opposite isn’t true.

So if you perform the conversion JSON -> Document -> JSON, the starting and final JSON will be identical (except for formatting). However, since not all DynamoDB data types can be converted to JSON, the conversion Document -> JSON -> Document may result in a different representation of your data.

The differences between DynamoDB and JSON are:

  • JSON has no sets, just arrays, so DynamoDB sets (SS, NS, and BS types) will be converted to JSON arrays.
  • JSON has no binary representation, so DynamoDB binary scalars and sets (B and BS types) will be converted to base64-encoded JSON strings or lists of strings.

If you do end up with a Document instance that has base64-encoded data, we have provided a method on the Document object to decode this data and replace it with the correct binary representation. Here is a simple example:

doc.DecodeBase64Attributes("Data", "DataSet");

After executing the above code, the "Data" attribute will contain binary data, while the "DataSet" attribute will contain a list of binary data.
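
The base64 round trip itself is plain .NET. This sketch is independent of the SDK; it shows what happens to binary data on the way to JSON, and the decoding step that DecodeBase64Attributes effectively performs for you:

```csharp
using System;

class Program
{
    static void Main()
    {
        byte[] binary = { 0x01, 0x02, 0xFF };

        // A binary attribute becomes a base64 string in the JSON output
        string encoded = Convert.ToBase64String(binary);
        Console.WriteLine(encoded);         // prints "AQL/"

        // Decoding restores the original bytes, which is what
        // DecodeBase64Attributes does for the named attributes
        byte[] decoded = Convert.FromBase64String(encoded);
        Console.WriteLine(decoded.Length);  // prints 3
    }
}
```
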

I hope you find this feature a useful addition to the AWS SDK for .NET. Please give it a try and let us know what you think on GitHub or here in the comments!