AWS Developer Blog

Creating .NET Core AWS Lambda Projects without Visual Studio

by Norm Johanson | in .NET

In the last post, we talked about AWS Lambda deployment integration with the dotnet CLI, using the Amazon.Lambda.Tools NuGet package to deploy Lambda functions and serverless applications. But what if you want to create an AWS Lambda project outside of Visual Studio? This is especially important if you’re working on platforms other than Windows.

The “dotnet new” Command

The dotnet CLI has a command named new that you can use to create .NET Core projects from the command line. By default, it includes templates for many of the common project types.

C:\BlogContent> dotnet new -all                                                                                                             
Template Instantiation Commands for .NET Core CLI.                                                                            
                                                   
Templates                 Short Name       Language      Tags                                                                 
------------------------------------------------------------------------------------------------------                                                      
Console Application       console          [C#], F#      Common/Console                                                       
Class library             classlib         [C#], F#      Common/Library                                                       
Unit Test Project         mstest           [C#], F#      Test/MSTest                                                          
xUnit Test Project        xunit            [C#], F#      Test/xUnit                                                           
ASP.NET Core Empty        web              [C#]          Web/Empty                                                            
ASP.NET Core Web App      mvc              [C#], F#      Web/MVC                                                              
ASP.NET Core Web API      webapi           [C#]          Web/WebAPI                                                           
Nuget Config              nugetconfig                    Config                                                               
Web Config                webconfig                      Config                                                               
Solution File             sln                            Solution                                                             
                                                                                                                             
Examples:                                                                                                                     
    dotnet new mvc --auth None --framework netcoreapp1.1                                                                      
    dotnet new mvc --framework netcoreapp1.1                                                                                  
    dotnet new --help   

The new command can also install additional project types from NuGet. We recently released a NuGet package named Amazon.Lambda.Templates that wraps all the templates we expose in Visual Studio into project types you can create from the dotnet CLI. To install this NuGet package, run the following command.

dotnet new -i Amazon.Lambda.Templates::*

The trailing ::* in the command specifies to install the latest version. Once the install is complete, the Lambda templates show up as part of dotnet new.

C:\BlogContent> dotnet new -all                                                                                                             
Template Instantiation Commands for .NET Core CLI.                                                                            
                                                                                                                             
Templates                            Short Name                    Language      Tags                                         
------------------------------------------------------------------------------------------------------                                                      
Lambda Detect Image Labels           lambda.DetectImageLabels      [C#]          AWS/Lambda/Function                          
Lambda Empty Function                lambda.EmptyFunction          [C#]          AWS/Lambda/Function                          
Lambda Simple DynamoDB Function      lambda.DynamoDB               [C#]          AWS/Lambda/Function                          
Lambda Simple Kinesis Function       lambda.Kinesis                [C#]          AWS/Lambda/Function                          
Lambda Simple S3 Function            lambda.S3                     [C#]          AWS/Lambda/Function                          
Lambda ASP.NET Core Web API          lambda.AspNetCoreWebAPI       [C#]          AWS/Lambda/Serverless                        
Lambda DynamoDB Blog API             lambda.DynamoDBBlogAPI        [C#]          AWS/Lambda/Serverless                        
Lambda Empty Serverless              lambda.EmptyServerless        [C#]          AWS/Lambda/Serverless                        
Console Application                  console                       [C#], F#      Common/Console                               
Class library                        classlib                      [C#], F#      Common/Library                               
Unit Test Project                    mstest                        [C#], F#      Test/MSTest                                  
xUnit Test Project                   xunit                         [C#], F#      Test/xUnit                                   
ASP.NET Core Empty                   web                           [C#]          Web/Empty                                    
ASP.NET Core Web App                 mvc                           [C#], F#      Web/MVC                                      
ASP.NET Core Web API                 webapi                        [C#]          Web/WebAPI                                   
Nuget Config                         nugetconfig                                 Config                                       
Web Config                           webconfig                                   Config                                       
Solution File                        sln                                         Solution                                     
                                                                                                                             
Examples:                                                                                                                     
    dotnet new mvc --auth None --framework netcoreapp1.1                                                                      
    dotnet new classlib                                                                                                       
    dotnet new --help                                                                                                         
C:\BlogContent>  

To get details about a template, you can use the help command.


dotnet new lambda.EmptyFunction --help

C:\BlogContent> dotnet new lambda.EmptyFunction --help                                                                                                    
Template Instantiation Commands for .NET Core CLI.                                                                                          
                                                                                                                                           
Lambda Empty Function (C#)                                                                                                                  
Author: AWS                                                                                                                                 
Options:                                                                                                                                    
  -p|--profile  The AWS credentials profile set in aws-lambda-tools-defaults.json and used as the default profile when interacting with AWS.
                string - Optional                                                                                                           
                                                                                                                                           
  -r|--region   The AWS region set in aws-lambda-tools-defaults.json and used as the default region when interacting with AWS.              
                string - Optional       

You can see here that the template takes two optional parameters to set the profile and region. These values are written to aws-lambda-tools-defaults.json so that you can start deploying with the Lambda tooling right away.

To create a function, run the following command.

dotnet new lambda.EmptyFunction --name BlogFunction --profile default --region us-east-2

This creates a project for the Lambda function and a test project. We can now use any editor we want to build and test our .NET Core Lambda function. Once we’re ready to deploy the function, we run the following commands.

cd ./BlogFunction/src/BlogFunction
dotnet restore
dotnet lambda deploy-function BlogFunction --function-role TestRole
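
For reference, the handler that the lambda.EmptyFunction template generates looks roughly like the following (a sketch; the file generated in your project may differ slightly). It simply returns its input upper-cased, which explains the output of the invoke call below.

using Amazon.Lambda.Core;

// Register the JSON serializer used to convert the Lambda event JSON to and from .NET types.
[assembly: LambdaSerializer(typeof(Amazon.Lambda.Serialization.Json.JsonSerializer))]

namespace BlogFunction
{
    public class Function
    {
        // The template's sample handler: takes a string and returns it upper-cased.
        public string FunctionHandler(string input, ILambdaContext context)
        {
            return input?.ToUpper();
        }
    }
}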

After deployment, we can even test the function from the command line by using the following command.

dotnet lambda invoke-function BlogFunction --payload "Hello World"
C:\BlogContent> dotnet lambda invoke-function BlogFunction --payload "Hello World"
Payload:
"HELLO WORLD"

Log Tail:
START RequestId: a54b750b-0dca-11e7-9099-27598ea7c35d Version: $LATEST
END RequestId: a54b750b-0dca-11e7-9099-27598ea7c35d
REPORT RequestId: a54b750b-0dca-11e7-9099-27598ea7c35d  Duration: 0.99 ms       Billed Duration: 100 ms         Memory Size: 256 MB     Max Memory Used: 42 MB

Summary

With our Lambda tooling provided by Amazon.Lambda.Tools and our project templates provided by Amazon.Lambda.Templates, you can develop .NET Core Lambda functions on any platform. As always, let us know what you think on our GitHub repository.

Deploying .NET Core AWS Lambda Functions from the Command Line

by Norm Johanson | in .NET

In previous posts about our .NET Core support with AWS Lambda, we’ve shown how you can create Lambda functions and serverless applications with Visual Studio. But one of the most exciting things about .NET Core is its cross-platform support with the new command line interface (CLI) named dotnet. To help you develop Lambda functions outside of Visual Studio, we’ve released the Amazon.Lambda.Tools NuGet package that integrates with the dotnet CLI.

We released Amazon.Lambda.Tools as a preview with our initial release of .NET Core on Lambda. We kept it in preview while .NET Core tooling, including the dotnet CLI, was in preview. With the recent release of Visual Studio 2017, the dotnet CLI and our integration with it is now marked as generally available (GA). If you’re still using preview versions of the dotnet CLI and the pre-Visual Studio 2017 project structure, the GA release of Amazon.Lambda.Tools will still work for those projects.

.NET Core Project Structure

When .NET Core was originally released last summer, you defined a project in a JSON file named project.json. At the time, it was announced that this was temporary, that .NET Core would move in line with other .NET projects and be based on the MSBuild XML format, and that each project would contain a .csproj file. The GA release of the dotnet CLI tooling includes this switch to the MSBuild format.

Amazon.Lambda.Tools Registration

If you create an AWS Lambda project in Visual Studio, the command line integration is set up automatically so that you can easily transition from Visual Studio to the command line. If you inspect a project created in Visual Studio 2017, you’ll notice a DotNetCliToolReference for the Amazon.Lambda.Tools NuGet package.


<Project Sdk="Microsoft.NET.Sdk">

  <PropertyGroup>
    <TargetFramework>netcoreapp1.0</TargetFramework>
  </PropertyGroup>

  <ItemGroup>
    <PackageReference Include="Amazon.Lambda.Core" Version="1.0.0" />
    <PackageReference Include="Amazon.Lambda.Serialization.Json" Version="1.0.1" />
  </ItemGroup>

  <ItemGroup>
    <DotNetCliToolReference Include="Amazon.Lambda.Tools" Version="1.4.0" />
  </ItemGroup>

</Project>

In Visual Studio 2015, which uses the older project.json format, the Amazon.Lambda.Tools package is declared as a build dependency and is also registered in the tools section.


{
  "version": "1.0.0-*",
  "buildOptions": {
  },

  "dependencies": {
    "Microsoft.NETCore.App": {
      "type": "platform",
      "version": "1.0.0"
    },

    "Amazon.Lambda.Core": "1.0.0*",
    "Amazon.Lambda.Serialization.Json": "1.0.1",

    "Amazon.Lambda.Tools" : {
      "type" :"build",
      "version":"1.4.0 "
    }
  },

  "tools": {
    "Amazon.Lambda.Tools" : "1.4.0 "
  },

  "frameworks": {
    "netcoreapp1.0": {
      "imports": "dnxcore50"
    }
  }
}

Adding to an Existing Project

The Amazon.Lambda.Tools NuGet package is marked with the DotNetCliTool package type. Right now, Visual Studio 2017 doesn't understand this new package type, so if you attempt to add the package through Visual Studio's Manage NuGet Packages dialog, it won't be able to add the reference. Until Visual Studio 2017 is updated, you need to manually add the DotNetCliToolReference to the .csproj file.
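
For example, adding the same ItemGroup shown in the Visual Studio 2017 project file above registers the tooling in an existing project:

  <ItemGroup>
    <DotNetCliToolReference Include="Amazon.Lambda.Tools" Version="1.4.0" />
  </ItemGroup>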

Deploying Lambda Functions

All the tooling we developed for Visual Studio to deploy Lambda functions originated in the Amazon.Lambda.Tools package. That means every deployment feature you use inside Visual Studio is also available from the command line.

To get started, in a command window navigate to a project you created in Visual Studio. To see the available commands, enter dotnet lambda help.

C:\BlogContent\BlogExample\BlogExample> dotnet lambda help                                                                                     
AWS Lambda Tools for .NET Core functions                                                                 
Project Home: https://github.com/aws/aws-lambda-dotnet                                                   
                                                                                                        
                                                                                                        
Commands to deploy and manage AWS Lambda functions:                                                      
                                                                                                        
        deploy-function         Command to deploy the project to AWS Lambda                              
        invoke-function         Command to invoke a function in Lambda with an optional input            
        list-functions          Command to list all your Lambda functions                                
        delete-function         Command to delete an AWS Lambda function                                 
        get-function-config     Command to get the current runtime configuration for a Lambda function   
        update-function-config  Command to update the runtime configuration for a Lambda function        
                                                                                                        
                                                                                                        
Commands to deploy and manage AWS Serverless applications using AWS CloudFormation:                      
                                                                                                        
        deploy-serverless       Command to deploy an AWS Serverless application                          
        list-serverless         Command to list all your AWS Serverless applications                     
        delete-serverless       Command to delete an AWS Serverless application                          
                                                                                                        
                                                                                                        
Other Commands:                                                                                          
                                                                                                        
        package                 Command to package a Lambda project into a zip file ready for deployment
                                                                                                        
                                                                                                        
To get help on individual commands execute:                                                              
        dotnet lambda help <command>  

By using the dotnet lambda command you have access to a collection of commands to manage Lambda functions and serverless applications. There is also a package command that packages your project into a .zip file, ready for deployment. This can be useful for CI systems.
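
For example, a CI build step could produce the deployment package without touching Lambda at all (the output path here is just an illustration; run dotnet lambda help package to see the exact options for your version of the tooling):

dotnet lambda package --configuration Release --framework netcoreapp1.0 --output-package ./artifacts/BlogExample.zip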

To see help for an individual command, type dotnet lambda help followed by the command name; for example, dotnet lambda help deploy-function.

C:\BlogContent\BlogExample\BlogExample> dotnet lambda help deploy-function
AWS Lambda Tools for .NET Core functions
Project Home: https://github.com/aws/aws-lambda-dotnet

deploy-function:
   Command to deploy the project to AWS Lambda

   dotnet lambda deploy-function [arguments] [options]
   Arguments:
      <FUNCTION-NAME> The name of the function to deploy
   Options:
     --region                                The region to connect to AWS services, if not set region will be detected from the environment (Default Value: us-east-2)
      --profile                               Profile to use to look up AWS credentials, if not set environment credentials will be used (Default Value: normj+vpc)
     --profile-location                      Optional override to the search location for Profiles, points at a shared credentials file
      -pl    | --project-location             The location of the project, if not set the current directory will be assumed
      -cfg   | --config-file                  Configuration file storing default values for command line arguments. Default is aws-lambda-tools-defaults.json
      -c     | --configuration                Configuration to build with, for example Release or Debug (Default Value: Release)
      -f     | --framework                    Target framework to compile, for example netcoreapp1.0 (Default Value: netcoreapp1.0)
      -pac   | --package                      Application package to use for deployment, skips building the project
      -fn    | --function-name                AWS Lambda function name
      -fd    | --function-description         AWS Lambda function description
      -fp    | --function-publish             Publish a new version as an atomic operation
      -fh    | --function-handler             Handler for the function <assembly>::<type>::<method> (Default Value: BlogExample::BlogExample.Function::FunctionHandler)
      -fms   | --function-memory-size         The amount of memory, in MB, your Lambda function is given (Default Value: 256)
      -frole | --function-role                The IAM role that Lambda assumes when it executes your function
      -ft    | --function-timeout             The function execution timeout in seconds (Default Value: 30)
      -frun  | --function-runtime             The runtime environment for the Lambda function (Default Value: dotnetcore1.0)
      -fsub  | --function-subnets             Comma delimited list of subnet ids if your function references resources in a VPC
      -fsec  | --function-security-groups     Comma delimited list of security group ids if your function references resources in a VPC
      -dlta  | --dead-letter-target-arn       Target ARN of an SNS topic or SQS Queue for the Dead Letter Queue
      -ev    | --environment-variables        Environment variables set for the function. Format is <key1>=<value1>;<key2>=<value2>
      -kk    | --kms-key                      KMS Key ARN of a customer key used to encrypt the function's environment variables
      -sb    | --s3-bucket                    S3 bucket to upload the build output
      -sp    | --s3-prefix                    S3 prefix for the build output
      -pcfg  | --persist-config-file          If true the arguments used for a successful deployment are persisted to a config file. Default config file is aws-lambda-tools-defaults.json
C:\BlogContent\BlogExample\BlogExample>

As you can see, you can set many options with this command. This is where the aws-lambda-tools-defaults.json file, which is created as part of your project, comes in handy. You can set the options in this file, which is read by the Lambda tooling by default. The project templates created in Visual Studio set many of these fields with default values.


{                                                                                   
  "profile":"default",                                                            
  "region" : "us-east-2",                                                           
  "configuration" : "Release",                                                      
  "framework" : "netcoreapp1.0",                                                    
  "function-runtime":"dotnetcore1.0",                                               
  "function-memory-size" : 256,                                                     
  "function-timeout" : 30,                                                          
  "function-handler" : "BlogExample::BlogExample.Function::FunctionHandler"         
}

When you use this aws-lambda-tools-defaults.json file, the only things the Lambda tooling still needs to deploy the function are the function name and the IAM role. You supply those with the following command:

dotnet lambda deploy-function TheFunction --function-role TestRole
C:\BlogContent\BlogExample\BlogExample> dotnet lambda deploy-function TheFunction --function-role TestRole                                                                                                  
Executing publish command                                                                                                                                                           
Deleted previous publish folder                                                                                                                                                     
... invoking 'dotnet publish', working folder 'C:\BlogContent\BlogExample\BlogExample\bin\Release\netcoreapp1.0\publish'                                                            
... publish: Microsoft (R) Build Engine version 15.1.548.43366                                                                                                                      
... publish: Copyright (C) Microsoft Corporation. All rights reserved.                                                                                                              
... publish:   BlogExample -> C:\BlogContent\BlogExample\BlogExample\bin\Release\netcoreapp1.0\BlogExample.dll                                                                      
Zipping publish folder C:\BlogContent\BlogExample\BlogExample\bin\Release\netcoreapp1.0\publish to C:\BlogContent\BlogExample\BlogExample\bin\Release\netcoreapp1.0\BlogExample.zip
... zipping: Amazon.Lambda.Core.dll                                                                                                                                                 
... zipping: Amazon.Lambda.Serialization.Json.dll                                                                                                                                   
... zipping: BlogExample.deps.json                                                                                                                                                  
... zipping: BlogExample.dll                                                                                                                                                        
... zipping: BlogExample.pdb                                                                                                                                                        
... zipping: Newtonsoft.Json.dll                                                                                                                                                    
... zipping: System.Runtime.Serialization.Primitives.dll                                                                                                                            
Creating new Lambda function TheFunction                                                                                                                                            
New Lambda function created                                                                                                                                                         
C:\BlogContent\BlogExample\BlogExample>             

You can also pass an alternative file that contains option defaults by using the --config-file option. This enables you to reuse multiple Lambda configurations.
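
For example, you could keep a separate defaults file per environment and point the tooling at it (the file name here is hypothetical):

dotnet lambda deploy-function TheFunction --function-role TestRole --config-file aws-lambda-tools-prod.json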

“dotnet publish” vs “dotnet lambda” Commands

Using the Amazon.Lambda.Tools package is the preferred way to deploy functions to Lambda from the command line, rather than running the dotnet publish command, zipping the output folder, and sending the zip file to Lambda yourself. The Lambda tooling inspects the publish folder and removes any duplicate native dependencies, which reduces the size of your Lambda function. For example, if you reference the SQL Server client NuGet package, System.Data.SqlClient, the Lambda tooling produces a package file that is about 1 MB smaller than the zipped publish folder from dotnet publish. It also reworks the layout of native dependencies to ensure that the Lambda service finds them.
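
For comparison, the manual flow that the Lambda tooling replaces looks roughly like this (shown only to illustrate what deploy-function and package do for you):

dotnet publish -c Release -f netcoreapp1.0

You would then zip the bin/Release/netcoreapp1.0/publish folder and upload the archive to Lambda yourself, without the size and native-dependency fixes described above.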

Summary

We hope the Amazon.Lambda.Tools package helps you with the transition from working in Visual Studio to working in the command line to script and automate your deployments. Let us know what you think on our GitHub repository, and what you’d like to see us add to the tooling.

AWS SDK for .NET Supports Assume Role Profiles and the Shared Credentials File

by John Vellozzi | in .NET

The AWS SDK for .NET, AWS Tools for PowerShell, and the AWS Toolkit for Visual Studio now support the use of the AWS CLI credentials file. Some of the AWS SDKs have supported shared use of the AWS CLI credentials file for some time, and we’re happy to add the SDK for .NET to that list.

For a long time, the SDK for .NET has supported reading and writing of its own credentials file. We’ve added support for new credential profile types to facilitate feature parity with the shared credentials file. The SDK for .NET and Tools for PowerShell now support reading and writing of basic, session, and assume role credential profiles to both the .NET credentials file and the shared credentials file. The .NET credentials file maintains its support for federated credential profiles.

With the new Amazon.Runtime.CredentialManagement namespace, you now have programmatic access to read and write credential profiles to the .NET credentials file and the shared credentials file. This is a new namespace, and some older classes have been deprecated. Please see the developer guide topic Configuring AWS Credentials and the API Reference for details.
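
Here's a minimal sketch of reading and writing a profile with the new namespace (profile names and keys are placeholders):

using Amazon.Runtime;
using Amazon.Runtime.CredentialManagement;

// Resolve a profile by name from the .NET credentials file and/or the shared credentials file.
var chain = new CredentialProfileStoreChain();
if (chain.TryGetAWSCredentials("my-profile", out AWSCredentials credentials))
{
    // Pass the resolved credentials to any service client constructor.
}

// Write a basic profile to the shared credentials file (~/.aws/credentials).
var options = new CredentialProfileOptions
{
    AccessKey = "access_key_id",
    SecretKey = "secret_access_key"
};
var sharedFile = new SharedCredentialsFile();
sharedFile.RegisterProfile(new CredentialProfile("my-new-profile", options));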

AWS Tools for PowerShell now enable you to read and write credential profiles to both credentials files as well. We’ve added parameters to the credentials-related cmdlets to support the new profile types and the shared credentials file. You can reference the new profiles with the -ProfileName argument in the service cmdlets. You can find more details about the changes to Tools for PowerShell in Shared Credentials in AWS Tools for PowerShell and the AWS Tools for PowerShell Cmdlet Reference.

In Visual Studio you’ll now see profiles stored in (user’s home directory)\.aws\credentials listed in the AWS Explorer. Reading is supported for all profile types and you can edit basic profiles.

What You Need to Know

In addition to the new Amazon.Runtime.CredentialManagement classes, the SDK has some internal changes. The SDK’s region resolution logic now looks for the region in the default credential profile. This is especially important for SDK for .NET applications running in Amazon EC2. The SDK for .NET determines the region for a request from:

  1. The client configuration, or what is explicitly set on the AWS service client.
  2. The AWSConfigs.RegionEndpoint property (set explicitly or in AppConfig).
  3. The AWS_REGION environment variable, if it’s non-empty.
  4. The “default” credential profile. (See “Credential Profile Resolution” below for details.)
  5. EC2 instance metadata.

Checking the “default” credential profile is a new step in the process. If your application relies on EC2 instance metadata for the region, ensure that the SDK doesn’t pick up the wrong region from one of the credentials files.
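
If your application must not pick up a region from the credentials files, the simplest safeguard is to set the region explicitly at one of the earlier steps, for example (a sketch):

using Amazon;

// Setting the region explicitly (step 2 in the list above) takes precedence over
// the "default" credential profile and EC2 instance metadata.
AWSConfigs.RegionEndpoint = RegionEndpoint.USEast2;

// Or set it per client:
// var client = new AmazonDynamoDBClient(new AmazonDynamoDBConfig { RegionEndpoint = RegionEndpoint.USEast2 });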

Although there aren’t any changes to the credentials resolution logic, it’s important to understand how credential profiles fit into that as well. The SDK for .NET will (continue to) determine the credentials to use for service requests from:

  1. The client configuration, or what is explicitly set on the AWS service client.
  2. BasicAWSCredentials that are created from the AWSAccessKey and AWSSecretKey AppConfig values, if they’re available.
  3. A search for a credentials profile with a name specified by a value in AWSConfigs.AWSProfileName (set explicitly or in AppConfig). (See “Credential Profile Resolution” below for details.)
  4. The “default” credentials profile. (See “Credential Profile Resolution” below for details.)
  5. SessionAWSCredentials that are created from the AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY, and AWS_SESSION_TOKEN environment variables, if they’re all non-empty.
  6. BasicAWSCredentials that are created from the AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY environment variables, if they’re both non-empty.
  7. EC2 instance metadata.

Credential Profile Resolution

With two different credentials file types, it's important to understand how to configure the SDK and Tools for PowerShell to use them. The AWSConfigs.AWSProfilesLocation property (set explicitly or in AppConfig) controls how the SDK finds credential profiles. The -ProfileLocation command line argument controls how Tools for PowerShell find a profile. Here's how the configuration works in both cases:

  • Profile location value is null (not set) or empty: First search the .NET credentials file* for a profile with the specified name. If the profile isn't there, search (user's home directory)\.aws\credentials. If the profile still isn't there, search (user's home directory)\.aws\config.
  • Profile location value is the path to a file in the shared credentials file format: Search only the specified file for a profile with the specified name.

*The .NET credentials file is not supported on Mac and Linux platforms, and is skipped when resolving credential profiles.

Time-to-Live Support in Amazon DynamoDB

by Pavel Safronov | in .NET

Amazon DynamoDB recently added Time-to-Live (TTL) support, a way to automatically delete expired items from your DynamoDB table. This blog post discusses this feature, how it’s exposed in the AWS SDK for .NET, and how you can take advantage of it.

Using Time-to-Live

At a high level, you configure TTL by choosing a particular attribute on a table to be treated as a timestamp. You then store an expiration time in this attribute on every item that you want to expire. A periodic process in DynamoDB checks whether an item's TTL timestamp attribute is now in the past, and then schedules the removal of that item from the table. The timestamps must be stored as epoch seconds (the number of seconds since 12:00:00 AM January 1, 1970 UTC), which you can calculate yourself or have the SDK calculate for you.

The AWS SDK for .NET has three different DynamoDB APIs, so you have three different ways to use TTL. In the following sections, we discuss these APIs and how you use the TTL feature from each of them.

Low-Level Model – Control Plane

First, the low-level model. This is a thin wrapper around the DynamoDB service operations that you use by instantiating AmazonDynamoDBClient and calling its various operations. This model provides you with the most control, but it also doesn’t have the helpful abstractions of the higher-level APIs. Using the low-level model, you can enable and disable the TTL feature and configure Time-to-Live for your data.

Here’s an example of checking the status of TTL for a table.


using (var client = new AmazonDynamoDBClient())
{
    // Retrieve TTL status
    var ttl = client.DescribeTimeToLive(new DescribeTimeToLiveRequest
    {
        TableName = "SessionData"
    }).TimeToLiveDescription;
    Console.WriteLine($"TTL status = {ttl.TimeToLiveStatus}");
    Console.WriteLine($"TTL attribute {(ttl.AttributeName == null ? "has not been set" : $"= {ttl.AttributeName}")}");

    // Enable TTL
    client.UpdateTimeToLive(new UpdateTimeToLiveRequest
    {
        TableName = "SessionData",
        TimeToLiveSpecification = new TimeToLiveSpecification
        {
            Enabled = true,
            AttributeName = "ExpirationTime"
        }
    });

    // Disable TTL
    client.UpdateTimeToLive(new UpdateTimeToLiveRequest
    {
        TableName = "SessionData",
        TimeToLiveSpecification = new TimeToLiveSpecification
        {
            Enabled = false,
            AttributeName = "ExpirationTime"
        }
    });
}

Note: There is a limit to how often you can enable or disable TTL in a given period of time. Running this sample multiple times will likely result in a ValidationException being thrown.

Low Level – Data Plane

Actually writing and reading TTL data in an item is fairly straightforward, but you are required to write epoch seconds into an AttributeValue. You can calculate the epoch seconds manually or use helper methods in AWSSDKUtils, as shown below.

Here’s an example of using the low-level API to work with TTL data.


using (var client = new AmazonDynamoDBClient())
{
    // Writing TTL attribute
    DateTime expirationTime = DateTime.Now.AddDays(7);
    Console.WriteLine($"Storing expiration time = {expirationTime}");
    int epochSeconds = AWSSDKUtils.ConvertToUnixEpochSeconds(expirationTime);
    client.PutItem("SessionData", new Dictionary<string, AttributeValue>
    {
        { "UserName", new AttributeValue { S = "user1" } },
        { "ExpirationTime", new AttributeValue { N = epochSeconds.ToString() } }
    });

    // Reading TTL attribute
    var item = client.GetItem("SessionData", new Dictionary<string, AttributeValue>
    {
        { "UserName", new AttributeValue { S = "user1" } },
    }).Item;
    string epochSecondsString = item["ExpirationTime"].N;
    epochSeconds = int.Parse(epochSecondsString);
    expirationTime = AWSSDKUtils.ConvertFromUnixEpochSeconds(epochSeconds);
    Console.WriteLine($"Stored expiration time = {expirationTime}");
}

Document Model

The Document Model provides you with Table objects that represent a DynamoDB table, and Document objects that represent a single row of data in a table. You can store primitive .NET types directly in a Document, with the required conversion to DynamoDB types happening in the background. This makes the Document Model API easier to use than the low-level model.

Using the Document Model API, you can easily configure which attributes you’d like to store as epoch seconds by setting the TableConfig.AttributesToStoreAsEpoch collection. Then you can use DateTime objects without needing to convert the data to epoch seconds manually. If you don’t specify which attributes to store as epoch seconds, then instead of writing epoch seconds in that attribute you would end up storing the DateTime as an ISO-8601 string, such as “2017-03-09T05:49:38.631Z”. In that case, DynamoDB Time-to-Live would NOT automatically delete the item. So you need to be sure to specify AttributesToStoreAsEpoch correctly when you’re creating the Table object.

Here’s an example of configuring the Table object, then writing and reading TTL items.


// Set up the Table object
var tableConfig = new TableConfig("SessionData")
{
    AttributesToStoreAsEpoch = new List<string> { "ExpirationTime" }
};
var table = Table.LoadTable(client, tableConfig);

// Write TTL data
var doc = new Document();
doc["UserName"] = "user2";

DateTime expirationTime = DateTime.Now.AddDays(7);
Console.WriteLine($"Storing expiration time = {expirationTime}");
doc["ExpirationTime"] = expirationTime;

table.PutItem(doc);

// Read TTL data
doc = table.GetItem("user2");
expirationTime = doc["ExpirationTime"].AsDateTime();
Console.WriteLine($"Stored expiration time = {expirationTime}");

Object Persistence Model

The Object Persistence Model simplifies interaction with DynamoDB even more, by enabling you to use .NET classes with DynamoDB. This interaction is done by passing objects to the DynamoDBContext, which handles all the conversion logic. Using TTL with the Object Persistence Model is just as straightforward as using it with the Document model: you simply identify the attributes to store as epoch seconds and the SDK performs the required conversions for you.

Consider the following class definition.


[DynamoDBTable("SessionData")]
public class User
{
    [DynamoDBHashKey]
    public string UserName { get; set; }

    [DynamoDBProperty(StoreAsEpoch = true)]
    public DateTime ExpirationTime { get; set; }
}

Once we've added the [DynamoDBProperty(StoreAsEpoch = true)] attribute, we can use DateTime objects with the class just as we normally would. However, we now store epoch seconds, and the items we create are eligible for TTL automatic deletion. And just as with the Document Model, if you omit StoreAsEpoch, the objects you write will contain ISO-8601 dates and won't be eligible for TTL deletion.

Here’s an example of creating the DynamoDBContext object, writing a User object, and reading it out again.


using (var context = new DynamoDBContext(client))
{
    // Writing TTL data
    DateTime expirationTime = DateTime.Now.AddDays(7);
    Console.WriteLine($"Storing expiration time = {expirationTime}");

    var user = new User
    {
        UserName = "user3",
        ExpirationTime = expirationTime
    };
    context.Save(user);

    // Reading TTL data
    user = context.Load<User>("user3");
    expirationTime = user.ExpirationTime;
    Console.WriteLine($"Stored expiration time = {expirationTime}");
}

Conclusion

In this blog post, we showed how you can toggle the new Time-to-Live feature for a table, and we showed multiple ways to work with TTL data. Which approach you choose is up to you; hopefully these examples make it easy to schedule your data for automatic deletion. Happy coding!

Client Constructors Now Deprecated in the AWS SDK for Java

by Kyle Thomson | in Java

A couple of weeks ago, you might have noticed that version 1.11.84 of the AWS SDK for Java included several deprecations, the most notable being the deprecation of the client constructors.

Historically, you’ve been able to create a service client as shown here.

AmazonSNS sns = new AmazonSNSClient();

This mechanism is now deprecated in favor of using one of the builders to create the client as shown here.

AmazonSNS sns = AmazonSNSClient.builder().build();

The client builders (described in detail in this post) are superior to the basic constructors in the following ways.

Immutable

Clients created via the builder are immutable. The region/endpoint (and other data) can’t be changed. Therefore, clients are safe to reuse across multiple threads.

Explicit Region

At build time, the AWS SDK for Java can validate that a client has all the required information to function correctly – namely, a region. A client created via the builders must have a region that is defined either explicitly (i.e. by calling withRegion) or as part of the DefaultAwsRegionProviderChain. If the builder can’t determine the region for a client, an SdkClientException is thrown. Region is an important concept when communicating with services in AWS. It not only determines where your request will go, but also how it is signed. Requiring a region means the SDK can behave predictably without depending on hidden defaults.

Cleaner

Using the builder allows a client to be constructed in a single statement using method chaining.

AmazonSNS sns = AmazonSNSClient.builder()
						.withRegion("us-west-1")
						.withClientConfiguration(cfg)
						.withCredentials(creds)
						.build();

The deprecated constructors are no longer created for new service clients. They will be removed from existing clients in a future major version bump (although they’ll remain in all future releases of the 1.x family of the AWS SDK for Java).

AWS Toolkit for Eclipse: Support for Creating Maven Projects for AWS, Lambda, and Serverless Applications

by Zhaoxi Zhang | in Java

I'm glad to announce that you can now use the AWS Toolkit for Eclipse to create Maven projects for AWS, Lambda, and serverless applications. If you're new to using the AWS Toolkit for Eclipse to create a Lambda application, see the Lambda plugin for more information. If you're not familiar with serverless applications, see the Serverless Application Model for more information. If you've been using the AWS Toolkit for Eclipse, you'll notice the extra Maven configuration panel in the user interface where you create a new AWS, Lambda, or serverless application (see the following screenshots).

The AWS Toolkit for Eclipse no longer downloads the archived AWS SDK for Java ZIP file automatically and adds it to the class path of your AWS application. Instead, it manages the dependencies with Maven: it checks the remote Maven repository for the latest AWS SDK for Java version and downloads it automatically if it isn't already in your local Maven repository. This means that when a new version of the AWS SDK for Java is released, it can take a while to download before you can create the new application.

Create a Maven Project for an AWS application

In the Eclipse toolbar, choose the AWS icon drop-down button, and then choose New AWS Project. You'll see the following page, where you can configure the AWS SDK for Java samples you want to include in your application.

[Screenshot: Sample]

Here is the structure of the newly created AWS application Java project. You can edit the pom.xml file later to meet your needs to build, test, and deploy your application with Maven.

[Screenshot: SampleStructure]

Create a Maven Project for a Lambda Application

Similar to creating a new AWS application project, you can create a new AWS Lambda project. In the Eclipse toolbar, choose the AWS icon drop-down button, and then choose New AWS Lambda Java Project.

[Screenshot: Lambda]

Here is the structure of the newly created AWS Lambda Java project.

[Screenshot: LambdaStructure]

Create a Maven Project for a Serverless Application

To create a new serverless application, choose the AWS icon drop-down button, and then choose New AWS Serverless Project. The following screenshot shows the project creation status while Maven downloads the application dependencies.

[Screenshot: CreatingServerless]

Here is the structure of the newly created serverless application Java project.

[Screenshot: ArticleStructure]

Build a Serverless Application Locally with Maven

You can also use the Maven command line in a terminal to build and test the project you just created, as shown in the following screenshot.

[Screenshot: MavenCommandLine]

Please let us know what you think of the new Maven support in the AWS Toolkit for Eclipse. We appreciate your comments.

Preview of the AWS Toolkit for Visual Studio 2017

by Norm Johanson | in .NET

Today we released a preview of our AWS Toolkit for Visual Studio that includes support for the release candidate (RC) version of Visual Studio 2017. Because this Visual Studio release contains some significant changes for extension developers, we’re making this preview available in advance of the formal release. We highly encourage you to pass along feedback about any issues you find or whether you were successful using the preview by adding to the GitHub issue we describe below.

AWS Lambda .NET Core Support

Visual Studio 2017 also contains support for the new MSBuild project system for .NET Core projects. With this preview of the toolkit, we've updated the .NET Core Lambda support to use the new build system. For existing Lambda projects that use the Visual Studio 2015 project.json build system, Visual Studio 2017 offers to migrate them to the new build system when you open them.

Downloading and Installing the Preview

The AWS Toolkit for Visual Studio contains MSBuild target files for AWS CloudFormation projects. In previous releases of Visual Studio, you had to install these files and dependencies for an extension outside of the Visual Studio folder hierarchy. This required us to use a Windows Installer to install the toolkit. Starting with Visual Studio 2017, these MSBuild extensions exist within the Visual Studio folder hierarchy and can be installed from the VSIX extension package without an installer. As the installer technology we use doesn’t yet support Visual Studio 2017, we’ve decided to distribute the preview as a VSIX file only.

To install the preview

  1. Download the VSIX file from the preview link.
  2. Double-click the VSIX file. This launches the VSIX Installer process.

After the installer finishes, the toolkit functions as it has in previous versions of Visual Studio.

Feedback

To track issues with installing and using the toolkit in Visual Studio 2017, we opened the following GitHub issue in our AWS SDK for .NET repository. For any issues or comments about our Visual Studio 2017 support, please add to the issue. Also, by letting us know if everything is working as you expected, you help us evaluate the readiness of our support for Visual Studio 2017.

Assume AWS IAM Roles with MFA Using the AWS SDK for Go

by Jason Del Ponte | in Go

AWS SDK for Go v1.7.0 added support for assuming AWS Identity and Access Management (IAM) roles with Multi-Factor Authentication (MFA). With minimal setup and configuration, your applications can now let users assume IAM roles with MFA token codes.

IAM roles enable you to manage granular permissions for a specific role or task, instead of applying those permissions directly to users and groups. Roles create a layer of separation, decoupling resource permissions from users, groups, and other accounts. With IAM roles, you can give third-party AWS accounts access to your resources without having to create additional users for them in your AWS account.

Assuming IAM roles with MFA is a pattern for roles that are assumed by applications driven directly by users, rather than by automated systems such as services. You can require that users assuming your role specify an MFA token code each time the role is assumed. The AWS SDK for Go now makes this easier to support in your Go applications.

Setting Up an IAM Role and User for MFA

To take advantage of this feature, enable MFA for your users and IAM roles. IAM supports two categories of MFA: Security Token and SMS Text Message. The SDK's MFA support uses the Security Token method. In this category, there are two types of security token devices, Hardware MFA devices and Virtual MFA devices, and the AWS SDK for Go supports both equally.

For a user to assume an IAM role with MFA, an MFA device must be linked with that user. You can assign one in the IAM console on the Security credentials tab of a user's details, using the Assigned MFA device field. Only one MFA device can be assigned per user.

You can also configure IAM roles to require users who assume them to do so with an MFA token. This is done in the Trust Relationship section of a role's details, using the aws:MultiFactorAuthPresent condition key to require that any user who assumes the role does so with an MFA token.

The following is an example of a Trust Relationship that enables this restriction.

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": {
        "AWS": "arn:aws:iam::<account>:root"
      },
      "Action": "sts:AssumeRole",
      "Condition": {
        "Bool": {
          "aws:MultiFactorAuthPresent": "true"
        }
      }
    }
  ]
}

Assuming a Role with SDK Session

A common practice when using the AWS SDK for Go is to specify credentials and configuration in files such as the shared configuration file (~/.aws/config) and the shared credentials file (~/.aws/credentials). The SDK's session package makes using these configurations easy, and automatically configures service clients based on them. You can enable the SDK's support for assuming a role and for the shared configuration file by setting the environment variable AWS_SDK_LOAD_CONFIG=1, or the session option SharedConfigState to SharedConfigEnable.

To configure your configuration profile to assume an IAM role with MFA, you need to specify the MFA device’s serial number for a Hardware MFA device, or ARN for a Virtual MFA device (mfa_serial). This is in addition to specifying the role’s ARN (role_arn) in your SDK shared configuration file.

The following example profile instructs the SDK to assume a role and requires the user to provide an MFA token to assume the role. The SDK uses the source_profile field to look up another profile in the configuration file that can specify the credentials, and region with which to make the AWS Security Token Service (STS) Assume Role API operation call.

The SDK supports assuming an IAM role with and without MFA. To assume a role without MFA, don’t provide the mfa_serial field.

[profile assume_role_profile]
role_arn = arn:aws:iam::<account_number>:role/<role_name>
source_profile = other_profile
mfa_serial = <hardware device serial number or virtual device arn>

See the SDK’s session package documentation for more details about configuring the shared configuration files.

After you've updated your shared configuration file, you can update your application's Sessions to specify how the MFA token code is retrieved from your application's users. If a shared configuration profile specifies a role to assume and provides the mfa_serial field, the SDK requires that the AssumeRoleTokenProvider session option is also set. There's no harm in always setting AssumeRoleTokenProvider for applications that are always run by a person. The option is only used if the shared configuration profile both specifies a role to assume and sets the mfa_serial field; otherwise, it's ignored.

The SDK doesn’t automatically set the AssumeRoleTokenProvider with a default value. This is because of the risk of halting an application unexpectedly while the token provider waits for a nonexistent user to provide a value due to a configuration change. You must set this value to use MFA roles with the SDK.

The SDK implements a simple token provider in the stscreds package, StdinTokenProvider. This function prompts on stdin for an MFA token, and waits forever until one is provided. You can also easily implement a custom token provider by satisfying the func() (string, error) signature. The returned string is the MFA token, and the error is any error that occurred while retrieving the token.

// Enable SDK's Shared Config support.
sess := session.Must(session.NewSessionWithOptions(session.Options{
    AssumeRoleTokenProvider: stscreds.StdinTokenProvider,
    SharedConfigState: session.SharedConfigEnable,
}))

// Use the session to create service clients and make API operation calls.
svc := s3.New(sess)
svc.PutObject(...)

Configuring the Assume Role Credentials Provider Directly

In addition to being able to create a Session configured to assume an IAM role, you can also create a credential provider to assume a role directly. This is helpful when the role’s configuration isn’t stored in the shared configuration files.

Creating the credential provider is similar to configuring a Session, but you don't need to enable the session's shared configuration option. You can also use this approach to configure individual service clients to use the assumed role directly, instead of via the shared Session. This is helpful when you want to share base configuration across multiple service clients via the Session and use roles only for select tasks.

// Initial credentials loaded from SDK's default credential chain, such as
// the environment, shared credentials (~/.aws/credentials), or EC2 Instance
// Role. These credentials are used to make the AWS STS Assume Role API.
sess := session.Must(session.NewSession())

// Create the credentials from AssumeRoleProvider to assume the role
// referenced by the "myRoleARN" ARN. Prompt for MFA token from stdin.
creds := stscreds.NewCredentials(sess, "myRoleArn", func(p *stscreds.AssumeRoleProvider) {
    p.SerialNumber = aws.String("myTokenSerialNumber")
    p.TokenProvider = stscreds.StdinTokenProvider
})

// Create an Amazon SQS service client with the Session's default configuration.
sqsSvc := sqs.New(sess)

// Create service client configured for credentials from the assumed role.
s3Svc := s3.New(sess, &aws.Config{Credentials: creds})

Feedback

We’re always looking for more feedback. We added this feature as a direct result of feedback and requests we received. If you have any ideas that you think would be good improvements or additions to the AWS SDK for Go, please let us know.

Chalice Version 0.6.0 is Now Available

by James Saryerwinnie | in Python

The latest preview version of Chalice, our microframework for Python serverless application development, now includes a couple of commonly requested features:

  • Customizing the HTTP response. A new Response class, chalice.Response, enables you to customize the HTTP response by specifying the status code, body, and a mapping of HTTP headers to return. The tutorial in the chalice documentation shows how to use this new functionality to return a non-JSON response to the user.
  • Vendoring binary packages. You can create a top-level vendor/ directory in your application source directory. This vendor directory is automatically included in the AWS Lambda deployment package when you deploy your application. You can use this feature for any private Python packages that can't be specified in your requirements.txt file, as well as any binary content that includes Python packages with C extensions. For more information, see the packaging docs. An example layout appears after this list.
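
A project that vendors a private package might use a layout like this (hypothetical names):

helloworld/
    app.py
    requirements.txt
    vendor/
        my_private_package/
            __init__.py
            _native.so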

Let’s look at the first feature in more detail.

Customizing the HTTP Response

The following example shows a view function that returns a plain text response to the user.

from chalice import Chalice, Response

app = Chalice(app_name='helloworld')

@app.route('/')
def hello_world():
    return Response(
        status_code=200,
        body='hello world',
        headers={'Content-Type': 'text/plain'})

The existing default behavior of returning a JSON response is still preserved. To return a JSON response, you can just return the equivalent Python value directly from your view function.

from chalice import Chalice, Response

app = Chalice(app_name='helloworld')

@app.route('/')
def hello_world():
    return {'hello': 'world'}

You can also use the chalice.Response class to return HTTP redirects to users. In this view function, we accept a URL in the request body and generate a redirect to that URL:

from chalice import Chalice, Response

app = Chalice(app_name='redirect')

@app.route('/redirect', content_types=['text/plain'])
def hello_world():
    url = app.current_request.raw_body.strip()
    return Response(
        status_code=301,
        body='',
        headers={'Location': url})

See the 0.6.0 upgrade notes for more information.

Try out the latest version of Chalice today and let us know what you think. You can chat with us on our Gitter channel and file feature requests on our GitHub repo. We look forward to your feedback and suggestions.

Cross-Platform Text-to-Speech for C++ with Amazon Polly

by Jonathan Henson | in C++

Amazon Polly launched at re:Invent 2016. Because C++ gives us direct access to sound drivers, we decided to try using Amazon Polly for cross-platform text-to-speech applications. The result of our experiment is the new text-to-speech library for the AWS SDK for C++.

Let's look at some code examples.

List available output devices


#include <aws/core/Aws.h>
#include <aws/text-to-speech/TextToSpeechManager.h>
#include <iostream>

using namespace Aws::Polly;
using namespace Aws::TextToSpeech;

static const char* ALLOCATION_TAG = "PollySample::Main";

int main()
{
	Aws::SDKOptions options;
	Aws::InitAPI(options);
	{
	    auto client = Aws::MakeShared<PollyClient>(ALLOCATION_TAG);
	    TextToSpeechManager manager(client);
		
        std::cout << "available devices are: " << std::endl;
		auto devices = manager.EnumerateDevices();

	    for (auto& device : devices)
	    {
		    std::cout << "[" << device.first.deviceId << "] " << device.first.deviceName << "   Driver: "
			    << device.second->GetName() << std::endl;
	    }
	}
	Aws::ShutdownAPI(options);
	return 0;
}


Here, the manager lists all output devices and drivers that are installed by default on your system. Then you can iterate those devices and select the best output device for your application.

List available voices


#include <aws/core/Aws.h>
#include <aws/text-to-speech/TextToSpeechManager.h>
#include <iostream>

using namespace Aws::Polly;
using namespace Aws::TextToSpeech;

static const char* ALLOCATION_TAG = "PollySample::Main";

int main()
{
	Aws::SDKOptions options;
	Aws::InitAPI(options);
	{
	    auto client = Aws::MakeShared<PollyClient>(ALLOCATION_TAG);
	    TextToSpeechManager manager(client);
		
        std::cout << "available voices are: " << std::endl;
	    for (auto& voice : manager.ListAvailableVoices())
	    {
		    std::cout << voice.first << "    language: " << voice.second << std::endl;
	    }
	}
	Aws::ShutdownAPI(options);
	return 0;
}


In this example, the manager retrieves all available voices from Amazon Polly and lists them on standard output.

Finally, after we've selected an audio output device and a voice, we can send text to Amazon Polly. The synthesized speech is played directly on our audio output.


#include <aws/core/Aws.h>
#include <aws/text-to-speech/TextToSpeechManager.h>
#include <iostream>

using namespace Aws::Polly;
using namespace Aws::TextToSpeech;

static const char* ALLOCATION_TAG = "PollySample::Main";

int main()
{
	Aws::SDKOptions options;
	Aws::InitAPI(options);
	{
	    auto client = Aws::MakeShared<PollyClient>(ALLOCATION_TAG);
	    TextToSpeechManager manager(client);
		
		//iterate devices and select the device and capabilities you'd like to play to
		//...
		manager.SetActiveDevice(device, deviceInfo, capability);
		
		//iterate voices and select the one you wish to use.
		//...
		manager.SetActiveVoice(selectedVoice);
		
		//this is a callback for handling the result since SendTextToOutputDevice is an 
		//asynchronous operation.
        SendTextCompletedHandler handler;
		manager.SendTextToOutputDevice("Hello World", handler);
	}
	Aws::ShutdownAPI(options);
	return 0;
}

We’ve also created an Amazon Polly sample console application to demonstrate how to use this API.

Platform Support

We’ve provided default implementations for various platforms.

  • On Windows, we use the WaveForm Audio API. This should work for both desktop and mobile Windows applications.
  • For most POSIX systems, we've provided a PulseAudio implementation. To use this in your builds, you need to install the header files for PulseAudio. Also be sure your deployment targets have a PulseAudio server installed and configured. The development packages can most likely be installed via apt-get install libpulse-dev or yum install pulseaudio-libs-devel.
  • On Apple platforms, we’ve integrated with the Core Audio frameworks. This works out of the box for OSX and iOS devices.

Of course, we’ve also provided a way for you to use your own audio driver implementations. All you need to do is pass your own implementation of Aws::TextToSpeech::PCMOutputDriverFactory to the constructor for Aws::TextToSpeech::TextToSpeechManager.

We’re really excited to see what kinds of innovative applications our users will apply this to. Currently, we’ve only provided the capability to use raw audio. Depending on the use cases we see customers implement, we’ll likely go back and add MP3 and OGG Vorbis support. Please let us know how you’re using this text-to-speech library now and how you would like to use it!