Tag: Visual Studio


General Availability of the AWS Toolkit for Visual Studio 2017

We’re pleased to announce that the AWS Toolkit for Visual Studio is now generally available (GA) for Visual Studio 2017. You can install it from the Visual Studio Gallery. The GA version is 1.12.1.0.

Unlike previous versions of the toolkit, versions that target Visual Studio 2017 and later are distributed through the Visual Studio Gallery. New features in the Visual Studio installation experience provide setup capabilities that previously required a Windows Installer. No more! Now you can use the familiar Extensions and Updates dialog box in the IDE to obtain updates.

For users who require the toolkit in earlier versions of Visual Studio (2013 and 2015), or the awsdeploy.exe standalone deployment tool, we’ll continue to maintain our current Windows Installer. In addition to the toolkit and deployment tool, this installer contains the .NET 3.5 and .NET 4.5 AWS SDK for .NET assemblies. It also contains the AWS Tools for Windows PowerShell module. If you have PowerShell version 5, you can also install the AWSPowerShell module from the PowerShell Gallery.

Thanks to everyone who tried out the preview versions of the AWS Toolkit for Visual Studio 2017 and for providing feedback!

Using AWS CodeCommit with Visual Studio Team Explorer

We recently announced support for new features in the AWS Toolkit for Visual Studio that make working with AWS CodeCommit repositories easy and convenient from within Visual Studio Team Explorer. In this post, we take a look at getting started with setting up credentials, and then how to create and clone repositories from within Team Explorer.

Credential types for AWS CodeCommit

If you’re an existing user of the AWS Toolkit for Visual Studio, you’re familiar with setting up AWS credential profiles that contain your access and secret keys. These credential profiles are used in the Toolkit for Visual Studio to enable the Toolkit to call service APIs on your behalf, for example, to list your Amazon S3 buckets in AWS Explorer or to launch an Amazon EC2 instance. The integration of AWS CodeCommit with Team Explorer also uses these credential profiles. However, to work with Git itself we need additional credentials, specifically, Git credentials for HTTPS connections. You can read about these kinds of credentials (a user name and password) at Setup for HTTPS Users Using Git Credentials in the AWS CodeCommit user guide.

You can create the Git credentials for AWS CodeCommit only for Identity and Access Management (IAM) user accounts. You cannot create them for a root account. You can create up to two sets of these credentials for the service and, although you can mark a set of credentials as inactive, inactive sets still count toward your limit of two sets. Note that you can delete and recreate credentials at any time. When you use AWS CodeCommit from within Visual Studio, your traditional AWS credentials are used for working with the service itself, for example, when you’re creating and listing repositories. When working with the actual Git repositories hosted in AWS CodeCommit, you use the Git credentials.

As part of the support for AWS CodeCommit, we’ve extended the Toolkit for Visual Studio to automatically create and manage these Git credentials for you and associate them with your AWS credential profile. That way, you don’t need to worry about having the right set of credentials at hand to perform Git operations within Team Explorer. Once you connect to Team Explorer with your AWS credential profile, the associated Git credentials are used automatically whenever you work with a Git remote.

Later in this post we’ll go over how and when to set up the Git credentials that you need. Just remember that you have to use an IAM user account (which we strongly recommend you do anyway).

Connecting to AWS CodeCommit

When you open the Team Explorer window in Visual Studio 2015 or later, you’ll see a new entry in the Hosted Service Providers section of Manage Connections, as shown.

Choosing Sign up opens the AWS home page in a browser window. What happens when you choose Connect depends on whether the Toolkit for Visual Studio can find a credential profile with AWS access and secret keys to enable it to make calls to AWS on your behalf. You might have set up a credential profile by using the new Getting Started page that displays in the IDE when the Toolkit cannot find any locally stored credentials. Or you might have been using our Toolkit, the AWS Tools for PowerShell, or the AWS CLI and already have AWS credential profiles available for the Toolkit to use.

When you choose Connect, the toolkit starts the process to find a credential profile to use in the connection. If the Toolkit can’t find a credential profile, it opens a dialog box that invites you to enter the access and secret keys for your AWS account. We strongly recommend that you use an IAM user account, and not your root credentials. In addition, as noted earlier, the Git credentials you will eventually need can only be created for IAM users. Once the access and secret keys are provided and the credential profile is created, the connection between Team Explorer and AWS CodeCommit is ready for use.

If the Toolkit finds more than one AWS credential profile, you’re prompted to select the account you want to use within Team Explorer, as shown.

If you have only one credential profile, the toolkit bypasses the profile selection dialog box and you’re connected immediately.

When a connection is established between Team Explorer and AWS CodeCommit via your credential profiles, the invitation dialog box closes and the connection panel is displayed, as shown below.

Because we have no repositories cloned locally, the panel shows just the operations we can perform: Clone, Create, and Sign out. Like other providers, AWS CodeCommit in Team Explorer can be bound to only a single AWS credential profile at any given time. To switch accounts, you use Sign out to remove the connection so you can start a new connection using a different account. We’ll see how this panel expands to display our local AWS CodeCommit repositories later in the post.

Now that we have established a connection, we can create a repository by clicking the Create link.

Creating a repository

When we click the Create link, the Create a New AWS CodeCommit Repository dialog box opens.

AWS CodeCommit repositories are organized by region, so in Region we can select the region in which to host the repository. The list has all the regions in which AWS CodeCommit is supported. We provide the Name (required) and Description (optional) for our new repository.

The default behavior of the dialog box is to suffix the folder location for the new repository with the repository name (as you enter the name, the folder location also updates). To use a different folder name, edit the Clone into folder path after you finish entering the repository name.

You can also elect to automatically create an initial .gitignore file for the repository. The AWS Toolkit for Visual Studio provides a built-in default for Visual Studio file types. Or you can choose to have no file or to use a custom existing file that you would like to reuse across repositories. Simply select Use custom in the list and navigate to the custom file to use.

Once we have a repository name and location, we’re ready to click OK and start creating the repository. The Toolkit requests that the service create the repository and then clone the new repository locally, adding an initial commit for the .gitignore file, if we’re using one. It’s at this point that we start working with the Git remote, so the Toolkit now needs access to the Git credentials we described earlier.

Setting up Git credentials

Until now we’ve been using AWS access and secret keys to request that the service create our repository. Now we need to work with Git itself to do the actual clone operation, and Git doesn’t understand AWS access and secret keys. Instead, we need to supply the user name and password credentials to Git to use on an HTTPS connection with the remote.

As we said earlier, the Git credentials we’re going to use must be associated with an IAM user. You cannot generate them for root AWS credentials (this is another reason why we recommend you set up your AWS credential profiles to contain IAM user access and secret keys, and not root keys). The Toolkit can attempt to set up Git credentials for AWS CodeCommit for you, and associate them with the AWS credential profile that we used to connect in Team Explorer earlier. Let’s take a look at the process.

When you choose OK in the Create a New AWS CodeCommit Repository dialog box and successfully create the repository, the Toolkit checks the AWS credential profile that is connected in Team Explorer to determine if Git credentials for AWS CodeCommit exist and are associated locally with the profile. If so, the Toolkit instructs Team Explorer to commence the clone operation on the new repository. If Git credentials are not available locally, the Toolkit checks the type of account credentials that were used in the connection in Team Explorer. If the credentials are for an IAM user, as we recommend, the following message is shown.

If the credentials are root credentials, the following message is shown instead.

In both cases, the Toolkit offers to attempt to do the work to create the necessary Git credentials for you. In the first scenario, all it needs to create is a set of Git credentials for the IAM user. When a root account is in use, the Toolkit first attempts to create an IAM user and then proceeds to create Git credentials for that new user. If the Toolkit has to create a new user, it applies the AWS CodeCommit Power User managed policy to that new user account. This policy allows access to AWS CodeCommit (and nothing else) and enables all operations to be performed with AWS CodeCommit except for repository deletion.
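To picture what that policy grants, here is a simplified, illustrative policy document. It is not the actual text of the AWS CodeCommit Power User managed policy; it only sketches the effect described above (full CodeCommit access except repository deletion):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "codecommit:*",
      "Resource": "*"
    },
    {
      "Effect": "Deny",
      "Action": "codecommit:DeleteRepository",
      "Resource": "*"
    }
  ]
}
```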

When you’re creating credentials, you can only view them once. Therefore, the toolkit prompts you to save the newly created credentials (as a .csv file) before continuing.

You won’t be surprised to learn that this is something we also strongly recommend (and be sure to save them to a secure location)!

There might be cases where the Toolkit can’t automatically create credentials. For example, you may already have created the maximum number of sets of Git credentials for AWS CodeCommit (two), or you might not have sufficient programmatic rights for the Toolkit to do the work for you (if you’re signed in as an IAM user). In these cases, you can log into the AWS Management Console to manage the credentials or obtain them from your administrator. You can then enter them in the Git Credentials for AWS CodeCommit dialog box, which the Toolkit displays.

Now that the credentials for Git are available, the clone operation for the new repository proceeds (see progress indication for the operation inside Team Explorer). If you elected to have a default .gitignore file applied, it is committed to the repository with a comment of ‘Initial Commit’.

That’s all there is to setting up credentials and creating a repository within Team Explorer. Once the required credentials are in place, all you see when creating new repositories in the future is the Create a New AWS CodeCommit Repository dialog itself. Now let’s look at cloning an existing repository.

Cloning a repository

To clone a repository, we return to the connection panel for AWS CodeCommit in Team Explorer. We click the Clone link to open the Clone AWS CodeCommit Repository dialog box, and then select the repository to clone and the location on disk where we want to place it.

Once we choose the region, the Toolkit queries the service to discover the repositories that are available in that region and displays them in the central list portion of the dialog box. The name and optional description of each repository are also displayed. You can reorder the list to sort it by either repository name or the last modified date, and to sort each in ascending or descending order.

Once we select our repository we can choose the location to clone to. This defaults to the same repository location used in other plugins to Team Explorer, but you can browse to or enter any other location. By default, the repository name is suffixed onto the selected path. However, if you want a specific path, simply edit the text box after you select the folder. Whatever text is in the box when you click OK will be the folder in which the cloned repository will be found.

Having selected the repository and a folder location, we then click OK to proceed with the clone operation. Just as with creating a repository, you can see the progress of the clone operation reported in Team Explorer.

Working with repositories

When you clone or create repositories, notice that the set of local repositories for the connection is listed in the connection panel in Team Explorer under the operation links. These entries give you a convenient way to access the repository to browse content. Simply right-click the repository and choose Browse in Console.

You can also use Update Git Credentials to update the stored Git credentials associated with the credential profile. This is useful if you’ve rotated the credentials. The command will display the Git Credentials for AWS CodeCommit dialog box we noted earlier for you to enter or import the new credentials.

Git operations on the repositories work as you’d expect. You can make local commits and, when you are ready to share, you use the Sync option in Team Explorer. Because the Git credentials are already stored locally and associated with our connected AWS credential profile, we won’t be prompted to supply them again for operations against the AWS CodeCommit remote.

Wrap

We hope you found this post useful in detailing how to manage credentials for AWS CodeCommit inside Team Explorer and using them to create and clone repositories within the IDE!

Code Analyzers Added to AWS SDK for .NET

One of the most exciting Microsoft Visual Studio 2015 features is the ability to have static analysis run on your code as you write it. This makes it possible to flag code that is syntactically correct but will cause errors when run.

We have added static analyzers to the latest AWS SDK NuGet packages for each of the version 3 service packages. The analyzers check the values set on SDK classes to make sure they are valid. For example, for a property that takes a string, the analyzer verifies that the string meets the minimum and maximum length constraints. An analyzer will also run a regular expression to make sure the value matches the required pattern.

Let’s say I wanted to create an Amazon DynamoDB table. Table names must be at least three characters and cannot contain characters like @ or #. So if I tried to create a table with the name of "@work", the service would fail the request. The analyzers will detect the issue, display an alert in the code editor, and put warnings in the error list before I even attempt to call the service.
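The kind of constraint the analyzer enforces can be sketched as a simple predicate. `isValidTableName` below is a hypothetical stand-in for the rule (DynamoDB table names are 3 to 255 characters drawn from letters, digits, underscore, hyphen, and dot), not the analyzer's actual implementation:

```javascript
// Hypothetical sketch of the length and pattern check applied to
// DynamoDB table names: 3-255 characters, limited to A-Z, a-z, 0-9, _, -, .
function isValidTableName(name) {
  return /^[A-Za-z0-9_.-]{3,255}$/.test(name);
}

// "@work" violates the pattern, so the analyzer would warn
// before any request is sent to the service.
```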

Setup

The analyzers are set up in your project when you add the NuGet reference. To see the installed analyzers, go to the project properties, choose Code Analysis, and then choose the Open button.

The code analyzers can also be disabled here.

Feedback

We hope this is just the start of what we can do with the code analysis features in Visual Studio. If you can suggest other common pitfalls that these analyzers could help you avoid, let us know. If you have other ideas or feedback, open an issue in our GitHub repository.

AWS Lambda Support in Visual Studio

Today we released version 1.9.0 of the AWS Toolkit for Visual Studio with support for AWS Lambda. AWS Lambda is a new compute service in preview that runs your code in response to events and automatically manages the compute resources for you, making it easy to build applications that respond quickly to new information.

Lambda functions are written in Node.js. To help Visual Studio developers, we have integrated with the Node.js Tools for Visual Studio plugin, which you can download here. Once the Node.js plugin and the latest AWS Toolkit are installed, it is easy to develop and debug locally and then deploy to AWS Lambda when you are ready. Let’s walk through the process of developing and deploying a Lambda function.

Setting up the project

To get started, we need to create a new project. There is a new AWS Lambda project template in the Visual Studio New Project dialog.

The Lambda project wizard has three ways to get started. The first option is to create a simple project that contains just the bare necessities to get started developing and testing. The second option allows you to pull down the source of a function that was already deployed. The last option allows you to create a project from a sample. For this walkthrough, select the "Thumbnail Creator" sample and choose Finish.

Once this function is deployed, it will get called when images are uploaded to an S3 bucket. The function will then resize the image into a thumbnail, and will upload the thumbnail to another bucket. The destination bucket for the thumbnail will be the same name as the bucket containing the original image plus a "-thumbnails" suffix.
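That naming convention can be expressed as a one-line helper; `thumbnailBucketFor` is a hypothetical name for illustration, not part of the sample's code:

```javascript
// Destination bucket = source bucket name + "-thumbnails",
// per the sample's convention described above.
function thumbnailBucketFor(sourceBucket) {
  return sourceBucket + "-thumbnails";
}
```

So an image uploaded to a bucket named photos would have its thumbnail written to photos-thumbnails.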

The project will be set up containing three files and the dependent Node.js packages. This sample also has a dependency on the ImageMagick CLI, which you can download from http://www.imagemagick.org/. Lambda has ImageMagick pre-configured on the compute instances that will be running the Lambda function.

Let’s take a look at the files added to the project.

app.js: Defines the function that Lambda will invoke when it receives events.
_sampleEvent.json: An example of what an event coming from Amazon S3 looks like.
_testdriver.js: Utility code for executing the Lambda function locally. It reads in the _sampleEvent.json file and passes it to the Lambda function defined in app.js.

Credentials

To access AWS resources from Lambda, functions use the AWS SDK for Node.js, which finds credentials differently than the AWS SDK for .NET. The AWS SDK for Node.js looks for credentials in the AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY environment variables or in the shared credentials file. For more information about configuring the AWS SDK for Node.js, refer to the AWS SDK for Node.js documentation.
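That lookup order can be sketched as a pure function. `resolveCredentials` is a hypothetical illustration of the chain described above, not an AWS SDK for Node.js API:

```javascript
// Prefer the environment variables; otherwise fall back to whatever was
// loaded from the shared credentials file (or null if neither is present).
function resolveCredentials(env, sharedFileCredentials) {
  if (env.AWS_ACCESS_KEY_ID && env.AWS_SECRET_ACCESS_KEY) {
    return {
      accessKeyId: env.AWS_ACCESS_KEY_ID,
      secretAccessKey: env.AWS_SECRET_ACCESS_KEY
    };
  }
  return sharedFileCredentials || null;
}
```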

Running locally

To run this sample, you will need to create the source and target S3 buckets. Pick a bucket name for the source bucket, and then create the bucket using AWS Explorer. Create a second bucket with the same name as the source bucket but with the "-thumbnails" suffix. For example, you could have a pair of buckets called foobar and foobar-thumbnails. Note: _testdriver.js defaults the region to us-west-2, so be sure to update this to whatever region you create the buckets in. Once the buckets are created, upload an image to the source bucket so that you have an image to test with.

Open the _sampleEvent.json file and update the bucket name property to the source bucket and the object key property to the image that was uploaded.
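For reference, an S3 event carries the bucket and key inside a Records array; the bucket name and object key below are placeholders, and the extraction mirrors roughly how the function reads the event:

```javascript
// A minimal sketch of the S3 event shape _sampleEvent.json contains.
const sampleEvent = {
  Records: [
    {
      s3: {
        bucket: { name: "my-source-bucket" },   // placeholder bucket name
        object: { key: "uploaded-image.jpg" }   // placeholder object key
      }
    }
  ]
};

// The function reads the source location from the event roughly like this.
const record = sampleEvent.Records[0];
const sourceBucket = record.s3.bucket.name;
const sourceKey = record.s3.object.key;
```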

Now, you can run and debug this like any other Visual Studio project. Go ahead and open up _testdriver.js and set a breakpoint and press F5 to launch the debugger.

Deploying the function to AWS Lambda

Once we have verified the function works correctly locally, it is time to deploy it. To do that, right-click on the project and select Upload to AWS Lambda….

This opens the Upload Lambda Function dialog.

You need to enter a Function Name to identify the function. You can leave the File Name and Handler fields at their defaults; together they indicate which function Lambda calls when it receives an event. You then need to configure an IAM role that Lambda can use to invoke your function. For this walkthrough, create a new role by selecting Amazon S3 access and Amazon CloudWatch access. Giving access to CloudWatch is very useful, because it lets Lambda write debugging information to Amazon CloudWatch Logs and gives you monitoring on the usage of the function. You can always refine these permissions after the function is uploaded. Once all that is set, go ahead and choose OK.

Once the upload is complete the Lambda Function status view will be displayed. The last step is to tell Amazon S3 to send events to your Lambda function. To do that, click the Add button for adding an event source.

Leave the Source Type set to Amazon S3 and select the Source bucket. S3 needs permission to send events to Lambda, which is granted by assigning a role to the event source. By default, the dialog box creates a role that gives S3 this permission. S3 event sources are unique in that the configuration is actually stored in the S3 bucket's notification configuration. When you choose OK in this dialog box, the event source will not show up here, but you can view it by right-clicking the bucket and selecting Properties.


Now that the function is deployed and S3 is configured to send events to our function, you can test it by uploading an image to the source bucket. Very shortly after uploading an image to the source bucket, your thumbnail will show up in the thumbnails bucket.


Calling from S3 Browser

Your function is set up to create thumbnails for any newly uploaded images. But what if you want to run your Lambda function on images that have already been uploaded? You can do that by opening the S3 bucket from AWS Explorer, navigating to the image you want the Lambda function to run against, and choosing Invoke Lambda Function.

Next, select the function you want to invoke and choose OK. The toolkit then creates the event object that S3 would have sent to Lambda and calls Invoke on the function.

This can be done for an individual file or by selecting multiple files or folders in the S3 Browser. This is helpful when you make a code change to your Lambda function and you want to reprocess all the objects in your bucket with the new code.

Conclusion

Creating thumbnails is just one example of what you can use AWS Lambda for, but I'm sure you can imagine many other ways to use the power of Lambda's event-based compute. Currently, you can create event sources for Amazon S3, Amazon Kinesis, and Amazon DynamoDB Streams, which is itself in preview. It is also possible to invoke Lambda functions for your own custom events using any of the AWS SDKs.

Try out the new Lambda features in the toolkit and let us know what you think. Given that AWS Lambda is in preview, we would love to get your feedback about these new features and what else we can add to make you successful using Lambda.

New AWS Elastic Beanstalk Deployment Wizard

Today, we released version 1.8 of the AWS Toolkit for Visual Studio. For this release, we revamped our wizard to deploy your ASP.NET Applications. Our goal was to make deployment easier as well as take advantage of some of the new features AWS Elastic Beanstalk has added.

What happened to the AWS CloudFormation deployments?

Unlike the new deployment wizard, the previous wizard offered the Load Balanced and Single Instance templates, which deployed using AWS CloudFormation templates. This deployment option was added before we had Elastic Beanstalk, which has since added features that make these deployment templates obsolete. If you still need access to this deployment mechanism, you can choose to relaunch the legacy wizard on the first page of the new wizard.

So what’s new?

Rolling deployments

If you are deploying your applications to a load-balanced environment, you can configure how new versions of your applications are deployed to the instances in your environment. You can also configure how changes to your environment are made. For example, if you have four instances in your environment and you want to change the instance type, you can configure the environment to change two instances at a time, keeping your application up and running while the change is made.

AWS Identity and Access Management roles

AWS Identity and Access Management roles are an important way of getting AWS credentials to your deployed application. With the new wizard, you can select an existing role or choose to create a new role based on a number of role templates. It is easy in the new wizard to set up a new role that gives access to Amazon S3 and DynamoDB. After deployment, you can refine the role from the AWS Explorer.

Application options

The application options page has several new features. You can now choose which build configuration to use. You can also set any application settings you want to be pushed into the web.config appSettings section when the application is being deployed.
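For example, a setting entered on this page would end up in the standard appSettings section of the deployed web.config; the key and value below are placeholders for illustration:

```xml
<configuration>
  <appSettings>
    <!-- Hypothetical setting pushed in by the deployment wizard -->
    <add key="AppEnvironment" value="Production" />
  </appSettings>
</configuration>
```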

In the previous deployment wizard, applications were deployed to a sub-folder in IIS based on the project name with the suffix "_deploy". The application appeared to be deployed at the root because URL rewrite rules were added to the root. This worked for most cases, but there were some edge cases where it caused problems. With the new wizard, applications can be configured to deploy to any folder, and by default they are deployed to the root folder of IIS. If the application is deployed anywhere other than the root, the URL rewrite rules are added to the root.

Feedback

We hope that you like the new wizard and that it makes things easier for you. For a full walkthrough of the new wizard, check out the user guide for the AWS Toolkit for Visual Studio. We would love to hear your feedback on the new wizard. We would also love to hear about any interesting deployment issues you have and where you would like help from AWS .NET tooling.


AWS Toolkit support for Visual Studio Community 2013

We often hear from our customers that they would like our AWS Toolkit for Visual Studio to work with the Express editions of Visual Studio. We understand how desirable this is, but due to restrictions built into the Express editions of Visual Studio, it hasn’t been possible…until now.

With the recent announcement of the new Visual Studio Community 2013 edition, it is now possible to get the full functionality of our AWS Toolkit for Visual Studio inside a free edition of Visual Studio. This includes the AWS Explorer for managing resources, Web Application deployment from the Solution Explorer, and the AWS CloudFormation editor for authoring and deploying your CloudFormation templates.

So if you haven’t tried the AWS Toolkit for Visual Studio, now is a great time to check it out.

New Sample Simple Workflow

When you install the SDK from our website, many samples are installed inside Visual Studio, including the Express editions of Visual Studio. Look in the New Project Wizard, where you’ll find samples showing off many of the AWS services.


We recently added a new sample that shows off using Amazon Simple Workflow Service (SWF) with the .NET SDK. The sample is under the AWS -> App Services section and is called AWS Simple Workflow Image Processing Sample. The sample shows how to use SWF to monitor images coming from S3 and to generate thumbnails of various sizes. In a real-world scenario, this would most likely be done with multiple processes monitoring SWF for decision and activity tasks. To make it easier to run, this sample is set up as a WPF app hosting virtual consoles, each representing an individual process.


The virtual console on the top is the process that chooses an image to generate thumbnails for and starts the workflow execution.

// Snippet from StartWorkflowExecutionProcessor.cs that starts the workflow execution

swfClient.StartWorkflowExecution(new StartWorkflowExecutionRequest
{
    // Serialize input to a string
    Input = Utils.SerializeToJSON(input),
    //Unique identifier for the execution
    WorkflowId = DateTime.Now.Ticks.ToString(),
    Domain = Constants.ImageProcessingDomain,
    WorkflowType = new WorkflowType
    {
        Name = Constants.ImageProcessingWorkflow,
        Version = Constants.ImageProcessingWorkflowVersion
    }
});


The virtual console in the bottom left monitors SWF for decision tasks. When it gets a decision task, it looks at the workflow's history and sees which activities have been completed to figure out which thumbnails haven't been created yet. If one of the thumbnail sizes hasn't been created yet, it schedules an activity to create the next thumbnail size. If all the thumbnails have been created, it completes the workflow.

// Snippet from ImageProcessWorkflow.cs that polls for decision tasks and decides what decisions to make.

void PollAndDecide()
{
    this._console.WriteLine("Image Process Workflow Started");
    while (!_cancellationToken.IsCancellationRequested)
    {
        DecisionTask task = Poll();
        if (!string.IsNullOrEmpty(task.TaskToken))
        {
            // Create the next set of decisions based on the current state and
            // the execution history
            List<Decision> decisions = Decide(task);

            // Complete the task with the new set of decisions
            CompleteTask(task.TaskToken, decisions);
        }
    }
}


The virtual console in the bottom right monitors SWF for activity tasks to perform. The activity task has input from the decider process that tells it which image to create a thumbnail for and what size the thumbnail should be.

// Snippet from ImageActivityWorker.cs showing the main loop for the worker that polls for tasks and processes them.

void PollAndProcessTasks()
{
    this._console.WriteLine("Image Activity Worker Started");
    while (!_cancellationToken.IsCancellationRequested)
    {
        ActivityTask task = Poll();
        if (!string.IsNullOrEmpty(task.TaskToken))
        {
            ActivityState activityState = ProcessTask(task.Input);
            CompleteTask(task.TaskToken, activityState);
        }
    }
}


Resource Condition Support in the AWS CloudFormation Editor

AWS CloudFormation recently added support for conditions that control whether resources are created or what value to set for properties on resources. The CloudFormation editor included with the AWS Toolkit for Visual Studio was updated to support conditions in version 1.6.1. If you have never used the CloudFormation editor, we have a screencast that gives a quick introduction to the editor.

Defining Conditions

To get started with conditions, you first need to define them.

In this example, two conditions are defined. The first condition checks whether the deployment will be a production deployment. The second condition checks whether a new security group should be created.
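In a JSON-format template, a pair of conditions along those lines might look like the following sketch. The parameter names (EnvironmentType, ExistingSecurityGroup) and the comparison values are assumptions for illustration:

```json
{
  "Conditions": {
    "IsProduction": {
      "Fn::Equals": [ { "Ref": "EnvironmentType" }, "prod" ]
    },
    "CreateSecurityGroup": {
      "Fn::Equals": [ { "Ref": "ExistingSecurityGroup" }, "" ]
    }
  }
}
```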

Using Conditions to Control Resource Creation

For all resources defined in a template, you can set the Condition property. If the condition evaluates to true, the resource is created in the stack that CloudFormation instantiates from the template.

This security group is created only if the CreateSecurityGroup condition evaluates to true, which occurs if no security group is passed in to the ExistingSecurityGroup parameter.
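As a sketch, a conditionally created resource names its condition in the Condition property; the resource name NewSecurityGroup and condition name here are assumptions for illustration:

```json
{
  "Resources": {
    "NewSecurityGroup": {
      "Type": "AWS::EC2::SecurityGroup",
      "Condition": "CreateSecurityGroup",
      "Properties": {
        "GroupDescription": "Created only when no existing group was supplied"
      }
    }
  }
}
```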

Using Conditions to Control Resource Properties

You can also use conditions to determine what value to set for a resource property.

Since the security group is either created by the template or passed in through the ExistingSecurityGroup parameter, the SecurityGroups property needs its value set conditionally, depending on how the security group was obtained. Also, in this example, we control the size of the EC2 instance depending on whether the deployment is a production deployment.
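Property-level choices like these use the Fn::If intrinsic function, which takes a condition name followed by the values to use when the condition is true or false. The resource, condition, and parameter names below, and the instance types, are illustrative assumptions:

```json
{
  "WebServer": {
    "Type": "AWS::EC2::Instance",
    "Properties": {
      "SecurityGroups": [
        {
          "Fn::If": [
            "CreateSecurityGroup",
            { "Ref": "NewSecurityGroup" },
            { "Ref": "ExistingSecurityGroup" }
          ]
        }
      ],
      "InstanceType": {
        "Fn::If": [ "IsProduction", "m1.large", "t1.micro" ]
      }
    }
  }
}
```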

For more information about using conditions with CloudFormation, check out the AWS CloudFormation User Guide.