AWS Blog

New Utility – Opt-in to Longer Resource IDs Across All Regions

by Jeff Barr | in Amazon EC2, Amazon Elastic Block Store

Early this year I announced that Longer EC2 Resource IDs are Now Available and kicked off a transition period that will last until early December 2016. During the transition period, you can opt in to the new resource ID format on a region-by-region, user-by-user basis. At the conclusion of the transition period, all newly created resources will be assigned 17-character identifiers. Here are some important dates for your calendar:

  • November – Beginning on November 1st, you can use the describe-id-format command to check on the cutover deadline for the regions that are of interest to you.
  • December – Between December 5th and December 16th, we will be setting individual AWS Regions to use 17-character identifiers by default.

To help you ensure that your code and your tools can handle the new format, I’d like to personally encourage you to opt in as soon as possible!

We’ve launched a new longer-ID-converter tool that allows you to opt in, opt out, or simply check the status. If you have already installed the AWS Command Line Interface (CLI), you can simply download the script, make it executable, and then run it:

$ wget
$ chmod +x

Here are some of the things that you can do.

Check the status of your account:

$ ./ --status

Convert account, IAM Roles, and IAM Users to long IDs:

$ ./

Revert to short IDs:

$ ./ --revert

Convert the current User/Role:

$ ./ --convertself
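
If you prefer to skip the wrapper script and work with the AWS CLI directly, the underlying calls look something like this (a minimal sketch; the instance and reservation resource types are shown as examples, and you would repeat the commands for each resource type and region of interest):

# Check the current ID format for instances
$ aws ec2 describe-id-format --resource instance

# Opt the calling user in to 17-character IDs for instances and reservations
$ aws ec2 modify-id-format --resource instance --use-long-ids
$ aws ec2 modify-id-format --resource reservation --use-long-ids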

For more information on this utility, check out the README file. For more information on the move to longer resource IDs, consult the EC2 FAQ.


New – AWS Server Migration Service

by Jeff Barr | in Launch

I love to use the historical photo at right to emphasize a situation that many of our customers face. They need to move their existing IT infrastructure to the AWS Cloud without scheduling prolonged maintenance periods for data migration. Because many of these applications are mission-critical and are heavily data-driven, taking systems offline in order to move gigabytes or terabytes of stored data is simply not practical.

New Service
Today I would like to tell you about AWS Server Migration Service.

This service simplifies and streamlines the process of migrating existing virtualized applications to Amazon EC2. In order to support the IT equivalent of the use case illustrated in the photo, it allows you to incrementally replicate live Virtual Machines (VMs) to the cloud without the need for a prolonged maintenance period. You can automate, schedule, and track incremental replication of your live server volumes, simplifying the process of coordinating and implementing large-scale migrations that span tens or hundreds of volumes.

You get full control of the replication process from the AWS Management Console, the AWS Command Line Interface (CLI), or a set of migration APIs. After choosing the Windows or Linux servers to migrate, you can choose the replication frequency that best matches your application’s usage pattern and minimizes network bandwidth. Behind the scenes, AWS Server Migration Service will replicate your server’s volumes to the cloud, creating a new Amazon Machine Image (AMI) for each one. You can track the status of each replication job from the console. Each incremental sync generates a fresh AMI, allowing you to test the migrated volumes in advance of your actual cut-over.

Migration Service Tour
Before you start the actual migration process, you need to download and deploy the AWS Server Migration Service Connector. The Connector runs within your existing virtualized environment, and allows the migration itself to be done in agentless fashion, sparing you the trouble of installing an agent on each existing server. If you run a large organization and/or have multiple virtualized environments, you can deploy multiple copies of the Connector.

The Connector has a web UI that you’ll access from within your existing environment. After you click through the license agreement, you will be prompted to create a password, configure the local network settings, and finalize a couple of preferences. Next, you will need to provide the Connector with a set of AWS account or IAM User credentials so that it can access the SMS, S3, and SNS APIs. If you use an IAM User, you’ll also need to create an appropriate IAM Role (the User Guide contains a sample).

With the Connector up and running, you can log in to the AWS Management Console, navigate to Server Migration Service, and see a list of all of the Connectors that have registered with the service. From there you can import the server catalog from each Connector and inspect the server inventory:

Then you can pick some servers to replicate, select them, and click on Create replication jobs. Next, you configure the license type (AWS or Bring Your Own) for the servers:

With that out of the way, you can choose to initiate replication immediately or at a date and time in the future. You can also choose the replication interval:

After you review and approve the settings, you can view all of your replication jobs in the dashboard:

You can also examine individual jobs:

And you can see the AMIs created after each incremental run:

From there you can click on Launch instance, choose an EC2 instance type, and perform acceptance testing on the migrated server.
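
If you would like to script the process instead of clicking through the console, here’s a rough sketch of the equivalent CLI calls (the server ID, timestamp, and frequency are placeholders; check the SMS documentation for the full set of options):

# List the Connectors that have registered with the service
$ aws sms get-connectors

# Import and inspect the server catalog
$ aws sms import-server-catalog
$ aws sms get-servers

# Replicate one server every 12 hours, starting at the given time (values are illustrative)
$ aws sms create-replication-job --server-id s-12345678 \
    --seed-replication-time 2016-10-24T00:00:00Z --frequency 12

# Track the status of your replication jobs
$ aws sms get-replication-jobs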

Available Now
AWS Server Migration Service is now available in the US East (Northern Virginia), EU (Ireland), and Asia Pacific (Sydney) Regions and you can start using it today. There is no charge for the use of the service; you pay for S3 storage used during the replication process and for the EBS snapshots created when the migration is complete.



First AWS Certification Study Guide Now Available

by Jeff Barr | in Training and Certification

My colleague Joe Baron wrote the guest post below to introduce you to a book that he and his colleagues have put together!


Are you studying for the AWS Certified Solutions Architect – Associate exam?

The new AWS Certified Solutions Architect Official Study Guide: Associate Exam has just been published by John Wiley & Sons, Inc., and is now available in both paperback and Kindle formats. The 455-page book helps prepare candidates for the AWS Certified Solutions Architect – Associate exam. It was written by a very experienced team of subject matter experts, all part of the team that wrote, reviewed, and developed the AWS Certified Solutions Architect – Associate exam. The study guide includes an introduction to AWS and chapters on core AWS services, as well as information on AWS security, compliance, and architectural best practices. Each chapter includes targeted information on the topic, as well as key exam essentials, exercises, and chapter review questions (with answers in the appendix). The guide also gives you access to Sybex online study tools such as practice exams, flashcards, chapter tests, and assessment tests.

In addition to the new book, we have a half-day workshop to help you prepare for the exam. In the AWS Certification Exam Readiness Workshop: AWS Certified Solutions Architect – Associate, we review what to expect at the testing center and while taking the exam. We walk you through how the exam is structured, as well as teach you how to interpret the concepts being tested so that you can better eliminate incorrect responses.  You will also have the chance to test concepts we cover through a series of practice exam questions.  At the end of the class, you will receive a voucher to take an online practice exam at no cost.

If you will be attending AWS re:Invent this year, you can purchase a study guide now so that you can prepare to take the Solutions Architect – Associate exam on-site (reserve your seat now).

Joe Baron, Principal Solutions Architect

AWS Week in Review – October 17, 2016

by Jeff Barr | in Week in Review

Wow, a lot is happening in AWS-land these days! Today’s post includes submissions from several dozen internal and external contributors, along with material from my RSS feeds, my inbox, and other things that come my way. To join in the fun, create (or find) some awesome AWS-related content and submit a pull request!


October 17


October 18


October 19


October 20


October 21


October 22


October 23

New Customer Success Stories

  • 20 Minutos – 20minutos is a free newspaper and website that has rapidly become one of the main sources of information in Spain. The company uses Amazon S3, Amazon RDS, and Amazon Glacier to store the fast-growing content generated through its website and social media, and Amazon EC2 and Amazon ElastiCache to guarantee the scalability of its systems.
  • InnoVantage – Migrating to AWS has enabled InnoVantage’s developers to spend only five percent of their time on infrastructure and has reduced the company’s time to market from 14 months to seven weeks. InnoVantage provides cloud-based applications to large businesses and government organizations. It uses AWS Elastic Beanstalk, AWS CloudFormation, and AWS Lambda to push its Cogito product to a defined set of AWS infrastructure services, create and manage AWS resources, and run code in response to events.
  • Vango – Using software from the AWS Marketplace, Vango can now build its digital-media products faster to generate more revenue. Located in San Francisco, Vango is an online marketplace that helps art lovers find and purchase art online. Vango uses Imagizer Media Engine for AWS to efficiently manipulate and manage images, support new interfaces in its applications, build images for its branding and digital assets, and prototype new designs.
  • Vidyard – Vidyard uses AWS to ensure seamless video upload and playback capabilities, deliver 30 percent faster video upload times, and give its customers confidence in the security of their data. Based in Canada, the organization provides a video-hosting platform that shows customers specifically how and when viewers watch their videos. Vidyard is all in with AWS, running its complete video platform on the AWS Cloud.

New & Notable Open Source

New SlideShare Presentations

Upcoming Events

New AWS Marketplace Listings

  • Application Development
    • Joomla 3.6.0 + Apache + MySQL + AMAZONLINUX AMI by MIRI Infotech Inc, sold by Miri Infotech.
    • LAMP 5 MariaDB and LAMP 7 MariaDB, sold by Jetware.
    • Secured Acquia Drupal on Windows 2008 R2, sold by Cognosys Inc.
    • Secured BugNet on Windows 2008 R2, sold by Cognosys Inc.
    • Secured CMS Gallery on Windows 2008 R2, sold by Cognosys Inc.
    • Secured Kooboo CMS on Windows 2008 R2, sold by Cognosys Inc.
    • Secured Lemoon on Windows 2008 R2, sold by Cognosys Inc.
    • Secured Magento on Windows 2008 R2, sold by Cognosys Inc.
    • Secured MyCV on Windows 2012 R2, sold by Cognosys Inc.
    • Secured nService on Windows 2012 R2, sold by Cognosys Inc.
    • Secured Orchard CMS on Windows 2008 R2, sold by Cognosys Inc.
  • Application Servers
    • Microsoft Dynamics NAV 2016 for Business, sold by Data Resolution.
    • Microsoft Dynamics GP 2015 for Business, sold by Data Resolution.
    • Microsoft Dynamics AX 2012 for Business, sold by Data Resolution.
    • Microsoft Dynamics SL 2015 for Business, sold by Data Resolution.
    • Redis 3.0, sold by Jetware.
  • Application Stacks
    • LAMP 5 Percona and LAMP 7 Percona, sold by Jetware.
    • MySQL 5.1, MySQL 5.6, and MySQL 5.7, sold by Jetware.
    • Percona Server for MySQL 5.7, sold by Jetware.
    • Perfect7 LAMP v1.1 Multi-PHP w/Security (HVM), sold by Archisoft.
    • Perfect7 LAMP v1.1 Multi-PHP Base (HVM), sold by Archisoft.
  • Content Management
    • DNN Platform 9 Sandbox – SQL 2016, IIS 8.5, .Net 4.6, W2K12R2, sold by Benjamin Hermann.
    • iBase 7, sold by iBase.
    • MediaWiki powered by Symetricore (Plus Edition), sold by Symetricore.
    • Secured CompositeC1 on Windows 2008 R2, sold by Cognosys Inc.
    • Secured Dot Net CMS on Windows 2008 R2, sold by Cognosys Inc.
    • Secured Gallery Server on Windows 2008 R2, sold by Cognosys Inc.
    • Secured Joomla on Windows 2008 R2, sold by Cognosys Inc.
    • Secured Mayando on Windows 2008 R2, sold by Cognosys Inc.
    • Secured phpBB on Windows 2008 R2, sold by Cognosys Inc.
    • Secured Wiki on Windows 2008 R2, sold by Cognosys Inc.
    • SharePoint 2016 Enterprise bYOL with paid support, sold by Data Resolution.
    • WordPress Powered by AMIMOTO (Auto-Scaling ready), sold by DigitalCube Co. Ltd.
  • Databases
    • MariaDB 5.5, 10.0, and 10.1, sold by Jetware.
    • Redis 3.2, sold by Jetware.
  • eCommerce
    • Secured AspxCommerce on Windows 2008 R2, sold by Cognosys Inc.
    • Secured BeYourMarket on Windows 2008 R2, sold by Cognosys Inc.
    • Secured DashComerce on Windows 2008 R2, sold by Cognosys Inc.
    • Vikrio, sold by Vikrio.
  • Issue & Bug Tracking
    • Redmine 2.6 and Redmine 3.3, sold by Jetware.
  • Monitoring
    • Memcached 1.4, sold by Jetware.
  • Network Infrastructure
    • 500 Mbps Load Balancer with Commercial WAF Subscription, sold by KEMP Technologies.
  • Operating System
    • Ubuntu Desktop 16.04 LTS (HVM), sold by Netspectrum Inc.
  • Security
    • AlienVault USM (Unified Security Management) Anywhere, sold by AlienVault.
    • Armor Anywhere CORE, sold by Armor Defense.
    • Hillstone CloudEdge Virtual-Firewall Advanced Edition, sold by Hillstone Networks.
    • Negative SEO Monitoring, sold by SEO Defend.

Help Wanted

Stay tuned for next week! In the meantime, follow me on Twitter and subscribe to the RSS feed.

Congratulations to the Winners of the Serverless Chatbot Competition!

by Jeff Barr | in Amazon API Gateway, AWS Lambda, Developers

I announced the AWS Serverless Chatbot Competition in August and invited you to build a chatbot for Slack using AWS Lambda and Amazon API Gateway.

Last week I sat down with fellow judges Tim Wagner (General Manager of AWS Lambda) and Cecilia Deng (a Software Development Engineer on Tim’s team) to watch the videos and evaluate all 62 submissions. We were impressed by the functionality and diversity of the entries, as well as the effort that the entrants put into producing attractive videos to show their submissions in action.

After hours of intense deliberation we chose a total of nine winners: eight from individuals, teams, and small organizations, and one from a larger organization. Without further ado, here you go:

Individuals, Teams, and Small Organizations
Here are the winners of the Serverless Slackbot Hero Award. Each winner receives one ticket to AWS re:Invent, access to discounted hotel room rates, public announcement and promotion during the Serverless Computing keynote, some cool swag, and $100 in AWS Credits. You can find the code for many of these bots on GitHub. In alphabetical order, the winners are:

AWS Network Helper – “The goal of this project is to provide an AWS network troubleshooting script that runs on a serverless architecture, and can be interacted with via Slack as a chat bot.” GitHub repo.

B0pb0t – “Making Mealtime Awesome.” GitHub repo.

Borges – “Borges is a real-time translator for multilingual Slack teams.” GitHub repo.

CLIve – “CLIve makes managing your AWS EC2 instances a doddle. He understands natural language, so no need to learn a new CLI!”

Litlbot – “Litlbot is a Slack bot that enables realtime interaction with students in class, creating a more engaged classroom and learning experience.” GitHub repo.

Marbot – “Forward alerts from Amazon Web Services to your DevOps team.”

Opsidian – “Collaborate on your AWS infra from Slack using natural language.”

ServiceBot – “Communication platform between humans, machines, and enterprises.” GitHub repo.

Larger Organization
And here’s the winner of the Serverless Slackbot Large Organization Award:

Eva – “The virtual travel assistant for your team.” GitHub repo.

Thanks & Congratulations
I would like to personally thank each of the entrants for taking the time to submit their entries to the competition!

Congratulations to all of the winners; I hope to see you all at AWS re:Invent.



PS – If this list has given you an idea for a chatbot of your very own, please watch our Building Serverless Chatbots video and take advantage of our Serverless Chatbot Sample.

AWS Budgets Update – Track Cloud Costs and Usage

by Jeff Barr | in AWS Budgets, AWS Cost Explorer, Enterprise

As Spider-Man and others before him have said, “with great power comes great responsibility.” In the on-demand, pay-as-you-go cloud world, this means that you need to be an informed, responsible consumer. In a corporate environment, this means that you need to pay attention to budgets and to spending, and to make sure that your actual spend is in line with your projections. With AWS in use across multiple projects and departments, tracking and forecasting becomes more involved.

Today we are making some important upgrades to the AWS Budgets feature (read New – AWS Budgets and Forecasts for background information). This feature is designed to be used by Finance Managers, Project Managers, and VP-level DevOps folks (please feel free to share this post with similarly-titled members of your organization if you are not directly responsible for your cloud budget).  You can use AWS Budgets to maintain a unified view of your costs and usage for specific categories that you define, and you can sign up for automated notifications that provide you with detailed status information (over or under budget) so that you can identify potential issues and take action to prevent undesired actual or forecasted overruns.

AWS Budgets Updates
You can create up to 20,000 budgets per payer account. To help you stay on top of your spending in environments where costs and resource consumption change frequently, budgets are evaluated four times per day. Notifications are delivered via email or programmatically (as an Amazon Simple Notification Service (SNS) message), so that you can take manual, semi-automated, or fully automated corrective action. This gives you the power to address all of the following situations, along with others that may arise within your organization:

VP – Optimize your overall cloud spend, with budgets for each business unit and for the company as a whole, tracking spending by region and other dimensions and comparing actual usage against budgets.

Project Manager – Manage costs within your department, watching multiple services, tags, and regions. Alert stakeholders when thresholds have been breached, and ask them to take action. When necessary, give resource budgets to individual team members to encourage adoption and experimentation.

Finance Manager – Analyze historical costs for your organization and use your insight into future plans to develop suitable budgets. Examine costs across the entire company, or on a per-account, per-service, business unit, or project team level.

Creating a Budget
Let’s create a budget or two!

Start by opening up Billing and Cost Management:

And then click on Budgets:

If you are new to AWS Budgets, you may have to wait up to 24 hours after clicking Create budget before you can proceed to the next step. During this time, we’ll prepare the first set of Detailed Billing Reports for your account.

Click on Create budget, decide whether you want the budget to be based on costs or usage, and give your budget a name. Then select Monthly, Quarterly, or Annual. I’ll go for a cost-based ($1000) monthly budget named MainBudget to get started:

By not checking any of the options next to Include costs related to, my budget will apply to my entire account. Checking a box opens the door to all sorts of additional options that give you a lot of flexibility. Here’s how I could create a budget for usage of EC2 instances where the Owner tag is set to jbarr:

I could be even more specific, and choose to set a very modest budget for usage of Non-Reserved Instances. This would be a great way to make sure that I am making good use of any Reserved Instances that my organization owns.

The next step is to set up email or programmatic notifications:

The programmatic notification option can be used in many different ways. I could create a new web app with a fixed budget, and then invoke an AWS Lambda function if costs are approaching the budgeted amount. The app could take corrective action to ensure that the budget is not exceeded. For example, it could temporarily disable some of the more computationally intensive features, or it could switch over to a statically hosted alternative site.

With everything set up as desired, I simply click on Create. My budget is visible right away (I clicked on the triangle in order to display the details before I took this screenshot):

As you can see, I have already overspent my $1000 budget, with a forecast of almost $5,600 for the month. Given that we are a frugal company (read our leadership principles to learn more), I really need to see what’s going on and clean up some of my extra instances! Because I had opted for email notification, I received the following message not too long after I created my budget:

Suppose that my data transfer budget is separate from my compute budget, and that I am allowed to transfer up to 100 GB of data out of S3 every month, regardless of the cost at the time. I can create a budget that looks like this:

And I can see at a glance that I am in no danger of exceeding my data transfer budget:

I can also download the on-screen information in CSV form for further inspection or as input to another part of my budgeting process:

As you can see, this new feature gives you the power to set up very detailed budgets. Although I have introduced this feature using the AWS Management Console, you can also set up budgets by making calls to the new Budget API or by using the AWS Command Line Interface (CLI). This API includes functions like CreateBudget, DescribeBudget, and UpdateBudget that you can use from within your own applications.
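
To give you a sense of what the API looks like, here’s a rough sketch of creating a cost budget like my MainBudget from the CLI (the account ID and email address are placeholders, and I have omitted optional elements such as cost filters and the time period; consult the Budgets API reference for the full request shape):

$ aws budgets create-budget \
    --account-id 123456789012 \
    --budget '{"BudgetName": "MainBudget",
               "BudgetLimit": {"Amount": "1000", "Unit": "USD"},
               "TimeUnit": "MONTHLY",
               "BudgetType": "COST"}' \
    --notifications-with-subscribers '[{"Notification": {"NotificationType": "ACTUAL",
                                                         "ComparisonOperator": "GREATER_THAN",
                                                         "Threshold": 80},
                                        "Subscribers": [{"SubscriptionType": "EMAIL",
                                                         "Address": "me@example.com"}]}]'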

Available Now
This new feature is available now and you can start using it today! You can create two budgets per account at no charge; additional budgets cost $0.02 per day (again, you can have up to 20,000 budgets per account).

To learn more, read Managing Your Costs with Budgets.



AWS Developer Tool Recap – Recent Enhancements to CodeCommit, CodePipeline, and CodeDeploy

by Jeff Barr | in Amazon CodeCommit, Amazon CodeDeploy, Amazon CodePipeline

The AWS Developer Tools help you to put modern DevOps practices to work! Here’s a quick overview (read New AWS Tools for Code Management and Deployment for an in-depth look):

AWS CodeCommit is a fully-managed source code control service. You can use it to host secure and highly scalable private Git repositories while continuing to use your existing Git tools and workflows (watch the Introduction to AWS CodeCommit video to learn more).

AWS CodeDeploy automates code deployment to Amazon Elastic Compute Cloud (EC2) instances and on-premises servers. You can update your application at a rapid clip, while avoiding downtime during deployment (watch the Introduction to AWS CodeDeploy video to learn more).

AWS CodePipeline is a continuous delivery service that you can use to streamline and automate your release process. Check-ins to your repo (CodeCommit or GitHub) will initiate build, test, and deployment actions (watch Introducing AWS CodePipeline for an introduction). The build can be deployed to your EC2 instances or on-premises servers via CodeDeploy, AWS Elastic Beanstalk, or AWS OpsWorks.

You can combine these services with your existing build and testing tools to create an end-to-end software release pipeline, all orchestrated by CodePipeline.

We have made a lot of enhancements to the Code* products this year and today seems like a good time to recap all of them for you! Many of these enhancements allow you to connect the developer tools to other parts of AWS so that you can continue to fine-tune your development process.

CodeCommit Enhancements
Here’s what’s new with CodeCommit:

  • Repository Triggers
  • Code Browsing
  • Commit History
  • Commit Visualization
  • Elastic Beanstalk Integration

Repository Triggers – You can create Repository Triggers that Send Notification or Run Code whenever a change occurs in a CodeCommit repository (these are sometimes called webhooks — user-defined HTTP callbacks). These hooks will allow you to customize and automate your development workflow. Notifications can be delivered to an Amazon Simple Notification Service (SNS) topic or can invoke a Lambda function.
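
Here’s a rough sketch of setting up a trigger from the CLI (the repository name and SNS topic ARN are placeholders):

$ aws codecommit put-repository-triggers \
    --repository-name MyDemoRepo \
    --triggers '[{"name": "all-events",
                  "destinationArn": "arn:aws:sns:us-east-1:123456789012:MyRepoTopic",
                  "events": ["all"]}]'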

Code Browsing – You can Browse Your Code in the Console. This includes navigation through the source code tree and the code:

Commit History – You can View the Commit History for your repositories (mine is kind of quiet, hence the 2015-era dates):

Commit Visualization – You can View a Graphical Representation of the Commit History for your repositories:

Elastic Beanstalk Integration – You can Use CodeCommit Repositories with Elastic Beanstalk to store your project code for deployment to an Elastic Beanstalk environment.

CodeDeploy Enhancements
Here’s what’s new with CodeDeploy:

  • CloudWatch Events Integration
  • CloudWatch Alarms and Automatic Deployment Rollback
  • Push Notifications
  • New Partner Integrations

CloudWatch Events Integration – You can Monitor and React to Deployment Changes with Amazon CloudWatch Events by configuring CloudWatch Events to stream changes in the state of your instances or deployments to an AWS Lambda function, an Amazon Kinesis stream, an Amazon Simple Queue Service (SQS) queue, or an SNS topic. You can build workflows and processes that are triggered by your changes. You could automatically terminate EC2 instances when a deployment fails or you could invoke a Lambda function that posts a message to a Slack channel.
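
As a sketch of what this looks like from the CLI, here is a rule that routes failed deployments to a Lambda function. The detail-type and state strings are my best reading of the event format, and the ARNs are placeholders; double-check them against the CloudWatch Events documentation before relying on them:

$ aws events put-rule --name codedeploy-failures \
    --event-pattern '{"source": ["aws.codedeploy"],
                      "detail-type": ["CodeDeploy Deployment State-change Notification"],
                      "detail": {"state": ["FAILURE"]}}'

# Remember to grant CloudWatch Events permission to invoke the function (lambda add-permission)
$ aws events put-targets --rule codedeploy-failures \
    --targets Id=1,Arn=arn:aws:lambda:us-east-1:123456789012:function:NotifyTeam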

CloudWatch Alarms and Automatic Deployment Rollback – CloudWatch Alarms give you another type of Monitoring for your Deployments. You can monitor metrics for the instances or Auto Scaling Groups managed by CodeDeploy and, if they cross a threshold for a defined period of time, stop the deployment or change the state of an instance by rebooting, terminating, or recovering it. You can also automatically roll back a deployment in response to a deployment failure or a CloudWatch Alarm.
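
For example, here’s one way to turn on automatic rollback for an existing deployment group (the application and group names are placeholders; you can also configure this when creating the group):

$ aws deploy update-deployment-group \
    --application-name MyApp \
    --current-deployment-group-name MyDeploymentGroup \
    --auto-rollback-configuration '{"enabled": true, "events": ["DEPLOYMENT_FAILURE"]}'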

Push Notifications – You can Receive Push Notifications via Amazon SNS for events related to your deployments and use them to track the state and progress of your deployment.

New Partner Integrations – Our CodeDeploy Partners have been hard at work, connecting their products to ours. Here are some of the most recent offerings:

CodePipeline Enhancements
And here’s what’s new with CodePipeline:

  • AWS OpsWorks Integration
  • Triggering of Lambda Functions
  • Manual Approval Actions
  • Information about Committed Changes
  • New Partner Integrations

AWS OpsWorks Integration – You can Choose AWS OpsWorks as a Deployment Provider in the software release pipelines that you model in CodePipeline:

You can also configure CodePipeline to use OpsWorks to deploy your code using recipes contained in custom Chef cookbooks.

Triggering of Lambda Functions – You can now Trigger a Lambda Function as one of the actions in a stage of your software release pipeline. Because Lambda allows you to write functions to perform almost any task, you can customize the way your pipeline works:

Manual Approval Actions – You can now add Manual Approval Actions to your software release pipeline. Execution pauses until the code change is approved or rejected by someone with the required IAM permission:
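
Approvals can be granted or rejected in the console or programmatically; here’s a rough sketch of the CLI form (the names are placeholders, and the token comes from get-pipeline-state):

$ aws codepipeline get-pipeline-state --name MyPipeline
$ aws codepipeline put-approval-result \
    --pipeline-name MyPipeline --stage-name Beta --action-name ManualApproval \
    --result 'summary=Looks good,status=Approved' \
    --token example-approval-token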

Information about Committed Changes – You can now View Information About Committed Changes to the code flowing through your software release pipeline:


New Partner Integrations – Our CodePipeline Partners have been hard at work, connecting their products to ours. Here are some of the most recent offerings:

New Online Content
In order to help you and your colleagues to understand the newest development methodologies, we have created some new introductory material:

Thanks for Reading!
I hope that you have enjoyed this quick look at some of the most recent additions to our development tools.

To help you get some hands-on experience with continuous delivery, my colleagues have created a new Pipeline Starter Kit. The kit includes an AWS CloudFormation template that will create a VPC with two EC2 instances inside, a pair of applications (one for each EC2 instance, both deployed via CodeDeploy), and a pipeline that builds and then deploys the sample application, along with all of the necessary IAM service and instance roles.
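
Launching the kit is a standard CloudFormation operation; something like this should do it (the template URL below is a placeholder, so substitute the one that comes with the kit):

$ aws cloudformation create-stack \
    --stack-name pipeline-starter-kit \
    --template-url https://s3.amazonaws.com/example-bucket/pipeline-starter-kit.template \
    --capabilities CAPABILITY_IAM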


Run Windows Server 2016 on Amazon EC2

by Jeff Barr | in Amazon EC2, Launch, Windows

You can now run Windows Server 2016 on Amazon Elastic Compute Cloud (EC2). This version of Windows Server is packed with new features, including support for Docker and Windows containers. We are making it available in all AWS regions today, in four distinct forms:

  • Windows Server 2016 Datacenter with Desktop Experience – The mainstream version of Windows Server, designed with security and scalability in mind, with support for both traditional and cloud-native applications. To learn a lot more about Windows Server 2016, download The Ultimate Guide to Windows Server 2016 (registration required).
  • Windows Server 2016 Nano Server – A cloud-native, minimal install that takes up a modest amount of disk space and boots more swiftly than the Datacenter version, while leaving more system resources (memory, storage, and CPU) available to run apps and services. You can read Moving to Nano Server to learn how to migrate your code and your applications. Nano Server does not include a desktop UI, so you’ll need to administer it remotely using PowerShell or WMI. To learn how to do this, read Connecting to a Windows Server 2016 Nano Server Instance.
  • Windows Server 2016 with Containers – Windows Server 2016 with Windows containers and Docker already installed.
  • Windows Server 2016 with SQL Server 2016 – Windows Server 2016 with SQL Server 2016 already installed.

Here are a couple of things to keep in mind with respect to Windows Server 2016 on EC2:

  • Memory – Microsoft recommends a minimum of 2 GiB of memory for Windows Server. Review the EC2 Instance Types to find the type that is the best fit for your application.
  • Pricing – The standard Windows EC2 Pricing applies; you can launch On-Demand and Spot Instances, and you can purchase Reserved Instances.
  • Licensing – You can (subject to your licensing terms with Microsoft) bring your own license to AWS.
  • SSM Agent – An upgraded version of our SSM Agent is now used in place of EC2Config. Read the User Guide to learn more.
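
If you like to work from the command line, you can locate and launch the new AMIs with the CLI. Here’s a sketch; the name filter and instance type are assumptions on my part, and the image ID is a placeholder, so verify the exact AMI name in the console or via describe-images:

# Find the Windows Server 2016 with Containers AMI (name filter is an assumption)
$ aws ec2 describe-images --owners amazon \
    --filters "Name=name,Values=Windows_Server-2016-English-Full-Containers*" \
    --query "Images[*].[ImageId,Name]" --output text

# Launch it (image ID and key pair are placeholders)
$ aws ec2 run-instances --image-id ami-12345678 --instance-type m4.large \
    --key-name my-key-pair --count 1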

Containers in Action
I launched the Windows Server 2016 with Containers AMI and logged in to it in the usual way:

Then I opened up PowerShell and ran the command docker run microsoft/sample-dotnet. Docker downloaded the image and launched it. Here’s what I saw:

We plan to add Windows container support to Amazon ECS by the end of 2016. You can register here to learn more.

Get Started Today
You can get started with Windows Server 2016 on EC2 today. Try it out and let me know what you think!


Amazon Aurora Update – Call Lambda Functions From Stored Procedures; Load Data From S3

by Jeff Barr | in Amazon Aurora, Amazon S3, AWS Lambda

Many AWS services work just fine by themselves, but even better together! This important aspect of our model allows you to select a single service, learn about it, get some experience with it, and then expand your use to other related services over time. On the other hand, opportunities to make the services work together are ever-present, and we have a number of them on our customer-driven roadmap.

Today I would like to tell you about two new features for Amazon Aurora, our MySQL-compatible relational database:

Lambda Function Invocation – The stored procedures that you create within your Amazon Aurora databases can now invoke AWS Lambda functions.

Load Data From S3 – You can now import data stored in an Amazon Simple Storage Service (S3) bucket into a table in an Amazon Aurora database.

Because both of these features involve Amazon Aurora and another AWS service, you must grant Amazon Aurora permission to access the service by creating an IAM Policy and an IAM Role, and then attaching the Role to your Amazon Aurora database cluster. To learn how to do this, see Authorizing Amazon Aurora to Access Other AWS Services On Your Behalf.

Lambda Function Integration
Relational databases use a combination of triggers and stored procedures to enable the implementation of higher-level functionality. The triggers are activated before or after some operations of interest are performed on a particular database table. For example, because Amazon Aurora is compatible with MySQL, it supports triggers on the INSERT, UPDATE, and DELETE operations. Stored procedures are scripts that can be run in response to the activation of a trigger.

You can now write stored procedures that invoke Lambda functions. This new extensibility mechanism allows you to wire your Aurora-based database to other AWS services. You can send email using Amazon Simple Email Service (SES), issue a notification using Amazon Simple Notification Service (SNS), publish metrics to Amazon CloudWatch, update an Amazon DynamoDB table, and more.

At the application level, you can implement complex ETL jobs and workflows, track and audit actions on database tables, and perform advanced performance monitoring and analysis.

Your stored procedure must call the mysql.lambda_async procedure. This procedure, as the name implies, invokes your desired Lambda function asynchronously and does not wait for it to complete before proceeding. As usual, you will need to give your Lambda function permission to access any desired AWS services or resources.
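
Here’s a quick sketch of what the call looks like from a MySQL session, assuming the two-argument form (a function ARN and a JSON payload); the endpoint, credentials, function name, and payload are all placeholders:

$ mysql -h mycluster.cluster-abc123.us-east-1.rds.amazonaws.com -u admin -p mydb \
    -e "CALL mysql.lambda_async('arn:aws:lambda:us-east-1:123456789012:function:AuditWriter', '{\"table\": \"orders\", \"action\": \"insert\"}');"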

To learn more, read Invoking a Lambda Function from an Amazon Aurora DB Cluster.

Load Data From S3
As another form of integration, data stored in an S3 bucket can now be imported directly into Aurora (up until now you would have had to copy the data to an EC2 instance and import it from there).

The data can be located in any AWS region that is accessible from your Amazon Aurora cluster and can be in text or XML form.

To import data in text form, use the new LOAD DATA FROM S3 command. This command accepts many of the same options as MySQL’s LOAD DATA INFILE, but does not support compressed data. You can specify the line and field delimiters and the character set, and you can ignore any desired number of lines or rows at the start of the data.
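
Here’s a rough sketch of a typical invocation (the bucket, file, and table are placeholders, and the exact form of the S3 URI can vary by region, so check the documentation for the details):

$ mysql -h mycluster.cluster-abc123.us-east-1.rds.amazonaws.com -u admin -p mydb \
    -e "LOAD DATA FROM S3 's3://my-bucket/orders.txt'
        INTO TABLE orders
        FIELDS TERMINATED BY ','
        LINES TERMINATED BY '\n'
        IGNORE 1 LINES;"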

To import data in XML form, use the new LOAD XML FROM S3 command. Your XML can look like this:

<row column1="value1" column2="value2" />
<row column1="value1" column2="value2" />

Or like this:

<row>
  <column1>value1</column1>
  <column2>value2</column2>
</row>

Or like this:

<row>
  <field name="column1">value1</field>
  <field name="column2">value2</field>
</row>

To learn more, read Loading Data Into a DB Cluster From Text Files in an Amazon S3 Bucket.

Available Now
These new features are available now and you can start using them today!

There is no charge for either feature; you’ll pay the usual charges for the use of Amazon Aurora, Lambda, and S3.



AWS Week in Review – October 10, 2016

by Jeff Barr | in Week in Review

Twenty-four (24) external and internal contributors worked together to create this edition of the AWS Week in Review. If you would like to join the party, please visit the AWS Week in Review on GitHub. I am also about to open up some discussion on a simplified and streamlined submission process.


October 10


October 11


October 12


October 13


October 14


October 15


October 16

New & Notable Open Source

New SlideShare Presentations

New Customer Success Stories

Upcoming Events

Help Wanted

New AWS Marketplace Listings

Stay tuned for next week! In the meantime, follow me on Twitter and subscribe to the RSS feed.