Unbelievably, it is March already. As you enter the madness of March, don't forget to take some time to learn more about the latest service innovations from AWS. Each month, we have a series of webinars targeting best practices and new service features in the AWS Cloud.
Below is the schedule for the live, online technical sessions scheduled for the month of March. Remember, these talks are free, but they fill up quickly, so register ahead of time. All session times are shown in Pacific Time (PT).
Webinars featured this month are as follows:
Tuesday, March 21
9:00 AM – 10:00 AM: Deploying a Data Lake in AWS
10:30 AM – 11:30 AM: Optimizing the Data Tier for Serverless Web Applications
12:00 Noon – 1:00 PM: One Click Enterprise IoT Services
Wednesday, March 22
10:30 AM – 11:30 AM: ElastiCache Deep Dive: Best Practices and Usage Patterns
12:00 Noon – 1:00 PM: A Deeper Dive into Apache MXNet on AWS
Thursday, March 23
9:00 AM – 10:00 AM: Developing Applications with the IoT Button
10:30 AM – 11:30 AM: Automating Management of Amazon EC2 Instances with Auto Scaling
Friday, March 24
10:30 AM – 11:30 AM: An Overview of Designing Microservices Based Applications on AWS
Monday, March 27
9:00 AM – 10:00 AM: How to get the most out of Amazon Polly, a text-to-speech service
Tuesday, March 28
10:30 AM – 11:30 AM: Getting the Most Out of the New Amazon EC2 Reserved Instances Enhancements
12:00 Noon – 1:30 PM: Getting Started with AWS
Wednesday, March 29
9:00 AM – 10:00 AM: Best Practices for Managing Security Operations in AWS
10:30 AM – 11:30 AM: Deep Dive on Amazon S3
12:00 Noon – 1:00 PM: Log Analytics with Amazon Elasticsearch Service and Amazon Kinesis
Thursday, March 30
9:00 AM – 10:00 AM: Active Archiving with Amazon S3 and Tiering to Amazon Glacier
10:30 AM – 11:30 AM: Deep Dive on Amazon Cognito
12:00 Noon – 1:00 PM: Building a Development Workflow for Serverless Applications
The AWS Online Tech Talks series covers a broad range of topics at varying technical levels. These technical sessions are led by AWS solutions architects and engineers and feature live demonstrations & customer examples. You can also check out the AWS on-demand webinar series on the AWS YouTube channel.
Jordin Green is our guest writer today, with an inside view from the floor of HIMSS17.
Empathy. It’s not always a word you hear associated with technology, but one might argue it should be a central tenet for the application of technology in healthcare. I was reminded of that fact as I wandered the halls representing AWS at HIMSS17, the Healthcare Information and Management Systems Society annual meeting.
At Amazon, we’re taught to obsess over our customers, but this obsession takes on a new level of responsibility when those customers are directly working on improving the lives of patients and the overall wellness of society. Thinking about the challenges that healthcare professionals are dealing with every day drives home how important it is for AWS to ensure that using the cloud in healthcare is as frictionless as possible. So with that in mind I wanted to share some of the things I saw in and around HIMSS17 regarding healthcare and AWS.
I started my week at the HIMSS Cloud Computing Forum, a new full-day HIMSS pre-day focused on educating the healthcare community on the cloud. I was particularly struck by the breadth of cloud use cases being explored throughout the industry, even compared to a few years ago. The program featured presentations on cloud-based care coordination, precision medicine, and security. Perhaps one of the most interesting presentations came from Jessica Kahn of the Centers for Medicare & Medicaid Services (CMS), talking about the analytics platform that CMS has built in the cloud along with Nuna. Jessica talked about how a cloud-based platform allows CMS to make decisions based on data that is a month old, rather than a year or years old. Additionally, using an automated, rules-based approach, policymakers can directly query the data without having to rely on developers, bringing agility. Nuna has already gained a number of insights from hosting this de-identified Medicaid data for policy research, and is now looking to expand its services to private insurance.
The exhibition opened on Monday, and I was really excited to talk to customers about the new Healthcare & Life Sciences category in the AWS Marketplace. AWS Marketplace is a managed and curated software catalog that helps customers innovate faster and reduce costs by making it easy to discover, evaluate, procure, immediately deploy, and manage third-party software solutions. When a customer purchases software via the Marketplace, all of the infrastructure needed to run on AWS is deployed automatically, using the same pay-as-you-go pricing model that AWS uses. The creation of a dedicated healthcare category is a huge step forward in making it easier for our customers to deploy cloud-based solutions. Our new category features telehealth solutions, products for managing HIPAA compliance, and products that can be used for revenue cycle management from AWS Partner Network (APN) members such as PokitDok. We’re just getting started with this category; look for new additions throughout the year.
Later in the week, I spent time with a number of the APN Partners exhibiting at HIMSS this year, and it’s safe to say our ecosystem also had lots of moments to shine. Orion Health announced that it will migrate its Amadeus precision medicine platform to the AWS Cloud. Orion has been deploying on top of AWS for a while now, notably including the California-wide Health Information Exchange CalINDEX. Amadeus currently manages 110 million patient records; the migration will represent a significant volume of clinical data running on AWS. New APN Partner Merck announced a new Alexa skill challenge, asking developers to come up with new, innovative ways to use Alexa in the management of chronic disease. Healthcare Competency Partner ClearDATA announced its new fully-managed Containers-as-a-Service product, which simplifies development of healthcare applications by providing developers with a HIPAA-compliant environment for building, testing, and deployment.
This is only a small sample of the activity going on at HIMSS this year, and it’s impossible to capture everything in one post. You can learn more about healthcare on AWS on our Cloud Computing in Healthcare page. After spending four days diving into healthcare IT, it was great to see how AWS is enabling our healthcare customers and partners to deliver solutions that are impacting millions of lives across the globe.
In order to make sure that the AWS Blog is meeting your information and entertainment needs, we are planning to conduct some usability panels later this month. We are looking for a mix of local (Seattle) and remote participants with any level of experience reading the blog and/or using AWS. If you participate in a usability panel, you’ll receive an Amazon.com gift card as a token of our appreciation.
If you are interested in participating, sign up today.
We launched EC2 Run Command late last year and have enjoyed seeing our customers put it to use in their cloud and on-premises environments. After the launch, we quickly added Support for Linux Instances, the power to Manage & Share Commands, and the ability to do Hybrid & Cross-Cloud Management. Earlier today we made EC2 Run Command available in the China (Beijing) and Asia Pacific (Seoul) Regions.
Our customers are using EC2 Run Command to automate and encapsulate routine system administration tasks. They are creating local users and groups, scanning for and then installing applicable Windows updates, managing services, checking log files, and the like. Because these customers are using EC2 Run Command as a building block, they have told us that they would like to have better visibility into the actual command execution process. They would like to know, quickly and often in detail, when each command and each code block in the command begins executing, when it completes, and how it completed (successfully or unsuccessfully).
To support this important use case, you can now arrange to be notified when the status of a command or a code block within a command changes. To give you several different integration options, notifications can be delivered via CloudWatch Events or via Amazon Simple Notification Service (SNS).
These notifications will allow you to use EC2 Run Command in true building block fashion. You can programmatically invoke commands and then process the results as they arrive. For example, you could create and run a command that captures the contents of important system files and metrics on each instance. When the command is run, EC2 Run Command will save the output in S3. Your notification handler can retrieve the object from S3, scan it for items of interest or concern, and then raise an alert if something appears to be amiss.
Monitoring Execution Using Amazon SNS
Let’s run a command on some EC2 instances and monitor the progress using SNS.
Following the directions (Monitoring Commands), I created an S3 bucket (jbarr-run-output), an SNS topic (command-status), and an IAM role (RunCommandNotifySNS) that allows the on-instance agent to send notifications on my behalf. I also subscribed my email address to the SNS topic, and entered the command:
And specified the bucket, topic, and role (further down on the Run a command page):
I chose All so that I would be notified of every possible status change (In Progress, Success, Timed Out, Cancelled, and Failed) and Invocation so that I would receive notifications as the status of each instance changes. I could have chosen to receive notifications at the command level (representing all of the instances) by selecting Command instead of Invocation.
I clicked on Run and received a sequence of emails as the commands were executed on each of the instances that I selected. Here’s a sample:
In a real-world environment you would receive and process these notifications programmatically.
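As a rough sketch of what that programmatic handling could look like, here is a hypothetical SNS-triggered Lambda function in Python. The field names (`status`, `commandId`, `instanceId`) are assumptions about the notification payload, not a documented contract, so treat this as a starting point rather than a finished handler:

```python
import json

# Statuses worth alerting on. These mirror the status values listed above
# (In Progress, Success, Timed Out, Cancelled, Failed); the exact spellings
# in the message body are an assumption.
ALERT_STATUSES = {"Failed", "TimedOut", "Cancelled"}


def needs_alert(message):
    """Return True if a Run Command status-change message warrants an alert."""
    return message.get("status") in ALERT_STATUSES


def handler(event, context):
    """Hypothetical Lambda entry point for SNS-delivered notifications."""
    for record in event.get("Records", []):
        # SNS delivers the notification body as a JSON string.
        message = json.loads(record["Sns"]["Message"])
        if needs_alert(message):
            # A real handler might fetch the command output from the S3
            # bucket named in the notification, scan it for items of
            # interest, and page an operator instead of just printing.
            print("ALERT: command {0} on {1} ended with status {2}".format(
                message.get("commandId"),
                message.get("instanceId"),
                message.get("status")))
```

Keeping the alerting decision in the pure `needs_alert` helper makes that logic easy to test apart from any AWS plumbing.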
Monitoring Execution Using CloudWatch Events
I can also monitor the execution of my commands using CloudWatch Events. I can send the notifications to an AWS Lambda function, an SQS queue, or an Amazon Kinesis stream.
For illustrative purposes, I used a very simple Lambda function:
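A minimal function of that sort, sketched here rather than reproduced from the post, might simply log the incoming event; the `detail.status` field is an assumption about the shape of the CloudWatch Events payload:

```python
import json


def lambda_handler(event, context):
    # Print the full CloudWatch Events payload so that every Run Command
    # status change shows up in this function's CloudWatch log stream.
    print("Run Command status change received:")
    print(json.dumps(event, indent=2))

    # Return the status (assumed to live under event["detail"]["status"])
    # to make the handler easy to exercise by hand.
    return event.get("detail", {}).get("status")
```

Pointing the CloudWatch Events rule at a function like this makes each status change visible in the function's log stream.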
I created a rule that would invoke the function for all notifications issued by the Run Command (as you can see below, I could have been more specific if necessary):
I saved the rule and ran another command, and then checked the CloudWatch metrics a few seconds later:
I also checked the CloudWatch log and inspected the output from my code:
This feature is available now and you can start using it today.
Monitoring via SNS is available in all AWS Regions except Asia Pacific (Mumbai) and AWS GovCloud (US). Monitoring via CloudWatch Events is available in all AWS Regions except Asia Pacific (Mumbai), China (Beijing), and AWS GovCloud (US).
I am a firm believer in the value of continuing education. These days, the half-life of knowledge on any particular technical topic seems to be less than a year. Put another way, once you stop learning, your knowledge base will be just about obsolete within 2 or 3 years!
In order to make sure that you stay on top of your field, you need to decide to learn something new every week. Continuous learning will leave you in a great position to capitalize on the latest and greatest languages, tools, and technologies. By committing to a career marked by lifelong learning, you can be sure that your skills will remain relevant in the face of all of this change.
Keeping all of this in mind, I am happy to be able to announce that we will be holding an AWS DevDay in San Francisco on June 21st. The day will be packed with technical sessions, live demos, and hands-on workshops, all focused on some of today’s hottest and most relevant topics. If you attend the AWS DevDay, you will also have the opportunity to meet and speak with AWS engineers and to network with the AWS technical community.
Here are the tracks:
- Serverless – Build and run applications without having to provision, manage, or scale infrastructure. We will demonstrate how you can build a range of applications from data processing systems to mobile backends to web applications.
- Containers – Package your application’s code, configurations, and dependencies into easy-to-use building blocks. Learn how to run Docker-enabled applications on AWS.
- IoT – Get the most out of connecting IoT devices to the cloud with AWS. We will highlight best practices using the cloud for IoT applications, connecting devices with AWS IoT, and using AWS endpoints.
- Mobile – When developing mobile apps, you want to focus on the activities that make your app great and not the heavy lifting required to build, manage, and scale the backend infrastructure. We will demonstrate how AWS helps you easily develop and test your mobile apps and scale to millions of users.
We will also be running a series of hands-on workshops that day:
- Zombie Apocalypse Workshop: Building Serverless Microservices.
- Develop a Snapchat Clone on AWS.
- Connecting to AWS IoT.
Registration and Location
There’s no charge for this event, but space is limited and you need to register quickly in order to attend.
All sessions will take place at the AMC Metreon at 135 4th Street in San Francisco.
Many of my colleagues will be heading to San Francisco next week for the RSA conference. In order to make your time in San Francisco even more worthwhile, you may want to consider attending some security sessions at the AWS Pop-up Loft on Market Street (a short walk from Moscone Center). The talks (from our team and from our customers) will cover a wide variety of topics including securing IoT devices and applications, protecting millions of customers, and complying with government regulations.
Here is the schedule:
On Monday, February 29th, the AWS Security, Risk, Compliance, Security Solution Architecture, and Professional Services teams will be holding a happy hour / mixer from 3 PM to 5 PM. You can meet members of the teams, learn about our security assurance automation program, ask questions, learn about opportunities on the team, and find out more about their plans. The first 100 attendees can also grab a free T-shirt!
On Tuesday and Wednesday, we have a full agenda:
| Time | Tuesday, March 1, 2016 | Wednesday, March 2, 2016 |
|------|------------------------|--------------------------|
| 10 AM – 11 AM | Securing Things with Other Things (Don “Beetle” Bailey, AWS) | Security Incident Response Simulations (SIRS) with Coinbase (Jon Miller, AWS / Armando Leite, AWS / Rob Witoff, Coinbase) |
| 11 AM – Noon | Defending, Detecting, and Blocking Hardware and Firmware Attacks (Ted Reed, Facebook) | Security by Design – Modernizing Technology Governance in the Cloud (Tim Sandage, AWS) |
| Noon – 1 PM | Lunch (Provided) | Lunch (Provided) |
| 1 PM – 2 PM | Inception: Using the Cloud to Secure Data Before, During, and After the Cloud (Adam Ghetti, Ionic Security) | Scaling Security for Your First 10 Million Customers (Hart Rossman, AWS & Bill Shinn, AWS) |
| 2 PM – 3 PM | DevSecOps with Stelligent (Matt Bretan, AWS & Henrik Johansson, AWS) | Data Integrity in GxP Systems Using AWS Products (Chris Walley, AWS) |
| 3 PM – 4 PM | Logs, Logs, Logs, and What to Do with Them (Will Kruse, AWS) | Automating Industry Best Practices for Securing Your Cloud (Blake Frantz, AWS) |
| 4 PM – 5 PM | Security Team Meetup | Programmatic Security in AWS (John Martinez, Evident.io) |
| 6 PM – 9 PM | Shared Responsibility Model for Cloud Security (best practices presentations and panel discussion) | |
The sessions are free and you need not register in advance. Space is limited so you’d best get there early!
Late last year I traveled to Barcelona and delivered the keynote address at one of the final AWS Summit events of 2015.
My re:Invent recap was well received; the AWS users and partners in the area were excited by our newest services and left the session eager to learn more and to put them to use. As part of the keynote session, attendees learned about AWS security from my colleague Bill Murray and also had the opportunity to hear how several of our customers had used AWS.
After the keynote, attendees spent the afternoon attending deep-dive technical sessions, exploring the partner area, and meeting with each other. We wrapped up the day with snacks, drinks, and a Battle of the Bands. It was wonderful to meet so many AWS users and to learn more about their businesses and their applications.
Summits for 2016
We are now making plans for our 2016 Global Summit series. As in past years, we will focus on education—we want to make sure that you know as much as possible about whatever aspects of AWS are of interest to you, and we want you to leave the summit charged up and ready to build something new, cool, and powerful!
To see the list of cities and dates, check out the AWS Summits page. Click on the Want More Information button to express your interest in a local Summit and to receive registration information as it becomes available.
Last week, I had the opportunity to attend and speak at TIA 2013, the third annual conference of the Telecommunications Industry Association (TIA), the leading trade association representing the global information and communications technology (ICT) industry through standards development, policy initiatives, business opportunities, market intelligence, and networking events.
The 3 1/2-day conference’s theme was “The Future of the Network,” and there was an interesting lineup of keynote speakers to cover and lead insightful discussions around the future of networks and telecommunications. Obviously, the cloud was at the center of almost all of the discussions.
I had the opportunity to speak with Bryson Koehler, Executive Vice President and Chief Information Officer of The Weather Company (TWC). I thought it would be cool to give the TIA audience a real perspective from a real customer running real mission-critical workloads on AWS.
Until I started to work on the keynote deck, I have to admit I was not aware of how big The Weather Company is. I did know about The Weather Channel on TV and very frequently visited the Weather.com website. I also knew that it provides millions of people every day with the world’s best weather forecasts, content, and data, connecting with them through television, online, mobile, and tablet screens.
What I did not know was that, through its acquisitions, it is also the world’s leading aviation weather service provider. More than 50,000 flights daily (85% of all major US airlines and 30% of the top 100 global airlines) get data from TWC’s embedded forecasters, systems, and tools. It is also the world’s leading energy forecaster and provides data and insights to insurance and other industries. TWC’s WeatherFX product suite provides a mashup of sales and weather data to analyze and predict product demand based on weather. Weather impacts supply, demand, and prices. For example, a lack of rainfall in Scandinavia impacts reservoirs, thereby increasing power prices; an unexpected wind reduction in Texas taxes the grid and increases power prices; an extended blast of cold from Siberia might route LNG shipments from Asia to Europe. The Weather Company’s products, like WSI, enable energy companies to make faster and smarter trading decisions, from wind generation forecasts impacting power demand to global temperature forecasts impacting gas demand. TWC’s Weather Underground gets data from 30,000 personal weather stations; the site is one of the top 5 web properties and manages a vibrant community of weather enthusiasts who share and consume data and media every day.
Bryson and I delivered a conversational keynote speech at TIA and discussed different strategies of how Enterprises are leveraging the AWS cloud today. I would love to share with you the key takeaways from the presentation.
Watch the Keynote
Weather Changes, So Must IT
Like the weather, business requirements are changing, and with changing times, a change in strategy is required. For TWC, in order to cope with weather changes and continue to deliver the world’s best and most accurate forecasts, they are making massive changes throughout the organization to be cloud-ready. TWC’s transformation into a large-scale, global, big data platform for the world’s weather is founded on an all-in IaaS strategy.
There are multiple ways to get there
Like many traditional enterprises, TWC has a large suite of old and new applications, acquired through multiple acquisitions, targeted at developers, consumers, and enterprises. For any enterprise that has a range of different types of applications, there are multiple strategies for leveraging the cloud:
- Strategy 1: Build All New Apps and Services in the Cloud. Enterprises are seeing real benefits from building 21st-century cloud architectures. All of TWC’s new applications are being purpose-built to run exclusively in the cloud.
- Strategy 2: Augment On-Premises IT Resources with Cloud Capacity. Enterprises are extending their on-premises data centers to the cloud to make their apps better, and are building new apps that integrate with on-premises apps. TWC’s mission-critical, next-generation forecast platform powers Weather.com and The Weather Channel, integrates with the legacy last-mile STARs cable network to deliver data to the Local on the 8s network, and feeds most of their other enterprise-facing products, like WSI. This platform runs on AWS, uses a number of services, and is designed to scale with demand as well as to improve the accuracy of the 3-, 5-, and 7-day and 6-hour forecasts by analyzing multiple different sources of data.
- Strategy 3: Migrate Existing Apps & Data to the Cloud. TWC is seeing great performance and cost benefits by migrating existing applications such as weatherunderground.com Radar, Maps, and Photos. During Hurricane Sandy, they were able to scale the application to 170 instances in order to handle a huge surge in traffic and maintain their 5–15 ms response time – something that they were not able to do in the past with fixed infrastructure.
We are using all 3 strategies simultaneously across the enterprise to get us there lightning fast!
TWC’s journey to the cloud is a phased process. They are seeing tremendous benefits from AWS, such as speed and agility, reduced cost, scalability, and elasticity, and are executing on all 3 strategies described above as part of an all-in AWS strategy.
Watch the Video Interview
In a separate video interview after the keynote, Bryson also discussed how the company is transforming itself around the cloud.
“What really sets Amazon apart is the services they provide. There are so many services that Amazon provides that we’re able to tap into and look at, and then not have to rebuild, and, again, focus our time on what really differentiates us. Amazon’s breadth of service offerings is probably the best asset that they’ve got to help us.”
Like The Weather Company, there are several enterprises undergoing massive transformations, and AWS and cloud computing are at the center of all of them.
Check out the Enterprise IT Track at AWS re:Invent 2013 to learn more about how other enterprises are leveraging different AWS services.
After four months of planning and over 500 emails, I am less than 48 hours away from the start of my 5,000 mile road trip!
I have created the AWS Road Trip site to give you the opportunity to follow my journey. I will be posting photos, videos, maps, and more as I make my way from Boston to Seattle, with Austin as the southern extreme and San Francisco to the west. The clever and generous folks at MapBox have created a very nice interactive map of my journey.
Of course, this map (and many other parts of my trip) are powered by AWS. To learn more about how EC2, CloudFront, S3, and the Simple Email Service work together to create beautiful maps, read Will White’s post, How We Serve Faster Maps from MapBox.
The AWS Road Trip site is hosted in Amazon S3, with DNS provided by Route 53. Content is generated in Octopress and pushed to S3 using S3cmd (see the first post for more information). I will write more about my setup after I return from my trip.
Some of the user groups still have space available (again, see the first post for more information). If I am speaking in your city, sign up now in order to secure a spot for yourself. I have a fresh new presentation, hundreds of AWS T-shirts, some stickers, and some other goodies to distribute.
As much as I would like to, I won’t have time for any other meetings along the way. If you’d like to meet me in person, plan on coming to the user group in your city. I hope to see you there!
Have you ever been to a Startup Weekend event? In the space of 54 hours starting on a Friday evening (here’s a typical schedule), participants pitch business ideas, form teams, implement their ideas, and present their work before a panel of judges. Along the way there’s time for some coaching and a quick lesson in how to give a pitch, but most of the time is reserved for flat-out working with your team. The events are a lot of fun, and a good way to see the entire business formation process on an accelerated timescale.
At the end of the event some of the teams remain intact and go on to turn their weekend project into a fully functional business. A number of companies, including Rover.com (dog boarding), Zaarly (local shopping), and Look.io (mobile chat, acquired by LivePerson) were built on AWS during past Startup Weekend events.
I am happy to announce that AWS is now a Global Technology Sponsor of Startup Weekend! I have to say that I am genuinely excited by this announcement. I’ve been to a couple of events and I’ve seen the process in action. The events attract an eclectic mix of entrepreneurs, developers, marketers, and so forth. Everyone participates; there’s no “stand around and watch” option. The energy level is high and one-time strangers form into teams and do some awesome work together.
Every Startup Weekend participant will receive $100 in AWS credits that can be applied to an existing AWS account or used with a new account (create one here if you need one). These credits can be used for a wide variety of AWS services including EC2, S3, and DynamoDB and are valid for 12 months from the date of activation.