Category: Nonprofit

Landsat on AWS: Half a Year, Half a Billion Requests

A few weeks ago, we had the chance to attend the world’s largest gathering of earth scientists at the American Geophysical Union (AGU) Fall Meeting.

More and more, research in areas such as climate change, agricultural resilience, and space exploration relies on access to computing resources in the cloud. Because the cloud makes it easy to share massive amounts of data and lets researchers pay only for the computing resources they need, it helps them accelerate their pace of research while reducing costs.

Last year at the AGU Conference, we announced Landsat on Amazon Web Services (AWS), a service to make Landsat data available for anyone to access from Amazon Simple Storage Service (Amazon S3). Today, over 250,000 Landsat 8 scenes are freely available from Landsat on AWS. All Landsat 8 scenes from 2015 are available, along with a selection of cloud-free scenes from 2013 and 2014. All new Landsat 8 scenes are made available each day (~680 per day), often within hours of production.

Data available near on-demand IT resources

Landsat on AWS is designed to allow fast access to Landsat data via a RESTful interface, reducing the time required for analysis. The data shared via Amazon S3 can be transferred programmatically and quickly to AWS cloud computing resources, and researchers can analyze data without needing to download it or store it themselves.

Landsat on AWS makes each band of each Landsat scene available as a stand-alone GeoTIFF, and scene metadata are available as text and JSON files. These individual files allow efficient, targeted data access. Landsat on AWS GeoTIFFs are internally tiled, which lets users issue HTTP range GET requests to retrieve 512×512-pixel tiles within each scene. This allows highly targeted access to data by geography.
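As a concrete sketch of this access pattern, the snippet below builds the public URL for one band of a Landsat 8 scene and the HTTP header for a range GET. The bucket name and the `L8/path/row/scene/scene_Bn.TIF` key layout reflect the `landsat-pds` bucket as documented at launch, and the scene ID shown is illustrative; verify both before relying on them. Locating a specific internal tile would additionally require parsing the GeoTIFF’s tile index, which is omitted here.

```python
# Sketch: build the public URL for one band of a Landsat 8 scene and the
# header for an HTTP range GET that fetches only the first kilobyte.

def scene_band_url(path, row, scene_id, band):
    """Return the public HTTPS URL for a single-band GeoTIFF."""
    return (
        "https://landsat-pds.s3.amazonaws.com/"
        f"L8/{path:03d}/{row:03d}/{scene_id}/{scene_id}_B{band}.TIF"
    )

def range_header(first_byte, last_byte):
    """HTTP Range header for a partial GET (inclusive byte offsets)."""
    return {"Range": f"bytes={first_byte}-{last_byte}"}

url = scene_band_url(40, 36, "LC80400362015196LGN00", 4)
headers = range_header(0, 1023)
# With the requests library, the partial GET would look like:
#   resp = requests.get(url, headers=headers)  # expect 206 Partial Content
```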

Half a year, half a billion requests

Within the first 150 days of the launch of Landsat on AWS (19 March 2015 to 16 August 2015), Landsat imagery and metadata were requested over 500 million times, globally.

The most requested WRS (Worldwide Reference System) path/row combination is 040/036, which includes the southern California high desert and the location of the 2015 Lake Fire. The scar of the fire is rust colored in the visualization below, which is based on data acquired on 15 July 2015. This false color composite visualization was made in minutes with Snapsat, a web application built on AWS.

AGU attendees who learned about Landsat on AWS were eager to start using it themselves, to share it with their students, or to use Amazon S3 to share similar data sets.

Learn how to access Landsat on AWS at


New Tools for Using Real-Time and Archived NEXRAD Weather Data on AWS

In October, we announced that the real-time feed and full historical archive of original resolution (Level II) NEXRAD data are freely available on Amazon Simple Storage Service (Amazon S3) for anyone to use. The Next Generation Weather Radar (NEXRAD) system is a network of 160 high-resolution Doppler radar sites that enables severe storm prediction and is used by researchers and commercial enterprises to study and address the impact of weather across multiple sectors.
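As an illustrative sketch, the helper below builds the S3 key prefix for one day of Level II volume scans from a single radar site. The `noaa-nexrad-level2` bucket name and the `YYYY/MM/DD/SITE/` key layout are assumptions based on the public archive’s documented structure; confirm them on the NEXRAD on AWS page before use.

```python
# Sketch: build the S3 key prefix covering one day of Level II volume
# scans from a single NEXRAD site in the public archive bucket.
from datetime import date

BUCKET = "noaa-nexrad-level2"

def day_prefix(d, site):
    """Key prefix for every volume scan from `site` on day `d`."""
    return f"{d.year:04d}/{d.month:02d}/{d.day:02d}/{site}/"

prefix = day_prefix(date(2015, 5, 6), "KTLX")  # "2015/05/06/KTLX/"
# With boto3, you would then page through the matching objects:
#   s3 = boto3.client("s3")
#   for page in s3.get_paginator("list_objects_v2").paginate(
#           Bucket=BUCKET, Prefix=prefix):
#       ...
```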

Early adopters have used the data to cut their product development time and ask new questions about weather-related phenomena. Today, we’re excited to share two new tools that make it even easier for you to analyze NEXRAD data and incorporate it into your workflows.

WeatherPipe for archive analysis

Before NEXRAD on AWS, it was impossible to access the full NEXRAD Level II archive on demand, which limited the types of analysis researchers could perform with the data. Dr. Michael Baldwin, an Associate Professor in the Department of Earth, Atmospheric, and Planetary Sciences at Purdue University, recalls struggling to obtain funding for NEXRAD-related research because it was deemed “technically impossible” to get enough data to perform the analysis.

He said, “As soon as I heard about NEXRAD on AWS, I got very excited about the impact for science. Having the archive available on demand on AWS opens a new world of possibilities. I’m excited to dust off that proposal and incorporate NEXRAD into my research.”

Baldwin turned to his colleague Stephen Harrell to help make it easier for students and researchers to analyze the NEXRAD data. This led to the development of WeatherPipe, an open source Java tool that streamlines the process of running a MapReduce job over NEXRAD data on AWS.

WeatherPipe marshals the NEXRAD data into usable data structures and runs the job in Amazon Elastic MapReduce (EMR). The output is a NetCDF file that you can display in Unidata’s Integrated Data Viewer (IDV) and other visualization tools.

Harrell, who works in Purdue’s research computing office and is completing a degree in the Computer Science department, worked with three classmates (Lala Vaishno De, Hanqi Du, and Xiaoyang Lin) to develop the WeatherPipe prototype in a matter of weeks. They’ve open sourced it to allow anyone to use the tool and contribute to the code.

Currently, the tool produces average radar reflectivity over time. Next, Harrell and Baldwin plan to use the tool to run more advanced and specific analyses, such as storm identification and classification. Ultimately, Baldwin wants to create a predictive model for high-impact weather events, such as tornadoes.

Notifications for event-based processing

For many NEXRAD users, it’s important to get new data as soon as it’s available. This is true for both the “volume scan” archive files (the data collected by the Doppler radar site as it scans the atmosphere) and the “chunks” data (smaller packages of data that are quickly transmitted as a real-time feed).

One of the top requests from early users was for an easier way to incorporate the NEXRAD data into event-driven workflows. Today, we’re excited to announce that notifications are now available for both types of data.

We have set up public Amazon Simple Notification Service (SNS) topics for the “chunks” and archive data that create a notification for every new object added to the Amazon S3 buckets. To start, you can subscribe to these notifications using Amazon Simple Queue Service (SQS) and AWS Lambda. This means you can automatically add new real-time and near-real-time NEXRAD data into a queue or trigger event-based processing if the data meets certain criteria, such as geographic location.
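As a sketch of such event-based processing, a minimal AWS Lambda handler in Python might unpack the SNS notification and pull out the bucket and key of each newly added object. The nested event shape (an S3 event serialized as JSON in each record’s `Sns.Message` field) follows the standard SNS-to-Lambda format; any filtering by geography or radar site would go where the pairs are collected.

```python
# Sketch of a Lambda handler subscribed to the public NEXRAD SNS topics.
# Each SNS record wraps an S3 event as a JSON string in its "Message"
# field; the handler returns the (bucket, key) of every new object.
import json

def handler(event, context):
    """Extract (bucket, key) pairs from an SNS-wrapped S3 event."""
    new_objects = []
    for record in event.get("Records", []):
        s3_event = json.loads(record["Sns"]["Message"])
        for s3_record in s3_event.get("Records", []):
            bucket = s3_record["s3"]["bucket"]["name"]
            key = s3_record["s3"]["object"]["key"]
            # Site/geography filtering could be applied to `key` here.
            new_objects.append((bucket, key))
    return new_objects
```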

Visit our NEXRAD on AWS page for information on subscribing to these SNS topics and incorporating them into workflows. We’re excited to see what you do with this new capability!

Getting started with NEXRAD on AWS

In addition to these new tools, you can find tutorials from Unidata, The Climate Corporation, and CartoDB on our NEXRAD on AWS page to help you get started. Unidata has also made the NEXRAD Level II archive data available via their THREDDS Data Server, and you can browse the archive contents via the AWS JavaScript S3 Explorer.

Educators, researchers, and students can also apply for free AWS credits to take advantage of the utility computing platform offered by AWS, along with public data sets such as NEXRAD on AWS. If you have a research project that could take advantage of NEXRAD on AWS, you can apply for an AWS Grant.

We’d love to feature more tools and stories. Tell us how you’re using the data via the NEXRAD on AWS page!

(Big) Data Driven Politics: 800m Data Points

Earlier this week you heard about MPAC’s journey to the cloud from the stage at AWS re:Invent, and today we are excited to share Quorum’s story.

Think big data: everything political, from bills and votes to tweets, letters, and more.

With an election year coming up, politics is taking center stage. Whether you vote red or blue, much more goes on behind the scenes to drive politics today.

All of this activity generates data. Every hearing, every bill, every vote, every tweet creates more and more data, and all of it can be analyzed to reveal telling trends that inform politics and shape legislative strategies. Without technology, however, much of this data is mined by hand, costing time and productivity.

This frustration led to the creation of Quorum. Quorum is an online legislative strategy platform that gives legislative professionals access to the world’s most comprehensive database of legislative information, along with quantitative insights and modern project management tools that make it easier to track legislation, build support, and take action.

As a bootstrapped, born-in-the-cloud startup of 10 employees, Quorum had big ambitions for its large amounts of data, which had to be accessed quickly, easily, and at low cost. So Quorum turned to the AWS Cloud.

Data first

Two years ago, when this project began, there was no central application programming interface (API) at either the federal or state level for collecting this type of data, and no way to search what people were saying on the floor or in committee hearings. Quorum set out to solve this problem and bring transparency and knowledge to political data.

The first step of the project was to find all of the relevant information and build the world’s most comprehensive database of political data. To do this, they built a centrally managed, easily searchable database.

For example, with Quorum, it is now possible to search press releases from every member of Congress at the click of a button. No longer do you have to weed through stacks of paper or search across sites in different formats.

Realizing the benefits of technology, Quorum relied on AWS for the power and flexibility to gather the data it needed and, eventually, analyze it quickly and comprehensively in real time. Once you have access to the data, you can figure out what it means, and then use it to inform and expand your decisions.

Analytics second

Powered by AWS, Quorum was able to leverage compute power from the outset, so instead of just providing access to the data, Quorum can offer insight into the political trends hidden in that big data.

With all of the data collected, Quorum built a quantitative analytics layer on top of it. They built analytic tools and data set filters to find insights. Quorum’s algorithms process over 800 million data points to calculate hundreds of different statistics about each member of Congress’ legislative history.

Additionally, a number of tasks can be automated in software. Quorum offers tools that automatically identify all changes between versions of a bill, create legislative scorecards on votes, bills, and amendments, and rank all 435 congressional districts by over 1,000 demographic statistics. Quorum also facilitates legislative tracking, allowing users to assign members of Congress to lists and keep track of notes and outreach to each of the 535 members.
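As an illustrative sketch (not Quorum’s actual code), the kind of bill-version comparison described above can be prototyped in a few lines with Python’s standard `difflib` module; the sample bill text is invented for the example.

```python
# Sketch: find the lines added and removed between two versions of a
# bill using a standard unified diff.
import difflib

def bill_changes(old_text, new_text):
    """Return (added_lines, removed_lines) between two bill versions."""
    diff = difflib.unified_diff(
        old_text.splitlines(), new_text.splitlines(), lineterm="")
    added, removed = [], []
    for line in diff:
        if line.startswith("+") and not line.startswith("+++"):
            added.append(line[1:])
        elif line.startswith("-") and not line.startswith("---"):
            removed.append(line[1:])
    return added, removed

old = "Sec. 1. Funding is set at $5 million.\nSec. 2. Takes effect in 2016."
new = "Sec. 1. Funding is set at $7 million.\nSec. 2. Takes effect in 2016."
added, removed = bill_changes(old, new)
# added   == ["Sec. 1. Funding is set at $7 million."]
# removed == ["Sec. 1. Funding is set at $5 million."]
```

A production system would diff at the word or clause level and align renumbered sections, but the principle is the same.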

By creating a comprehensive database of legislative information, Quorum helps Congress benefit from modern technology and data. Watch this video to hear from Jonathan Marks, Cofounder of Quorum, and learn more about Quorum and the cloud.


“Because We Had No Choice”: MPAC Achieved Over 75% Savings

At this year’s AWS re:Invent, our annual user conference, we invited several customers to speak to the public sector audience in attendance. The morning was kicked off by Teresa Carlson, who you heard from in our first blog, and then three of our customers took the stage to describe their challenges and how they used AWS to help achieve their missions.

Have you ever felt like you have been given an impossible challenge?

Nicole McNeill, CFO and VP of Corporate Services for the Municipal Property Assessment Corporation (MPAC), was tasked with cutting the business cycle in half, with no budget dollars allocated and no history or proven techniques to draw from. She and her team did it, because they had no choice. In order to survive, they needed to evolve.

The Municipal Property Assessment Corporation (MPAC) is an independent, not-for-profit corporation funded by all Ontario municipalities. Their role is to accurately value and classify all properties in Ontario according to the Assessment Act and regulations established by the Ontario Government. They are the largest assessment jurisdiction in North America, assessing and classifying more than five million properties with an estimated total value of $2.2 trillion.

Back in 2013, they launched a new strategic plan with bold goals to do more with less, and drive efficiency and effectiveness for the citizens they serve. Their goals were built around trying to save $20 million and provide better public services. And they had just four years to do it.

Getting started

Their IT organization consisted of 160 staff with a $30 million budget, at times paralyzed by keeping the lights on and the care-and-feeding of existing infrastructure. Only five percent of their resources and time went to innovation; 95 percent was focused on run and build. Fear of failure and a desire for certainty drove a culture resistant to change. Systems either worked or predictably didn’t, and staff simply worked around them. The greatest challenges, and the greatest opportunities, were often left unaddressed.

Innovation and evolution through the cloud became increasingly obvious and necessary for their survival. This prompted their first experiment.

To improve their customer-facing portal for over five million property owners in Ontario, a handful of IT staff set up an AWS account on a purchasing card, spending an average of $50 to $100 per month. They believed they could do it faster and better than they ever had before.

They turned to AWS and built an architecture that was open source, cloud-based, and secure.

And the results?

Within only three months, they had achieved over 75% savings, with over 200 user accounts, two million inquiries, and a privacy-by-design framework. Based on that success, they made an executive-level commitment to the cloud: a journey to 100% cloud, zero owned infrastructure, everything on demand, and dramatically lower IT costs.

The journey continues

With the first success under their belt, the next challenge was to cut the business cycle in half. At the time, their data was decentralized across multiple evaluation platforms, causing painful system performance at times. It took 40 hours to value 80,000 properties, whereas they needed to value all five million properties in four hours.
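A quick back-of-the-envelope calculation from those figures shows the scale of the required speed-up:

```python
# Throughput implied by the figures above (properties per hour).
old_rate = 80_000 / 40        # old platform: 2,000 properties/hour
target_rate = 5_000_000 / 4   # goal: 1,250,000 properties/hour
speedup = target_rate / old_rate  # 625x faster
```

In other words, MPAC needed roughly a 625-fold throughput improvement, far beyond what incremental tuning could deliver.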

Therefore, they worked toward what they called the “one version of the truth.”

In order to get to this one version of the truth, MPAC set out to re-architect their platforms. As with any experiment, they faced a few challenges but found solutions to overcome them.

  • Challenge: One data warehouse for data integrity and to enhance speed—Solution: Amazon Redshift.
  • Challenge: One service-enabled evaluation engine—Solution: Open source components running on Amazon EC2.
  • Challenge: Meet privacy statutes—Solution: AWS approved by legal counsel and internal privacy commissioner.
  • Challenge: Monitor speed and performance—Solution: Amazon CloudWatch, Elastic Load Balancing, and Auto Scaling.

The moral of the story

MPAC worked hard on their strategic journey. To overcome their challenges and achieve their mission, it was the responsibility of the leadership to encourage experimentation and embrace failure. With little budget and little time, they still demanded innovation.

This innovation led them to become a business-value IT team that worked 5000% faster at one-tenth of the cost. They took the work out of running IT and instead became consumers of services.

When faced with the impossible, MPAC learned three main lessons.

  1. Encourage experimentation
  2. Cut time
  3. Cut budget

Scarcity creates demand, demand creates innovation, and with innovation you can achieve the seemingly impossible.

To learn more about MPAC, their mission, and how AWS helped them achieve it, watch this video of Nicole McNeill addressing the public sector team at AWS re:Invent.

Watch the full MPAC Case Study here.

Millions of Students in 180 Countries Participate in Hour of Code

During December 7-13, in celebration of Computer Science Education Week, tens of millions of students in 180+ countries participated in Code.org’s “Hour of Code,” powered by Amazon Web Services (AWS). The Hour of Code is a one-hour introduction to computer science, designed to demystify code and show that anybody can learn the basics.

To show support for local schools in the Washington, DC area, AWS participated in coding events with over 600 students at Lafayette Elementary School in Washington, DC and Patrick Henry Elementary School in Arlington, VA.

Coding in the classroom

With volunteers from AWS, kindergarten through fifth grade students got to participate in an Hour of Code. AWS volunteers walked the students through coding activities themed on Minecraft, Star Wars, Frozen, and Angry Birds, giving students the chance to code the games they play in real life. They solved puzzle challenges, worked through mazes, built houses, and more, all on the computer. Children of all ages were matched with activities for their age and skill level, and at the end of the hour session they were able to explain what coding is and how it is used.

As volunteers, we had the opportunity to see the kids’ faces light up as they completed challenges they once thought were too difficult. Seeing them overcome the obstacles and get excited about coding showed how coding can be fun and kids of all ages can do it.

Students left energized, inspired, and wanting to code more, at home and at school. One of the goals of the Hour of Code is to introduce coding to students at a young age and give them the resources they need to learn. By helping students as young as five understand what coding is, its real-life use cases, and how to get started, the program encourages them to engage with technology and potentially consider careers in it. The more resources students have available, the more they want to keep learning.

The Hour of Code is geared toward school-aged children, but anyone can participate and begin to learn the basics of coding. Volunteers, teachers, and students all got involved during the day.

Scaling to meet demand

The events at Lafayette and Patrick Henry Elementary Schools were only two of the thousands of sites participating around the globe. Code.org needed the ability to handle extreme spikes in traffic as students logged on and participated in the activities worldwide: the site receives a whole year’s worth of typical traffic in one month, and it had to scale so the site could withstand that kind of surge, reaching tens of millions of students.

Using Amazon CloudFront, Code.org hosts the tutorials all over the world with the help of the AWS global infrastructure and multiple regions for redundancy and high availability. So wherever you are in the world, and no matter how many people are on the site, you can access the tutorials quickly and responsively.

Start coding today!

Welcome to the AWS Innovating in the Public Sector Blog

A post by Teresa Carlson, Vice President of Amazon Web Services Worldwide Public Sector


I am thrilled to welcome you to Amazon Web Services’ “Innovating in the Public Sector” blog! This blog is all about you, our public sector customers, and will be your place for all public sector-specific content.

I’ve had the pleasure of traveling to many parts of the world – from Singapore to Bahrain to São Paulo to all over the U.S. – to meet with local and national government agencies, educational institutions, and nonprofit organizations. I kept hearing about the same challenges, needs, and best practices that could be leveraged all over the world, so we decided to dedicate this space to sharing what we’ve learned with you.

Government, education, and nonprofits are faced with unique challenges, requirements, and missions. We want to bring you the latest content on the topics that matter most to you. From security to criminal justice to educational research, each week our dedicated team covering national, regional, and local government, education, and nonprofits will share insights with you.

By using the cloud, our customers are paving the way for innovation and making the world a better place. Some examples are highlighted below.

Cloud computing changing the game in the public sector

Cloud is becoming the new normal around the world. Currently we have 2,000+ government agencies, 5,000+ educational institutions, and 17,500+ nonprofit organizations changing the game with the AWS Cloud.

Whether it is for development and testing, enterprise applications, high-performance computing, storage, disaster recovery, web, mobile, and social apps, virtual desktops, or data center migrations, government agencies, educational institutions, and nonprofits are using AWS to help achieve their missions.

Also, we have a strong and growing community of partner companies that offer a wide range of products and services on the AWS platform to address virtually any use case.

Instead of buying, owning, and maintaining their own data centers and servers, organizations can acquire technology such as compute power, storage, databases, and other services on an as-needed basis, much as consumers flip a switch to turn on the lights at home and the power company sends electricity.

Working together to bring you the resources you need, on the topics you care about

Check out the video from our public sector breakfast at this year’s AWS re:Invent where I share the AWS public sector footprint around the world, the partners who make it possible in an industry moving in the right direction for cloud, and the latest customer innovations using these technologies.

From City on a Cloud to #smartisbeautiful to the latest data sets released, read below about the innovative ideas being driven in the public sector.

  • #smartisbeautiful. We are working to encourage girls and women to pursue computer science and working with our university partners to create computer science organizations for the women on their campuses.
  • Latest Data Sets. We host a selection of public data sets that demonstrate the power of open data on the cloud. These are accessible at no cost, and drive new businesses, accelerate research, and improve lives. By hosting key public data sets, AWS can enable more innovation, more quickly, creating additional opportunities for public good.
  • City on a Cloud. From healthcare to utilities and from transit to city planning, local and regional governments are embracing innovation.  Take a look at what these agencies and their partners are doing to move government forward on behalf of their citizens.

As we begin to publish content weekly, I am reminded that great projects have to start with great partnerships, so we want to understand how we can better serve you. Reach out to us at with questions or comments.

And receive the latest updates from AWS government and education by following us on Twitter at @AWS_Gov and @AWS_Edu.