Category: Education


20x: More Mission for the Money at UMUC

After hearing from our nonprofit and government customers over the last month (NGA, Quorum, and MPAC), we want to share a story from the education world about how University of Maryland University College (UMUC) saved $500,000 annually and achieved a 20x performance increase by building a new analytics platform on Amazon Redshift.

UMUC’s mission is to educate working adults, many of whom juggle full-time jobs, family responsibilities and, often, military service around the world. It is the largest public, online university in the United States, offering more than 95 career-relevant programs in some of today’s most in-demand fields. To meet the needs of its busy students, UMUC launched a big-data strategy to power its IT infrastructure and analytics platforms. The result was significant cost savings and a major cultural shift in the organization.

Like many companies and universities, UMUC relied mostly on legacy applications. When it came time to upgrade and refresh those applications, the university saw the market moving to the cloud because of the many benefits it offered. A wide product set, the ability to scale, and the pace of innovation led UMUC to choose AWS. By using AWS, UMUC improved the performance of its analytics platform twentyfold and enabled its engineers to focus on building new applications instead of managing IT infrastructure. The university now runs both academic and administrative functions on AWS.

From Administrators to Engineers

Security is a major priority for all universities. They must manage student data and know who has access to that data and how. According to UMUC, security and compliance should not be a barrier to moving to the cloud; rather, they are a strategic reason to make the move.

“The security that AWS has in place is more far-reaching, rigorous, and thought out than any company or university could do themselves,” said Darren Catalano, vice president of analytics at UMUC.

With security at the forefront, UMUC also looked at cost savings. The university describes the price-value proposition it gets from Amazon Redshift as “incredible,” and it is realizing not only cost savings but also a host of performance gains compared to its expensive legacy applications.

A major shift in the organization since moving to the cloud has been transitioning employees from administrators to engineers. They have moved from “racking and stacking” to “monitoring and building” applications. Employees are moving up in the value chain by performing value-added tasks. From an IT perspective, the benefits are less about the cost savings and more about no longer dealing with the headaches of disaster recovery, back-up, and administration.

AWS allows UMUC to be innovative, scalable, cost effective, and game changing, and the university is now at the forefront of technology, benefiting both its students and its faculty.

To learn more about how AWS can help with your big data needs, visit our Big Data details page: http://aws.amazon.com/big-data/.

Public Sector Customers Excited About the New AWS Region Announcements

To kick off the New Year, the AWS Worldwide Public Sector team is excited about the announcement of our new region in the Republic of Korea and the preannouncement of the Canada region last week.

The AWS Cloud operates 32 Availability Zones within 12 geographic Regions around the world, with 11 more Availability Zones and 5 more Regions coming online throughout the next year in Canada, China, India, Ohio, and the United Kingdom (see the AWS Global Infrastructure page for more info).

The region-based AWS model has proven to be a good standard for our government, education, and nonprofit customers around the globe. Because of the unique needs of these public sector organizations, we understand how important it is for you to exercise complete control over where your data is stored and where it is processed.

Now, Korea-based developers and organizations, as well as multinational organizations with end users in Korea, can securely store and process their data in the new region with single-digit millisecond latency across most of Korea.

Governments, multi-national corporations, and international organizations are at significant crossroads, trying to balance innovation and security. They want the elasticity, scalability, and total cost of ownership (TCO) of cloud computing, but they also must meet significant security requirements to protect data and personal privacy.

With the launch of the AWS Region on Korean soil, public sector organizations will now have the opportunity to move sensitive and mission-critical workloads to AWS.

The Seoul Region consists of two Availability Zones (AZs) at launch. Each AZ includes one or more geographically distinct datacenters, each with redundant power, networking, and connectivity. Each AZ is designed to be resilient to any issues in another AZ, enabling customers to operate production applications and databases that are more highly available, fault tolerant, and scalable than would be possible from a single datacenter.

Additionally, this investment in the Asia Pacific area will enable increased innovation and collaboration in education, nonprofits, scientific computing, and open data efforts.

Public sector customers will find the new AWS Region has services and features like AWS Identity and Access Management (IAM) and AWS Trusted Advisor that can enable secure information technology operations, whether they are managing health records, building out new digital services for citizens, or looking for new ways to collaborate with colleagues. Beyond these security services, public sector customers should also enjoy the elasticity and affordability of our compute, networking, storage, analytics, and database web services. To learn more about AWS Cloud Security, visit here.

Investing in the future of cloud

AWS is also delivering its AWS Educate Program to help promote cloud learning in the classroom with eight local universities, including Sogang University, Yonsei University, and Seoul National University. Since its launch locally in May 2015, over 1,000 Korean students have participated in AWS-related classes and nonprofit e-learning programs, such as “Like a Lion.”

The launch of the Seoul Region marks the fifth AWS Region in Asia Pacific and brings the global total to 12 (with more to come in 2016!).

For more details about this announcement, please see the official posting here.

1776: Where Revolutions Begin

The year 1776 is celebrated in the United States as the official beginning of the country’s freedom, with the Declaration of Independence issued on July 4.

Taking this year as inspiration for its namesake, 1776 is a global incubator and seed fund helping startups transform industries that impact millions of lives every day, in the areas of education, energy and sustainability, health, transportation, and cities.

To encourage startups to envision innovative ideas, 1776 created the Challenge Cup, a competition where the most promising startups share their vision on a global stage.

What is the Challenge Cup?

Each year, 1776 hosts a worldwide tournament called the Challenge Cup. Together with partners and over 50 incubator hosts around the world, 1776 seeks out the most promising, highly scalable startups that are poised to solve the major challenges of our time.

Startups advance through three rounds: Local, Regional and Global Finals. All of the regional winners and a host of wild cards will be invited to participate in the Challenge Cup Global Finals next June in Washington, D.C. They will compete for over $1 million in prizes, as well as spend time with the investors, customers, media and other key connections that can help them succeed on a global scale.

The power to change the world

From the spark of an idea to the first customer to IPO and beyond, the world’s most progressive startups build and grow their businesses on Amazon Web Services (AWS).

We believe entrepreneurs have the power to change the world, and we are excited to partner with 1776 and support others who are dedicating their entrepreneurial journey to the industries that matter most to our lives — education, energy, health, transportation, food, and more. Throughout the Challenge Cup, we will provide winners of the competitions with AWS credits that can be used on eligible cloud services to help them innovate using cloud technology.

We see major opportunities for tech entrepreneurship, particularly for new businesses that need to be enabled locally, from a technology perspective. At AWS, we are committed to improving tech education around the world and want to continue to fuel talent and trained resources. We need to create the right environment for mentorship, between individuals and between businesses. Together, we can bring the right tools, technology, and training to reinvent the business ecosystem with cloud computing technology that allows for economic growth and world-changing outcomes.

We agree with 1776 that the Challenge Cup is much more than a competition — it’s a movement of startups bringing world-changing ideas to life. By working together, we can unleash the creative power of collaboration and technology.

“Our partners are part of this global convening of entrepreneurs and are an integral part of the Challenge Cup. By making it possible for startups to build solutions with minimal capital costs, Amazon Web Services has been a powerful catalyst to the explosion in startup activity around the world. AWS is committed to supporting startups that are impacting essential human needs and we are thrilled to have them be a part of this year’s tournament as one of our global partners,” said Evan Burfield, 1776 co-founder.

From D.C. to Nairobi to Singapore, we can’t wait to see what ideas startups from around the world will bring to the competition. Follow the action at #1776Challenge. Looking to attend an event? Register here.

Landsat on AWS: Half a Year, Half a Billion Requests

A few weeks ago, we had the chance to attend the world’s largest gathering of earth scientists at the American Geophysical Union’s Fall Earth Sciences Conference.

More and more, researchers in the areas of climate change, agricultural resilience, and space exploration rely on access to computing resources in the cloud. Because the cloud makes it easy to share massive amounts of data and lets them pay only for the computing resources they need, they can accelerate their pace of research while reducing costs.

Last year at the AGU Conference, we announced Landsat on Amazon Web Services (AWS), a service to make Landsat data available for anyone to access from Amazon Simple Storage Service (Amazon S3). Today, over 250,000 Landsat 8 scenes are freely available from Landsat on AWS. All Landsat 8 scenes from 2015 are available, along with a selection of cloud-free scenes from 2013 and 2014. All new Landsat 8 scenes are made available each day (~680 per day), often within hours of production.

Available near on-demand IT resources

Landsat on AWS is designed to allow fast access to Landsat data via a RESTful interface, reducing the time required for analysis. The data shared via Amazon S3 can be transferred programmatically and quickly to AWS cloud computing resources, and researchers can analyze data without needing to download it or store it themselves.

Landsat on AWS makes each band of each Landsat scene available as a stand-alone GeoTIFF, and scene metadata are available as text and JSON files. These individual files allow efficient, targeted data access. Landsat on AWS GeoTIFFs have “internal tiling,” which lets users issue HTTP range GET requests to access 512-pixel squares within each scene. This enables highly targeted access to data based on geography.
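As a rough illustration of this kind of targeted access, the Python sketch below fetches a scene’s JSON metadata and then pulls only the first slice of one band’s GeoTIFF with an HTTP range GET. The bucket URL, scene identifier, and file names are placeholders for illustration; use a real scene path from the Landsat on AWS index.

```python
import requests

# Illustrative placeholders -- substitute a real scene path from the
# Landsat on AWS index (http://aws.amazon.com/public-data-sets/landsat/).
SCENE_URL = "https://landsat-pds.s3.amazonaws.com/L8/042/034/LC80420342015173LGN00"
SCENE_ID = "LC80420342015173LGN00"

# Scene metadata is published alongside the imagery as a JSON file
# (file name assumed here for illustration).
meta = requests.get(f"{SCENE_URL}/{SCENE_ID}_MTL.json")
meta.raise_for_status()
print(sorted(meta.json().keys()))

# Each band is a stand-alone, internally tiled GeoTIFF, so an HTTP range
# GET can pull just a portion of the file instead of the whole band.
headers = {"Range": "bytes=0-65535"}  # first 64 KB only
band4 = requests.get(f"{SCENE_URL}/{SCENE_ID}_B4.TIF", headers=headers)
print(band4.status_code, len(band4.content))  # expect 206 Partial Content
```

In practice, geospatial libraries that understand cloud-optimized GeoTIFF layouts issue these range requests for you, but the sketch shows why internal tiling makes geographic subsetting cheap.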

Half a year, half a billion requests

Within the first 150 days of the launch of Landsat on AWS (19 March 2015 to 16 August 2015), Landsat imagery and metadata were requested over 500 million times, globally.

The most requested WRS path/row combination is 040/036, which includes the southern California high desert and the location of the 2015 Lake Fire. The scar of the fire is rust colored in the visualization below, which is based on data acquired on 15 July 2015. This false color composite visualization was made in minutes with Snapsat, a web application built on AWS.

AGU attendees who learned about Landsat on AWS were eager to start using it themselves, to share it with their students, or to use Amazon S3 to share similar data sets.

Learn how to access Landsat on AWS at http://aws.amazon.com/public-data-sets/landsat/

 

New Tools for Using Real-Time and Archived NEXRAD Weather Data on AWS

In October, we announced that the real-time feed and full historical archive of original resolution (Level II) NEXRAD data is freely available on Amazon Simple Storage Service (Amazon S3) for anyone to use. The Next Generation Weather Radar (NEXRAD) is a network of 160 high-resolution Doppler radar sites that enables severe storm prediction and is used by researchers and commercial enterprises to study and address the impact of weather across multiple sectors.

Early adopters have used the data to cut their product development time and ask new questions about weather-related phenomena. Today, we’re excited to share two new tools that make it even easier for you to analyze NEXRAD data and incorporate it into your workflows.

WeatherPipe for archive analysis

Before NEXRAD on AWS, it was impossible to access the full NEXRAD Level II archive on demand. This limited the types of analysis that researchers could perform with the data. Dr. Michael Baldwin, an Associate Professor in the Department of Earth, Atmospheric, and Planetary Sciences at Purdue University, recalls his difficulties with obtaining funding for NEXRAD-related research because it was deemed “technically impossible” to get enough data to perform the analysis.

He said, “As soon as I heard about NEXRAD on AWS, I got very excited about the impact for science. Having the archive available on demand on AWS opens a new world of possibilities. I’m excited to dust off that proposal and incorporate NEXRAD into my research.”

Baldwin turned to his colleague Stephen Harrell to help make it easier for students and researchers to analyze the NEXRAD data. This led to the development of WeatherPipe, an open-source Java tool that streamlines the process of running a MapReduce job with NEXRAD data on AWS.

WeatherPipe marshals the NEXRAD data into usable data structures and runs the job in Amazon Elastic MapReduce (EMR). The output is a NetCDF file that you can display in Unidata’s Integrated Data Viewer (IDV) and other visualization tools.
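For readers who want to inspect that output programmatically rather than in IDV, here is a minimal Python sketch using the netCDF4 library. The file path and variable name are hypothetical placeholders; list the file’s variables first to see the names WeatherPipe actually writes.

```python
from netCDF4 import Dataset  # pip install netCDF4

# Open the NetCDF file produced by a WeatherPipe run (path is illustrative).
with Dataset("weatherpipe_output.nc") as nc:
    # List what the file actually contains before assuming variable names.
    print(list(nc.variables.keys()))

    # "avg_reflectivity" is a hypothetical name for the averaged radar
    # reflectivity grid; substitute the real name from the listing above.
    refl = nc.variables["avg_reflectivity"][:]
    print(refl.shape, refl.min(), refl.max())
```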

Harrell, who works in Purdue’s research computing office and is completing a degree in the Computer Science department, worked with three classmates (Lala Vaishno De, Hanqi Du, and Xiaoyang Lin) to develop the WeatherPipe prototype in a matter of weeks. They’ve open sourced it to allow anyone to use the tool and contribute to the code.

Currently, the tool produces average radar reflectivity over time. Next, Harrell and Baldwin plan to use the tool to run more advanced and specific analyses, such as storm identification and classification. Ultimately, Baldwin wants to create a predictive model for high-impact weather events, such as tornadoes.

Notifications for event-based processing

For many NEXRAD users, it’s important to get new data as soon as it’s available. This is true for both the “volume scan” archive files (the data collected by the Doppler radar site as it scans the atmosphere) and the “chunks” data (smaller packages of data that are quickly transmitted as a real-time feed).

One of the top requests from early users was for an easier way to incorporate the NEXRAD data into event-driven workflows. Today, we’re excited to announce that notifications are now available for both types of data.

We have set up public Amazon Simple Notification Service (SNS) topics for the “chunks” and archive data that create a notification for every new object added to the Amazon S3 buckets. To start, you can subscribe to these notifications using Amazon Simple Queue Service (SQS) and AWS Lambda. This means you can automatically add new real-time and near-real-time NEXRAD data into a queue or trigger event-based processing if the data meets certain criteria, such as geographic location.
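As a rough sketch of what an event-driven setup could look like with boto3, the snippet below creates an SQS queue, subscribes it to one of the public NEXRAD SNS topics, and polls for new-object notifications. The topic ARN shown is a placeholder; use the actual ARNs listed on the NEXRAD on AWS page, and note that in a real setup the queue policy must allow the SNS topic to deliver messages.

```python
import json
import boto3

# Placeholder ARN -- the real "chunks" and archive topic ARNs are published
# on the NEXRAD on AWS page.
NEXRAD_TOPIC_ARN = "arn:aws:sns:us-east-1:123456789012:NewNEXRADLevel2Object"

sqs = boto3.client("sqs", region_name="us-east-1")
sns = boto3.client("sns", region_name="us-east-1")

# Create a queue to receive the notifications. A queue policy permitting
# this SNS topic to send messages is also required (omitted here).
queue_url = sqs.create_queue(QueueName="nexrad-new-data")["QueueUrl"]
queue_arn = sqs.get_queue_attributes(
    QueueUrl=queue_url, AttributeNames=["QueueArn"]
)["Attributes"]["QueueArn"]

sns.subscribe(TopicArn=NEXRAD_TOPIC_ARN, Protocol="sqs", Endpoint=queue_arn)

# Poll for notifications. With default (non-raw) delivery, each SQS message
# body wraps an SNS envelope whose "Message" field is the S3 event JSON.
resp = sqs.receive_message(
    QueueUrl=queue_url, WaitTimeSeconds=20, MaxNumberOfMessages=10
)
for msg in resp.get("Messages", []):
    s3_event = json.loads(json.loads(msg["Body"])["Message"])
    for record in s3_event.get("Records", []):
        print(record["s3"]["bucket"]["name"], record["s3"]["object"]["key"])
    sqs.delete_message(QueueUrl=queue_url, ReceiptHandle=msg["ReceiptHandle"])
```

The same notifications can instead trigger an AWS Lambda function directly, which is the natural fit when you only want to process objects that meet certain criteria, such as geographic location.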

Visit our NEXRAD on AWS page for information on subscribing to these SNS topics and incorporating them into workflows. We’re excited to see what you do with this new capability!

Getting started with NEXRAD on AWS

In addition to these new tools, you can find tutorials from Unidata, The Climate Corporation, and CartoDB on our NEXRAD on AWS page to help you get started using NEXRAD on AWS. Unidata has also made the NEXRAD Level II archive data available via their THREDDS Data Server and you can also browse the archive contents via the AWS JavaScript S3 Explorer.

Educators, researchers, and students can also apply for free AWS credits to take advantage of the utility computing platform offered by AWS, along with public data sets such as NEXRAD on AWS. If you have a research project that could take advantage of NEXRAD on AWS, you can apply for an AWS Grant.

We’d love to feature more tools and stories. Tell us how you’re using the data via the NEXRAD on AWS page!

University of Pennsylvania: 100 Machines for Each Student

The dramatic shift in the resources available to students in today’s classrooms opens up new possibilities in teaching styles. What kind of computer classes did you take while in school? From old-school typing classes to cloud-based learning, students now have access to tools that keep pace with the changing technology landscape. To be job ready after graduation, the entrepreneurs, workers, and researchers of tomorrow can now learn with the cloud while still in school.

One example of bringing the cloud to the classroom is in the teaching style of Zach Ives, professor in the Department of Computer & Information Science at UPenn, in collaboration with Professor Andreas Haeberlen.

UPenn students access cloud resources with help from AWS Educate

The Computer and Information Science department at UPenn offers a wide range of IT courses to undergraduate and graduate students. The school took advantage of resources provided through AWS Educate to give its students access to high-performance computing resources for classes, such as web systems, mobile-game creation, and cloud computing. By accessing AWS services, the students can experience building and deploying large, sophisticated cloud systems, giving them the experience for real-world jobs.

Watch the video here.

For UPenn, AWS Educate has been transformative because it changed the kinds of assignments and projects the department can offer: ones with a large-scale component. Instead of being limited to one machine per student, Professor Ives could put 100 machines at every student’s disposal and build large-scale courses with substantial projects, moving from theory, such as web crawling and distributed systems, to actual implementation. The web service capabilities that AWS offers are powerful, scalable, and cost-efficient. AWS also isolates students from one another, so one student cannot crash the services that other students are using.

Professor Ives built a course around the concept of having students build their own Facebook (for the school) from scratch. Giving students real-life cloud tools empowers them to think beyond the textbook and become builders and inventors, using technology that only the world’s biggest companies once had access to. It’s all about rethinking how schools teach, so that assignments are inspiring and boost student confidence while getting students job ready.

“Giving my students access to technology is critical to deepening their understanding of cloud computing. To get them to really understand how to do tasks like spin up and down a virtual machine is imperative to giving them the ability and confidence to succeed in their future careers. I’m happy to share my coursework with the AWS Educate educator community, and look forward to learning from fellow computer science professors,” said Zach Ives.
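As one concrete illustration of the “spin up and down a virtual machine” task Professor Ives mentions, here is a minimal boto3 sketch. The AMI ID is a placeholder, and in a course setting the credentials and region would come from the student’s AWS Educate account.

```python
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

# Launch a single small instance (the AMI ID below is a placeholder).
run = ec2.run_instances(
    ImageId="ami-12345678",
    InstanceType="t2.micro",
    MinCount=1,
    MaxCount=1,
)
instance_id = run["Instances"][0]["InstanceId"]
print("launched", instance_id)

# Wait until the instance is running, then spin it back down.
ec2.get_waiter("instance_running").wait(InstanceIds=[instance_id])
ec2.terminate_instances(InstanceIds=[instance_id])
ec2.get_waiter("instance_terminated").wait(InstanceIds=[instance_id])
print("terminated", instance_id)
```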

AWS Educate exposes academia to cloud services and provides a portal where academics can share materials and build on each other’s work to create more challenging and inspiring course curricula. With these offerings, students are better prepared for the tasks and intellectual challenges they may face outside the classroom. The AWS Cloud lets students take theoretical concepts like complexity, scale, unreliability, and failure and study them for real, dealing with these complexities firsthand. That leads to a sense of pride and an understanding of how things really work.

Access cloud content, training, collaboration tools, and AWS technology at no cost by joining AWS Educate today. Find out about other offerings and how students, educators, and institutions are taking advantage of AWS Educate by following #AWSEducate and @AWS_Edu.

Students Use AWS to Predict Ebola Outbreaks

This past summer, the United States Geospatial Intelligence Foundation hosted the first Geospatial Intelligence (GEOINT) Hackathon. The goal of the hackathon was to bring together coders and data scientists, both GEOINT-savvy and not, and introduce them to interesting problems requiring inventive coding solutions. Beyond enabling participation from the non-GEOINT coding world, the end result was a working code base that performs a specifically requested set of functions or provides answers as outputs.

Students competed along with professional developers and data scientists. The competitors were challenged to predict where Ebola outbreaks might occur and determine why certain areas of West Africa were not affected. The goal was to develop a solution that could be modified to a new set of conditions and be used by other teams.

Out of 30 participants, a student team named “Team Intern” took first place, earning a $15,000 award and free admission to the GEOINT conference held in Washington, D.C. They developed a predictive analysis model that revealed a likely pathway for Ebola outbreaks. Using an open-source Python library and network theory, the team modeled the spread of disease as it is carried by contagious people through a network of nodes and edges. Simply put, Team Intern’s model aimed to capture where sick people travel and why.

Problem

To begin, the team looked at what had already been done, what data was available, and what they could do to address the problem within the 46 hours they had.

With the disease spreading, the population of West Africa in danger, and limited hours, the team had to take in data on the fatality rate, immunity rate, average travel distance, and transmission rate, as well as geo-referenced statistics, to determine how the virus moves. They then developed a model that predicts where Ebola will spread and how many people it will affect based on how contagious people travel.

Because disease-control measures at seaports and airports are assumed to prevent spread through those channels, the model treated road travel as the only form of movement. Each contagious person’s options, such as whether to leave home or stay, and whether to travel east, west, north, or south, generated additional data. Once a person left home, the model assumed they were more likely to go to highly populated areas near hospitals in cities, thereby infecting more people.

All of this data fed a network model outlining who was susceptible, who was infected, who had recovered, and where they traveled.
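The post does not include Team Intern’s code, but a minimal sketch of the general idea, a susceptible/infected/recovered (SIR) style simulation over a road network using the open-source NetworkX library, might look like the following. The graph, probabilities, and parameter values are illustrative assumptions, not the team’s actual model.

```python
import random
import networkx as nx

random.seed(42)

# Illustrative road network: nodes are towns, edges are roads between them.
G = nx.erdos_renyi_graph(n=50, p=0.08)

TRANSMISSION_RATE = 0.3  # chance an infected node infects a road neighbor per step
RECOVERY_RATE = 0.1      # chance an infected node recovers (and becomes immune) per step

state = {node: "S" for node in G}  # S = susceptible, I = infected, R = recovered
state[0] = "I"                     # seed the outbreak at one town

for step in range(30):
    new_state = dict(state)
    for node in G:
        if state[node] != "I":
            continue
        # Infected people travel along roads and may infect susceptible neighbors.
        for neighbor in G.neighbors(node):
            if state[neighbor] == "S" and random.random() < TRANSMISSION_RATE:
                new_state[neighbor] = "I"
        if random.random() < RECOVERY_RATE:
            new_state[node] = "R"
    state = new_state
    counts = {s: sum(1 for v in state.values() if v == s) for s in "SIR"}
    print(f"step {step:2d}  S={counts['S']:3d}  I={counts['I']:3d}  R={counts['R']:3d}")
```

A real model would weight the edges by travel distance and direction, bias movement toward populated areas near hospitals, and calibrate the rates against the geo-referenced statistics described above.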

Solution

Team Intern turned to AWS to create a model that combined multiple data sources to predict outbreaks and epidemics. The connections between the susceptible and the infected chart the spread of the disease at each time step and show how quickly it spreads based on where the contagious people travel.

A probability density map over the network’s nodes and edges predicted the spread of the disease and modeled the outbreak, based on the algorithm the team built from the processed data.

Hear directly from Team Intern about the problem and their approach to solving it in this on-demand webinar. Watch the webinar here.

Team “Intern”: R. Blair Mason (U.S. Naval Academy ’16), Briana Neuberger (Rochester Institute of Technology ’16), Dan Simon (Rochester Institute of Technology ’16), and Paul Warren (Stanford ’17).

Millions of Students in 180 Countries Participate in Hour of Code

During December 7-13, in celebration of Computer Science Education Week, tens of millions of students in 180+ countries participated in Code.org’s “The Hour of Code” powered by Amazon Web Services (AWS). The Hour of Code is a one-hour introduction to computer science, designed to demystify code and show that anybody can learn the basics.

To show support for local schools in the Washington, DC area, AWS participated in coding events at Lafayette Elementary School in Washington, DC and Patrick Henry Elementary School in Arlington, VA, reaching over 600 students.

Coding in the classroom

With volunteers from AWS, kindergarten through fifth grade students participated in an Hour of Code. AWS volunteers walked the students through coding activities themed around Minecraft, Star Wars, Frozen, and Angry Birds, giving students the opportunity to code the real-life games they play. They solved puzzle challenges, worked through mazes, built houses, and more, all on the computer. Children of all ages were matched with activities for their age and skill level, and at the end of the hour they were able to explain what coding is and how it is used.

As volunteers, we had the opportunity to see the kids’ faces light up as they completed challenges they once thought were too difficult. Seeing them overcome the obstacles and get excited about coding showed how coding can be fun and kids of all ages can do it.

Students left energized, inspired, and wanting to code more, at home and at school. One of the goals of the Hour of Code is to introduce coding to students at a young age and give them the resources they need to learn. Helping students as young as five understand what coding is, how it is used in real life, and how to get started encourages them to engage with technology and potentially consider careers in the field. The more resources students have available, the more they want to keep learning.

The Hour of Code is geared toward young people and school-aged children, but anyone can participate and begin to learn the basics of coding. Volunteers, teachers, and students all got involved during the day.

Scaling to meet demand

The events at Lafayette and Patrick Henry Elementary Schools were only two of the thousands of sites participating around the globe. Code.org needed to handle extreme spikes in traffic as students logged on and participated in the activities worldwide; during the event, the site receives a full year’s worth of typical traffic in a single month. Code.org had to scale so the site could withstand a surge in usage reaching tens of millions of students.

Using Amazon CloudFront and the AWS global infrastructure across multiple regions for redundancy and high availability, Code.org hosts the tutorials all over the world. So wherever you are and no matter how many people are on the site, the tutorials stay fast and responsive.

Start coding today!

Welcome to the AWS Innovating in the Public Sector Blog

A post by Teresa Carlson, Vice President of Amazon Web Services Worldwide Public Sector

 

I am thrilled to welcome you to Amazon Web Services’ “Innovating in the Public Sector” blog! This blog is all about you, our public sector customers, and will be your place for all public sector-specific content.

I’ve had the pleasure of traveling to many parts of the world – from Singapore to Bahrain to Sao Paulo to all over the U.S. – to meet with local and national government agencies, educational institutions, and nonprofit organizations. I kept hearing about the same challenges, needs, and best practices that could be leveraged all over the world, so we decided to dedicate this space to sharing what we’ve learned with you.

Government, education, and nonprofits are faced with unique challenges, requirements, and missions. We want to bring you the latest content on the topics that matter most to you. From security to criminal justice to educational research, each week our dedicated team covering national, regional, and local government, education, and nonprofits will share insights with you.

By using the cloud, our customers are paving the way for innovation and making the world a better place. You will find a few examples below.

Cloud computing changing the game in the public sector

Cloud is becoming the new normal around the world. Currently we have 2,000+ government agencies, 5,000+ educational institutions, and 17,500+ nonprofit organizations changing the game with the AWS Cloud.

Whether it is for development and testing, enterprise applications, high-performance computing, storage, disaster recovery, web, mobile, and social apps, virtual desktops, or data center migrations, government agencies, educational institutions, and nonprofits are using AWS to help achieve their missions.

Also, we have a strong and growing community of partner companies that offer a wide range of products and services on the AWS platform to address virtually any use case.

Instead of buying, owning, and maintaining their own data centers and servers, organizations can acquire compute power, storage, databases, and other services on an as-needed basis, much as consumers flip a switch to turn on lights at home and the power company sends electricity.

Working together to bring you the resources you need, on the topics you care about

Check out the video from our public sector breakfast at this year’s AWS re:Invent, where I share the AWS public sector footprint around the world, the partners who make it possible as the industry moves toward the cloud, and the latest customer innovations using these technologies.

From City on a Cloud to #smartisbeautiful to the latest data sets released, read below about the innovative ideas being driven in the public sector.

  • #smartisbeautiful. We are working to encourage girls and women to pursue computer science, collaborating with our university partners to create computer science organizations for women on their campuses.
  • Latest Data Sets. We host a selection of public data sets that demonstrate the power of open data on the cloud. These are accessible at no cost, and drive new businesses, accelerate research, and improve lives. By hosting key public data sets, AWS can enable more innovation, more quickly, creating additional opportunities for public good.
  • City on a Cloud. From healthcare to utilities and from transit to city planning, local and regional governments are embracing innovation.  Take a look at what these agencies and their partners are doing to move government forward on behalf of their citizens.

As we begin to publish content weekly, I am reminded that great projects have to start with great partnerships, so we want to understand how we can better serve you. Reach out to us at aws-wwps-blog@amazon.com with questions or comments.

And receive the latest updates from AWS government and education by following us on Twitter at @AWS_Gov and @AWS_Edu.