Category: government
New Transport for London Open Data Sets Available
Do you use cloud computing to help your city? The 2016 AWS City on a Cloud Innovation Challenge recognizes cities driving technology solutions to help improve citizens’ lives. Apply today to win $50,000 in AWS promotional credits. The deadline is just two weeks away on May 13th, and winners will be announced at the AWS Public Sector Summit in Washington, DC. To get ideas and inspiration, check out this story about how Transport for London (TfL) is using the cloud to benefit London commuters.
Transport for London (TfL) has announced plans for a range of new open data feeds via its Unified API, providing a source for developers to work on additional travel information apps for the city. This transportation data holds more value than just charting the ebb and flow of a morning commute. By understanding crowding data, traffic signals data, tram data, and journey data for the Tube and buses, startups and developers can work to solve big-picture challenges and create smarter cities. For example, by knowing traffic patterns or the busiest Tube stations, cities can identify where hot spots are and work to alleviate delays. This type of data allows for innovation in smart cities in areas such as transport, safety, health, and beyond. It also allows communities of startups to make use of data and create the products and services customers want to use.
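To give a concrete sense of what building against the Unified API looks like, here is a minimal Python sketch that requests live arrival predictions for a Tube line. Treat it as a sketch only: the endpoint follows TfL’s published URL pattern, the printed fields are illustrative, and the app_id/app_key values are placeholders for credentials from TfL’s developer registration.

```python
import requests

# Minimal sketch: live arrival predictions from TfL's Unified API.
# app_id/app_key are placeholders; register at api.tfl.gov.uk for real keys.
URL = "https://api.tfl.gov.uk/Line/victoria/Arrivals"

resp = requests.get(URL, params={"app_id": "YOUR_APP_ID",
                                 "app_key": "YOUR_APP_KEY"})
resp.raise_for_status()

# Each prediction includes the station, destination, and seconds to arrival.
for arrival in sorted(resp.json(), key=lambda a: a["timeToStation"])[:5]:
    print(arrival["stationName"],
          "->", arrival["destinationName"],
          f"in {arrival['timeToStation'] // 60} min")
```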
At the AWS London Loft, Transport for London (TfL) discussed four potential new public data sets on the AWS Cloud:
- Crowding data – TfL is currently looking at releasing historical crowding data about key London Underground stations, which it hopes will help stimulate new information products, including predictive status of the system at certain times.
- Traffic signals data – every day, TfL receives the equivalent of three DVDs worth of traffic data through its sophisticated traffic signal system. TfL hopes that by releasing more of this data to developers, it could be more widely used for other purposes, such as identifying incidents and traffic hot spots more quickly.
- Tram data – last week, TfL released live data for the London Trams network in the south of the city, so people can obtain real-time arrivals information at tfl.gov.uk and in transport apps for the first time.
- Ticketing data from Tubes and buses – every day, millions of people travel across London using a range of modes, making different journeys depending on what they are doing. By sharing completely anonymous and aggregated ticketing data, it’s hoped that this rich source of information can be combined with data feeds from other organisations to spot trends and behavioural patterns both at a local and macro level.
In addition to new feeds, TfL talked about other work it is carrying out to make its data more available to developers. Recently, TfL experimented with a new traffic-event-planning data feed covering planned road closures for the London Marathon, which took place on Sunday 24 April. Following this trial, TfL will look to engage with sat-nav organisations to see how this data format could be more widely used to bring better information to road users.
Government agencies are seeking to drive usage of their data, like the data sets released by TfL. When data is shared on AWS, it is readily available to more businesses, entrepreneurs, and researchers than ever before. Through anonymized Oyster and Contactless card journey data, TfL’s in-house analysts are already working to help the city understand how people travel around the capital and use this to keep London moving.
TfL uses AWS – both to improve the agility and responsiveness of its own digital services, and to establish a third-party developer ecosystem (primarily of tech start-ups) that can leverage its openly-licensed travel data stream to build their own travel-planning applications for Londoners.
TfL has also reported that more than 2,000 developers have signed up to use its open data feeds over the past six months, bringing the total to 8,200, and that nearly 500 smartphone apps have been built on those sources.
Phil Young, Head of TfL Online, opened the AWS Loft event by explaining that “our free open data now powers London, enabling better informed journeys than ever before and offering the chance to address the key challenges of our city, while supporting hundreds of technology enterprises. The reach of our information continues to grow, with 42% of Londoners using apps powered by our data, alongside growing usage of our website.”
The event saw experts from TfL discuss the recent and upcoming additions to its Unified API alongside other new data sets which could be released in the future.
Organizations around the globe, like TfL, are increasingly making their data open and available for the public to access and use in the AWS Cloud. This is fueling entrepreneurship, accelerating scientific discovery, and creating efficiencies across industries. AWS offers a comprehensive toolkit that enables sharing and analysis of data at any scale. Through the AWS Public Data Set program, we have identified methods to share massive data sets.
See below for pictures from the event and read TfL’s blog on the event here.
A Guide to Cloud Success: Understand the AWS Cloud Adoption Framework
We are pleased to share our “Guide to Cloud Success” training brought to you by the GovLoop Academy. Through this 30-minute interactive course, you will learn about the AWS Cloud Adoption Framework (CAF), a process designed to help organizations develop an effective strategy for their cloud adoption journey.
“In order to accelerate the rate of adoption of cloud services, to realize a reduced time to value, it’s important that customers have a conceptual idea of what decisions and what steps they need to make, and then also a sequence for how those decisions need to be made, when they need to be made, and what the outcomes are,” Blake Chism, Practice Manager, AWS Public Sector said. The course can be accessed here.
Receive more guidance about how to choose the best cloud technology for you, and how to create a roadmap for AWS Cloud adoption with an actionable plan to take back to your agency. After completing this learning experience, you’ll be able to discuss the relevance of the AWS Cloud to your agency’s mission, as well as how to make the best use of the AWS Cloud Adoption Framework.
This course includes the following lessons:
- Lesson 1: Benefits of Cloud Computing
- Lesson 2: What is the AWS Cloud Adoption Framework?
- Lesson 3: The 7 Perspectives of the AWS Cloud Adoption Framework
- Lesson 4: Creating a Roadmap for Cloud Adoption
- Lesson 5: Build Your AWS Cloud Adoption Framework
Register for the course here. And for more information on AWS cloud adoption, check out these additional resources:
- An Overview of the AWS Cloud Adoption Framework
- Get Started with the AWS Cloud Adoption Framework
- A Practical Guide to Cloud Migration
Last Call: City on a Cloud Innovation Challenge – Deadline May 13th
The deadline is fast approaching to submit your entries for the City on a Cloud Innovation Challenge! We are looking for examples from local and regional governments and technology partners from around the world that are innovating for the benefit of citizens using the AWS Cloud.
From transportation to planning to utilities, governments are using cloud computing to transform the way they interact with citizens and think about their future. Learn what is possible from the last two years: 2014 winners and 2015 winners. We recognize local and regional governments as hubs of innovation in three categories:
- Best Practices – The Best Practices Award will be granted to a local or regional government leader who has deployed an innovative solution to solve a government challenge.
- Partners in Innovation – The Partners in Innovation Award will be granted to a technology partner who has deployed an innovative solution to solve a government challenge.
- Dream Big – If you have a great idea that you would like to implement in your city, we encourage you to submit it in the Dream Big Award Category.
AWS helps local and regional governments innovate by simplifying IT workloads that governments struggle with and depend on every day, such as Geographical Information Systems (GIS), Content Management Systems (CMS), Open Data portals, and more. All of these applications run on AWS and make it easier for governments to deliver services to their citizens.
Winners will receive AWS promotional credits to start or continue their projects. We will be announcing the winners of the challenge at the AWS Public Sector Summit on June 20-21 in Washington, DC.
Read more about the City on a Cloud Innovation Challenge here and read the complete eligibility requirements and rules here.
The Evolution of High Performance Computing
A guest blog by Jeff Layton, Principal Tech, AWS Public Sector
The High Performance Computing (HPC) world is evolving rapidly. New workloads, such as pattern recognition, video and text processing, speech and facial recognition, deep learning, machine learning, and genomic sequencing, are being executed on HPC systems. The motivations behind this evolution are both economic and technical. As HPC systems become more powerful, agile, and less costly, they can be used for applications that never before had access to high-scale, low-cost infrastructure.
The cloud has accelerated this evolution because it is scalable and elastic, allowing self-service provisioning of one to thousands of processors in minutes. As a result, HPC users are coming to AWS with new and expanding application requirements and are seeing reduced time-to-results, faster speed to deployment, greater architectural flexibility, and reduced costs. Cloud computing is pushing HPC at the pace of computing innovation as users benefit from advances in microprocessors, GPUs, networking, and storage.
The cloud and the evolving HPC world
The HPC world has an ever-growing need for processing capability, and that need drives HPC system development. The current dominant architecture, the cluster, emerged when commodity processors and a common operating system delivered price-performance benefits far beyond proprietary systems. Once clusters built from commodity parts were doing production work at companies and labs, their use exploded across HPC.
Clusters have come a long way and have greatly increased access to HPC resources at an affordable price, for both embarrassingly parallel and tightly coupled applications.
Issues with traditional HPC fixed architectures
The HPC cluster is a relatively fixed architecture: a set of servers (nodes), each with a small amount of internal storage (if any at all), connected by a dedicated network and managed by software tools that broker user requests for resources. Once the system is running, it is rare for any changes to be made, whether adding nodes, upgrading processors, adding node storage, or changing the network topology or technology. The vast majority of dedicated cluster systems never change architecture after they are put in place.
The rise of the Hadoop architecture, which addresses a large class of HPC problems, makes this inflexibility an even greater challenge. The Hadoop (MapReduce) architecture calls for nodes with a great deal of local storage and uses only TCP networks, while the typical on-premises HPC system uses the smallest, least expensive reliable drive in each node. Customers therefore often procure a separate system designed specifically for Hadoop workloads, creating two HPC architectures with conflicting configurations. This is unnecessary when cloud computing is the platform, as both rely on commodity systems, dynamically created clusters, and software stacks that are purpose-built for the needs of particular problems.
The cloud allows you to go beyond thinking that HPC is only about clusters and that all applications must adopt that model. If you have a new architecture in mind for your application or your workflow, you can simply and easily create it in the cloud.
Do you want to use a combination of containers and microservices for your application? The AWS Cloud allows you to construct what you need with some very simple code. If the architecture doesn’t work as well as you wanted, then you just turn off the system and stop paying for it.
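To make that concrete, here is a hedged boto3 sketch of spinning up a small, short-lived fleet of compute nodes and terminating it when the run is finished. The AMI ID, key pair name, and instance type are placeholders, not a recommended HPC configuration.

```python
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

# Provision a small, transient compute fleet (placeholder AMI and key pair).
run = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",   # your HPC-ready machine image
    InstanceType="c4.8xlarge",
    MinCount=4,
    MaxCount=4,
    KeyName="my-hpc-key",
)
node_ids = [i["InstanceId"] for i in run["Instances"]]
ec2.get_waiter("instance_running").wait(InstanceIds=node_ids)

# ... submit work to the nodes ...

# When the experiment is over, turn the system off and stop paying for it.
ec2.terminate_instances(InstanceIds=node_ids)
```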
Learn more about HPC with AWS in this video below.
In future blogs, I’ll discuss some of the pain points of HPC beyond architectural rigidity and how the cloud addresses them. Stay tuned! In the meantime, learn more about HPC with AWS here: https://aws.amazon.com/hpc/
Cloud-Enabled Innovation in Personalized Medical Treatment
Hundreds of thousands of organizations around the world have joined AWS and many are using AWS solutions powered by Intel to build their businesses, scale their operations, and harness their technological innovations. We’re excited about our work with the hospitals and research institutions using bioinformatics to achieve major healthcare breakthroughs and unlock the mysteries of the human body.
These organizations are revolutionizing our understanding of disease and developing novel approaches to diagnosis and treatment. A human genome contains a complete copy of the genetic material necessary to build and maintain a human being. The sequencing of this code represents one of history’s largest scientific endeavors, and one of its greatest accomplishments. When the Human Genome Project began in 1990, researchers had only a rudimentary understanding of DNA, let alone the details of the human genome sequence. It took around 13 years and cost roughly $3 billion to sequence the first genome. But today, even small research groups can complete genomic sequencing in a matter of hours at a fraction of that cost.
The parallel evolution of genomics and cloud computing over the past decade has launched a revolution in discovery-based research that is transforming modern medicine. Doctors and researchers are now able to more accurately identify rare inherited and chromosomal disorders, and develop highly personalized treatment plans that reflect the unique genetic makeup of individual patients.
This eBook highlights the important work bioinformatics organizations are undertaking and explains how we are helping them achieve their mission. The stories of these four organizations illustrate what is possible with the AWS Cloud:
- The National Institutes of Health’s Human Microbiome Project (HMP) – Researchers from all over the globe can now access HMP data through Nephele, an AWS-supported platform, and use that information to identify possible microbial causes of preterm births, diabetes, inflammatory bowel disease, and other disorders.
- The Inova Translational Medicine Institute (ITMI) – AWS architecture facilitates the secure storage and management of ITMI’s genomic data, and enables Inova researchers to develop personalized treatments and predictive care for newborns suffering from congenital disorders and patients of all ages with cancer-causing genetic mutations.
- University of California, San Diego’s Center for Computational Biology & Bioinformatics (CCBB) – CCBB has seven core AWS-supported analysis pipelines, all optimized to handle next-generation sequencing data. Each pipeline is targeted at identifying small but important molecular differences, whether in a tumor’s DNA or in the microbiome, enabling doctors to tailor treatment on an individual level.
- GenomeNext – GenomeNext’s AWS-based platform represents the newest technological benchmark in the history of genomic analysis, and allows even small research groups to complete genomic sequencing in a matter of hours at a fraction of the traditional cost.
Medical and scientific communities around the world are just starting to take advantage of the transformative opportunities that personalized genomic medicine offers patients. These organizations are at the forefront of that medical revolution. Download the eBook to learn more and check out the infographic below to see how the cloud transforms healthcare.
Powering Smart Cities through Connected Devices
This is the second post in a three-part series around smart and collaborative cities, showing how AWS can help cities of all sizes benefit from the cloud. If you missed our first post in the series, Cities of the Future, Today, you can read it here. In this post, we will explore how cloud computing can help cities collect data from both people and devices to provide enhanced citizen services.
Today’s “always on” and “always connected” world can lead to powerful transformations in our cities. These new mobile technologies allow city leaders to gather data in ways they never could before. Citizen data and sensor data can provide better intelligence, reveal citizens’ needs, and ultimately improve services. The cloud lowers the barriers to entry, helping cities quickly create low-cost, easy-to-deploy services that make our lives easier.
Citizen Data
Wherever you go, you will see people glued to their phones. Whether you are at dinner or on your morning commute, instead of getting annoyed, think about how these little computers make life easier. Citizens now have access to mobile applications that allow them to voluntarily provide data to the city, whether by participating in surveys or simply by using the apps. These applications leverage smartphone features, such as accelerometers and GPS, to track the location and movement patterns of their users, and this data can help cities make effective changes.
For example, the City of Boston, with technology partner Connected Bits, created the Street Bump program to tackle tough local government challenges through innovative, scalable technology. With an app that uses a smartphone’s sensors, the city is able to capture enough (big) data to identify bumps and disturbances that motorists experience as they drive through the city. The data helps Boston’s Public Works Department better understand which roads, streets, and areas require immediate attention and long-term repair.
AWS allows them to run a scalable, open, and robust infrastructure through which this information flows to and from city staff via the Open311 API. The solution, developed by the nonprofit OpenPlans, was created as a large multi-tenant software-as-a-service platform so other cities can leverage the same repository, creating one data store for all cities.
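For a feel of the Open311 pattern, the sketch below submits a hypothetical service request (a pothole report) to a GeoReport v2 endpoint. The base URL, API key, and service code are invented placeholders; a real deployment publishes its own service list via GET /services.json.

```python
import requests

# Hypothetical Open311 GeoReport v2 endpoint and credentials.
ENDPOINT = "https://open311.example.gov/v2"

resp = requests.post(f"{ENDPOINT}/requests.json", data={
    "api_key": "YOUR_API_KEY",
    "service_code": "POTHOLE",          # discovered via GET /services.json
    "lat": 42.3601,
    "long": -71.0589,
    "description": "Large pothole in the right-hand lane.",
})
resp.raise_for_status()
print(resp.json())  # typically echoes back a service_request_id
```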
GPS is one way to gather data; another source is social media streams. City leaders can track whether their citizens are struggling with traffic jams, train delays, or other issues. They can learn about their citizens by analyzing social media, such as the Twitter Firehose, which provides data on public tweets around the world. Every tweet includes location information (if the user allows it), and analysts can define a geographical bounding box to monitor tweets published in or near their city.
One common analysis of social media streams is sentiment analysis, a method that uses natural language processing and text analysis to determine the sentiment of a piece of text. This can benefit a city looking to understand how changes are affecting its population and the status of public services. For example, is there a hotspot of citizens complaining about waste collection? The diagram below shows how you could use AWS services to build a real-time social media analysis application.
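As a toy illustration of the bounding-box filtering and sentiment scoring described above, the self-contained sketch below scores geotagged messages with a naive keyword lexicon; a production system would use a proper NLP library or a managed analytics service instead.

```python
# Toy example: bounding-box filtering plus lexicon-based sentiment scoring.
POSITIVE = {"great", "love", "fast", "clean"}
NEGATIVE = {"delay", "jam", "broken", "waste"}

# Rough bounding box around central London (lon_min, lat_min, lon_max, lat_max).
CITY_BOX = (-0.51, 51.28, 0.33, 51.69)

def in_city(lon, lat, box=CITY_BOX):
    return box[0] <= lon <= box[2] and box[1] <= lat <= box[3]

def sentiment(text):
    # Positive words add one, negative words subtract one.
    words = set(text.lower().split())
    return len(words & POSITIVE) - len(words & NEGATIVE)

tweets = [
    {"text": "Another waste collection delay on my street", "coords": (-0.12, 51.50)},
    {"text": "Love the new clean buses", "coords": (-0.09, 51.52)},
]

for t in tweets:
    if in_city(*t["coords"]):
        print(sentiment(t["text"]), t["text"])  # negative scores flag complaints
```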
Using AWS services and social media, the Commonwealth Scientific and Industrial Research Organisation (CSIRO) in Australia, in partnership with the Black Dog Institute, created the We Feel project. The project explores whether social media – specifically Twitter – can provide an accurate, real-time signal of the world’s emotional state. It uses several Amazon Elastic Compute Cloud (Amazon EC2) instances to capture tweets from Twitter’s public API at an average of 19,000 tweets per minute. A separate Amazon EC2 instance processes the tweets, analyzing usernames to determine gender and identifying phrases that reveal emotional content. The information is funneled into an Amazon Kinesis stream, and the tweets are then copied to Amazon Simple Storage Service (Amazon S3) for storage. The stream is monitored by another Amazon EC2 instance, which produces a summary of results every five minutes and writes it to an Amazon DynamoDB table.
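Feeding a stream like this takes only a few lines of boto3. The sketch below pushes captured tweets into an Amazon Kinesis stream; the stream name and record shape are assumptions for illustration, not details of the actual We Feel implementation.

```python
import json
import boto3

kinesis = boto3.client("kinesis", region_name="us-east-1")

def ingest(tweet):
    # Stream name and record shape are illustrative placeholders.
    kinesis.put_record(
        StreamName="tweet-stream",
        Data=json.dumps(tweet).encode("utf-8"),
        PartitionKey=tweet["user"],  # spreads records across shards
    )

ingest({"user": "commuter42", "text": "Stuck in a jam again", "lang": "en"})
```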
Social media and mobility have the power to provide useful information to city leaders, helping them understand challenges, solutions, and opportunities within their cities.
Sensor Data
Changing cities for the better is, of course, bigger than social media. Internet of Things (IoT) technologies enable new and intelligent ways to collect data across many different industries, including health, transportation, energy and utilities, and agriculture.
AWS IoT is a managed cloud service that provides the ability for ‘things’ to easily and securely interact with cloud services and with other devices, at a massive scale. The connection to the cloud is secure, fast, and lightweight (MQTT or REST), making it a great fit for devices that have limited memory, processing power, or battery life. The AWS IoT service includes a message broker and a rules engine that allows things to interact with other AWS services. AWS IoT also includes “thing shadows,” a service that maintains a current state for each thing you connect. You can use “thing shadows” to get and set the state of a thing over MQTT or HTTP, regardless of whether the thing is connected to the Internet. Learn more about AWS IoT here: https://aws.amazon.com/iot/
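To show what connecting a ‘thing’ looks like in practice, here is a minimal sketch using the AWS IoT Device SDK for Python to publish a sensor reading over MQTT. The endpoint, certificate paths, topic, and payload are placeholders for a device registered in your own account.

```python
import json
from AWSIoTPythonSDK.MQTTLib import AWSIoTMQTTClient

# Placeholders: your account's IoT endpoint and the device's certificates.
client = AWSIoTMQTTClient("air-quality-sensor-01")
client.configureEndpoint("abc123example.iot.us-east-1.amazonaws.com", 8883)
client.configureCredentials("root-CA.crt", "sensor.private.key", "sensor.cert.pem")

client.connect()
# Publish a reading with QoS 1; a rules-engine rule could route it onward.
client.publish("city/air-quality/sensor-01",
               json.dumps({"pm25": 12.4, "ts": "2016-05-01T08:00:00Z"}),
               1)
client.disconnect()
```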
By connecting all of the available resources in place throughout cities, authorities can leverage data to make smart decisions and create smart cities! Check back for our next post in this series, where we explore how customers can use big data and analytics to analyze and gain intelligence from data.
Post authored by Steven Bryen, Manager, Solutions Architecture, Engineering, AWS and Giulio Soro, Senior Solutions Architect, AWS
AWS Signs CJIS Addendum with the State of Minnesota
Amazon Web Services (AWS) is pleased to announce that we recently signed a Criminal Justice Information Services (CJIS) Security Addendum with the State of Minnesota, making AWS GovCloud (US) services available to law enforcement customers through the state’s approved service catalog. The Bureau of Criminal Apprehension (BCA) is now vetting AWS, including performing any required employee background checks as prescribed by the FBI’s CJIS standard.
With a constant flow of new requirements and new business drivers across the law enforcement community, the State of Minnesota can deploy workloads in the AWS GovCloud (US) Region with the goal of reducing IT burdens and delivering services securely and more efficiently than ever before.
The addendum enables local law enforcement agencies in Minnesota to run CJIS workloads on the AWS Cloud, including biometric, identity history, person, organization, property, and case/incident history data, with confidence that they are compliant with CJIS standards.
Information security is a top priority for law enforcement throughout the country. To satisfy CJIS requirements and enable extremely high security levels for all of our customers, AWS employs a rich set of security technologies and practices, including encryption and access control features that surpass the capabilities of all other providers. As part of CJIS agreements, we require AWS employees with physical and/or logical access to complete a fingerprint-based background check and security awareness training. Learn more about what makes the AWS Cloud secure, and why top third-party analysts recognize it as the leader in cloud security, here.
Partnering for mission success
Working with AWS Partner Network (APN) partners, we are committed to bringing top solutions specifically suited to our customers’ mission needs. For the State of Minnesota, we worked with iCrimeFighter to provide a mobile forensics and evidence-gathering system: a mobile app with an AWS Cloud-based evidence tracking capability, designed and developed by law enforcement officers striving to bring true mobility to the everyday work of law enforcement.
“Security is a top concern for iCrimefighter, and the AWS Addendum with the State of Minnesota substantiates our commitment to securely store and manage critical digital evidence in the AWS GovCloud (US) for our law enforcement partners in MN—and across the country. By working with AWS, we are able to simplify access to digital evidence by uploading that evidence to a secure cloud,” said Steven London, CEO of iCrimefighter.
Confidence in your data security
These CJIS security addenda give our law enforcement customers, like the State of Minnesota and the California Department of Justice, confidence that their data will pass CJIS compliance audits and that it is secure in the cloud. We comply with the FBI’s CJIS standard and will continue to sign CJIS security agreements with our customers, including supporting any required employee background checks.
Please reach out if you would like to get started.
U.S. Secretary of Defense Ash Carter Visits Amazon
Earlier this month, U.S. Defense Secretary Ash Carter visited Seattle as part of a West Coast swing aimed at strengthening ties between the Department of Defense (DoD) and the tech community. During his visit, he met with Amazon leaders and spoke with many of our Amazon Military Exchange students, including our Department of Defense Fellow assigned to work with AWS.
While at Amazon headquarters, Secretary Carter sat down with Amazon Founder and CEO, Jeff Bezos, and AWS leadership including Andy Jassy, Charlie Bell, Adam Selipsky, Bob Kimball, Bill Vass, and Teresa Carlson to discuss innovation topics and ways to strengthen military exchange programs and partnerships with the technology industry.
“If we’re going to have the best military in the world, as we must have, 10, 20 years, 30 years from now, we need to strengthen our partnership with companies like Microsoft, Amazon, Boeing and many others that I’ve met out here. I’m determined to do that. It’s part of the responsibility of the department, not only to fight today’s fight but to make sure that we’re superior for tomorrow’s. And the most important ingredient is our wonderful people, but secondly, it’s technology,” Defense Secretary Ash Carter said following his visit to the Amazon Headquarters in Seattle.
Commercial cloud services and other emerging technologies can enable the DoD to achieve its mission, innovate faster, improve agility, and enhance the Department’s security posture, while also cutting costs. Beyond cloud computing, business practices can flow from industry to the military; talking with others about how they run their business forces an organization to hold up a mirror to itself, which is a very healthy thing.
One way we are doing this is through the Amazon Military Exchange program and the DoD Fellowship, designed to expose active-duty military to AWS’s technology and Amazon’s leadership principles. Currently, we have three military exchange students and one DoD fellow working at Amazon and AWS. Working alongside our military fellows, we learn how we can better help the different branches of the military save money, innovate faster, and deliver capabilities that help achieve their mission. Meet some of these military professionals and learn more about this program in our blog series here, here, and here.
We are dedicated to contributing to the military’s mission and enterprise, and we appreciate the Defense Secretary’s visit to Amazon. See below for pictures from his visit to Amazon HQ.

Defense Secretary Ash Carter, center, tours Amazon’s headquarters in Seattle, March 3, 2016. DoD photo by Navy Petty Officer 1st Class Tim D. Godbee

Lt. Col. Maria Schneider, one of the DoD Fellows, meeting and greeting the Secretary of Defense during his visit with the Amazon leadership team.
NASA’s Data is in the Amazon Cloud, Is Yours?
For years, people have been dreaming about going to Mars. Now NASA has made the Red Planet a top priority. While the space agency works on developing the rockets and technologies that would take astronauts further than they have ever traveled before, The Washington Post has created a virtual reality experience to take you there today. Please join us on the journey, which was created using actual imagery from NASA’s rovers. The Post experience is underwritten by AWS and Intel, and features content about our work with NASA Jet Propulsion Laboratory (JPL).
Experience the virtual reality here.
NASA is the foremost space research organization in the world, helping shape our view of the world, space, and the possibilities in between. How did NASA use the cloud to meet its mission in space and on Earth?
- In Space: NASA inspired a new generation of space explorers and citizens by giving them front-row seats to space exploration with the live-streamed images of Curiosity’s landing on Mars. NASA JPL served hundreds of gigabits per second of traffic from viewers around the world. Once the rover was on Mars, the Curiosity operations team used the AWS Cloud to scale out and process all incoming data in a matter of minutes. They were able to spend time making scientific progress instead of waiting for data processing.
- In the Atmosphere: When NASA faced the challenge of reprocessing petabyte-scale data from the Orbiting Carbon Observatory 2 (OCO-2), they anticipated a 100-day wait and a $200,000 bill using their on-premises data center. With AWS, NASA was able to do the same thing in less than six days for just $7,000. The project helped NASA engineers obtain new insights from their data through the use of new algorithms that adjusted instruments on the satellite, helping them receive richer data on the Earth’s carbon uptake.
- On Earth: NASA continues to leverage the AWS Cloud to study the effects of climate change. Teaming up with Cycle Computing and AWS, NASA is measuring vegetation change in the Sahara at lightning speed. In only six hours, NASA scientists were able to process one third of the data – in a carbon-neutral region – for only $80.
Go ahead and explore Mars in the virtual reality experience created by The Washington Post here, and don’t miss out on the “How NASA’s Mars Rover and Earth Analytics Use the Cloud” webinar on April 11th. Register today.
Going “All-In” on AWS: Lessons Learned from Cloud Pioneers
Increasingly, customers across the public sector are going “all-in” on the AWS Cloud. Instead of asking whether to go all-in, these pioneers asked how quickly they could do it. Balancing the need to improve efficiency, stay agile, and meet mandates, government and education customers are committing to going all-in on AWS, meaning they have declared that AWS is their strategic cloud platform.
At last year’s re:Invent, we were lucky enough to hear from Mike Chapple, Notre Dame; VJ Rao, National Democratic Institute; Eric Geiger, Federal Home Loan Bank of Chicago; and Roland Oberdorfer, Singapore Post eCommerce on a panel sharing insights into their decision-making about moving all-in or cloud-first, the value they’ve seen, and the impact to the mission.
Success can be contagious
All of these organizations are at various stages of their journey to the cloud. For example, Notre Dame is a year in with a third of its services migrated, whereas the Federal Home Loan Bank is three years in and recently unplugged its last piece of on-premises infrastructure. No matter the stage, they all have similar experiences, lessons learned, and a shared goal: the cloud.
After initial successes with pilot projects, such as websites or testing environments, IT teams within these organizations saw the possibilities and savings with AWS and decided to migrate more of their infrastructure. Whether the draw was cost savings or scalability, these quick wins demonstrated business value and made a compelling case for bringing other services to the cloud.
“Look for things that are as straightforward as possible to guarantee success,” advised Mike Chapple, Sr. Director for IT Service Delivery, Notre Dame.
The feeling of success can be contagious, and because of the initial success, each of these organizations wanted to do more and more. They took the time to carefully and thoughtfully design their infrastructure or “data center in the cloud” with an AWS Solutions Architect. Getting serious from the start paid off in the long run.
They may have begun the journey wanting to lower costs, but they stay on it because of the possibilities the cloud opens up. No longer are they constrained by budget, scale, and compute.
Tidbits of advice on the journey
Since adopting the all-in strategy, these organizations are realizing what is possible with the power of the cloud. But gaining buy-in was not always easy. The panelists mentioned that they could prove security, encrypting data both in flight and at rest, but surprisingly, the biggest pushback came from their own staff.
With some universities and businesses, tradition runs deep, and that was the case with Notre Dame, a 175-year-old institution. So going all-in on AWS required more than just initial success with a few small projects. It required storytelling, training, and education. “One of the things we’ve learned along the way is the culture change that is needed to bring people along on that cloud journey and really transforming the organization, not only convincing that the technology is the right way to go, but winning over the hearts and minds of the team to completely change direction,” Mike Chapple said.
Change happens, and the cloud is the natural evolution of IT. These teams did a lot of storytelling, mapping out the move from a virtualized on-premises environment to a virtualized environment in the cloud as the next logical step. They planned early, trusted their instincts, and told the cloud story.
Watch this panel discussion and don’t miss the chance to hear from other customers who have gone all-in with AWS at one of our upcoming Summits in Chicago, New York, Washington, DC, and Santa Clara. Learn more about these events here and register for the AWS Public Sector Summit on June 20-21 here.