Category: Education


Cloud-Enabled Innovation in Personalized Medical Treatment

Hundreds of thousands of organizations around the world have joined AWS and many are using AWS solutions powered by Intel to build their businesses, scale their operations, and harness their technological innovations. We’re excited about our work with the hospitals and research institutions using bioinformatics to achieve major healthcare breakthroughs and unlock the mysteries of the human body.

These organizations are revolutionizing our understanding of disease and developing novel approaches to diagnosis and treatment. A human genome contains a complete copy of the genetic material needed to build and maintain a human being. The sequencing of this code represents one of history’s largest scientific endeavors—and greatest accomplishments. When the Human Genome Project began in 1990, researchers had only a rudimentary understanding of DNA, let alone the details of the human genome sequence. It took around 13 years and cost roughly $3 billion to sequence the first genome. But today, even small research groups can complete genomic sequencing in a matter of hours at a fraction of that cost.

The parallel evolution of genomics and cloud computing over the past decade has launched a revolution in discovery-based research that is transforming modern medicine. Doctors and researchers are now able to more accurately identify rare inherited and chromosomal disorders, and develop highly personalized treatment plans that reflect the unique genetic makeup of individual patients.

This eBook highlights the important work bioinformatics organizations are undertaking and explains how we are helping them achieve their mission. The stories of these four organizations illustrate what is possible with the AWS Cloud:

  1. The National Institutes of Health’s Human Microbiome Project (HMP) – Researchers from all over the globe can now access HMP data through Nephele, an AWS-supported platform, and use that information to identify possible microbial causes of preterm births, diabetes, inflammatory bowel disease, and other disorders.
  2. The Inova Translational Medicine Institute (ITMI) – AWS architecture facilitates the secure storage and management of ITMI’s genomic data, and enables Inova researchers to develop personalized treatments and predictive care for newborns suffering from congenital disorders and patients of all ages with cancer-causing genetic mutations.
  3. University of California, San Diego’s Center for Computational Biology & Bioinformatics (CCBB) – CCBB has seven core AWS-supported analysis pipelines, all optimized to handle next-generation sequencing data. Each pipeline is targeted at identifying small but important molecular differences, whether in a tumor’s DNA or in the microbiome, enabling doctors to tailor treatment on an individual level.
  4. GenomeNext – GenomeNext’s AWS-based platform represents the newest technological benchmark in the history of genomic analysis, and allows even small research groups to complete genomic sequencing in a matter of hours at a fraction of the traditional cost.

Medical and scientific communities around the world are just starting to take advantage of the transformative opportunities that personalized genomic medicine offers patients. These organizations are at the forefront of that medical revolution. Download the eBook to learn more and check out the infographic below to see how the cloud transforms healthcare.

Mission to Make the World a Safer Place through Crowdsourced Intelligence

LiveSafe, a mobile safety communications platform for crowdsourced intelligence, was born from a spirit of triumph over tragedy and the desire to make the world a safer place. The founding team saw an opportunity to mobilize and connect people through technology. Shy Pahlevani, a victim of assault, and Kristina Anderson, a survivor of the Virginia Tech shooting, wondered if they could build tools to prevent incidents like the ones they had experienced.

How it works

LiveSafe is putting safety in everyone’s hands to prevent incidents and directly connect people to the help they need. Through the mobile safety app installed on individuals’ smartphones, every submission, whether text, photo, or video, is collected along with location data, enabling two-way communication between students and campus security and actionable responses based on real-time information. Data can also be submitted anonymously, protecting an individual’s identity while still providing critical information to relevant officials.

Top considerations in a mobile world

To scale rapidly during incidents, like a security threat on campus or at a stadium, and to provide end users with the information to connect, inform, and mobilize, LiveSafe turned to the AWS Cloud.

“When LiveSafe was originally starting up, the team attended an AWS conference and took a hands-on class. Once they saw the ease of use and breadth of services (plus the startup-friendly pricing) they decided it was the right platform for us, and we have been using AWS ever since,” Matt Hagopian, VP of Technology, LiveSafe, said.

When considering the technology needed to run mobile apps, LiveSafe looked at many elements including:

  • Scalability: The ability to test and monitor apps without worrying about provisioning, scaling, and managing the infrastructure was a top priority. Elastic Load Balancing (ELB) helps scale LiveSafe’s API services layer as they see additional demand from end users.
  • Reliability: The app needs to work with users no matter where they are, and on the devices of their choice, without disruption.
  • Security: The data needs to be secure, and the information shared needs to get into the hands of the right person, securely. LiveSafe utilizes Amazon Virtual Private Cloud (Amazon VPC) to help separate public from private data, as well as production from staging environments. They then make sure to encrypt clients’ data at rest, both in the file system (leveraging Amazon Relational Database Service (Amazon RDS) encryption) and at the row level (utilizing AWS Key Management Service (KMS) to help manage their keys external from the encrypted data). A sketch of this row-level pattern appears after this list.
  • User Engagement: Tracking usage and user engagement with push notifications remains a key success factor for the app. When an emergency incident occurs, LiveSafe suddenly needs to send out tens or even hundreds of thousands of communications in an instant. When that happens, they leverage Amazon Elastic Compute Cloud (Amazon EC2) to spin up new machines to deliver those messages as quickly as possible.
  • Low Cost: Paying only for what they use, they can add, configure, and remove services at any time as needs change, making the cloud the right choice.
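As a rough illustration of the row-level pattern described in the Security bullet above, the following Python sketch encrypts a single sensitive field under a KMS-managed key before it is stored, and decrypts it on the way out. The key alias and field contents are hypothetical, not LiveSafe’s actual configuration:

```python
import boto3

kms = boto3.client("kms")

KEY_ALIAS = "alias/tip-data"  # hypothetical KMS key alias

def encrypt_field(plaintext: str) -> bytes:
    """Encrypt one sensitive field; store the returned ciphertext blob."""
    resp = kms.encrypt(KeyId=KEY_ALIAS, Plaintext=plaintext.encode("utf-8"))
    return resp["CiphertextBlob"]

def decrypt_field(ciphertext: bytes) -> str:
    """Decrypt a stored field; KMS identifies the key from the ciphertext."""
    resp = kms.decrypt(CiphertextBlob=ciphertext)
    return resp["Plaintext"].decode("utf-8")
```

Because the key material never leaves KMS, the encrypted rows are useless to anyone without access to the key, which is exactly the separation of keys and data described above.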

“For us, AWS has been an amazing platform as we have grown. We started with only a small subset of services. As we have grown in size and complexity, it has been very easy to leverage other services offered by the AWS platform to extend our offerings. We are also able to leverage resources for doing stress testing on demand or prototyping that we would not have had machines to use in a traditional self-hosted environment,” Matt said.

Whether you are creating a brand new mobile app or adding features to your existing app, AWS Mobile Hub lets you leverage the features, scalability, reliability, and low cost of AWS in minutes.

AWS can help you get started building mobile apps quickly, driving your mission forward in this mobile-first world and giving you more time to focus on the things that make your app great.

Register now for the AWS Public Sector Summit to hear how AWS can help you achieve your mission.

Time to Science, Time to Results: Transforming Research in the Cloud

Scientists, developers, and many other technologists are taking advantage of AWS to perform big data analytics and meet the challenges of the increasing volume, variety, and velocity of digital information. We sat down with Angel Pizarro, a member of the Scientific Computing team at AWS, to talk about how the cloud is transforming genomic research.

Prior to joining AWS, Angel was a bioinformatics researcher (a data scientist focused on biological models and systems). In addition to his own research, Angel ran infrastructure for other researchers at a university. Back in 2006, he had an idea for an experiment, but at first glance it would take more RAM than was available on the university’s compute cluster. Upgrading the RAM would have cost more than $40,000 for just this one experiment. They turned instead to Amazon Simple Storage Service (Amazon S3) and Amazon Elastic Compute Cloud (Amazon EC2), where they could access enough RAM for the short period of time required to test out the idea.

It is a good thing they decided to use AWS, because the experiment didn’t work as expected, but AWS did. The moral of the story is: they tried! AWS allowed them to experiment at a reasonable cost. The ability to experiment became a great driver for change in Angel’s own research, and across the entire genomics field.

“When we calculated the compute and storage that we needed for the sequencers on campus, we found that we only had about 20% of the needed capacity. Second, even if we had unlimited funds to expand the compute and storage infrastructure, we didn’t have the real estate to put the equipment into.” Angel said. “We asked ourselves, ‘what is our real need?’ and the answer was a reactive compute resource that scales based on just-in-time data production. By moving workloads to AWS during peak times, we were able to service our researchers and not slow down the science.”

A lot has changed in the last decade within cloud computing and genomics. Sequencing instrumentation kept improving to output more data at much lower price points. The combination of more sequence data at cheaper prices resulted in a virtuous cycle: falling prices drove more people to use genomics for their research, which in turn drove further price drops as economies of scale kicked in.

Reducing time to science

The types of questions researchers could ask were largely dependent on the amount of compute they could get their hands on. Prior to the cloud, researchers were limited to three choices when it came to compute-intensive research:

  1. If you had the money to buy big compute clusters, you often had unused infrastructure, which is a waste of money.
  2. If you had no money, you would request access on a shared cluster, often waiting in a large line for the resources to become available.
  3. Or you would just forego the initial question and ask another question.

The cloud breaks this mold by giving immediate and temporary access to a practically unlimited amount of compute power, allowing you to ask questions that may not otherwise have been possible. And having more computation allows you to ask even better questions about data.

Reducing time to science is something every researcher should experience. Nothing is sweeter than that first moment when you launch an HPC cluster in ten minutes. Once that light-bulb moment happens, you quickly realize that you can launch many clusters and perform parallel analyses of the same data set.

The second part of accelerating science is sharing results. In the cloud, everyone is able to use the same tools, language, and security that you did. More than just sharing a manuscript or a script to go along with your data, virtual infrastructure allows you to share the code that created your entire environment. If you have ever tried to install and use someone else’s badly documented code, you know what a big deal this is.
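One concrete way to share the code that created your entire environment is an infrastructure template. As a minimal sketch, assuming a collaborator has published an AWS CloudFormation template alongside their paper (the stack name and URL here are hypothetical), recreating their environment can be a single call:

```python
import boto3

cfn = boto3.client("cloudformation")

# Stand up a copy of the collaborator's analysis environment from the
# template they shared; the stack name and URL are illustrative only.
cfn.create_stack(
    StackName="reproduced-analysis-env",
    TemplateURL="https://s3.amazonaws.com/shared-lab-templates/env.yaml",
)
```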

Another goal of this approach is to democratize science by putting petabyte-scale data sets and 10,000-core clusters within the reach of researchers at institutions that may not be able to afford to buy something for local installation. When you can temporarily utilize massive amounts of compute, you lower the barrier to entry for researchers.

Science shared securely

Within the scientific community, security is discussed in the context of data security: who has access to the data, and when. With AWS, you are able to codify standard operating procedures (SOPs) and share them with other researchers, along with templates that help you and other researchers meet the same controls. That’s a powerful model: you have the guidance and you provide the steps.

Sharing findings allows researchers to rely on more data to help get to where they want to be in their own research. For example, researchers at Johns Hopkins University are developing a new algorithm on top of Amazon Elastic MapReduce (Amazon EMR) to analyze all of the RNA-Seq data in public repositories. The system actually gets cheaper the more data you give it. It works directly off of Amazon S3 to read input data and store results, and takes advantage of Amazon EC2 Spot pricing. Amazon EC2 Spot Instances allow you to bid on spare Amazon EC2 computing capacity, significantly reducing the cost of running your applications, growing your application’s compute capacity and throughput for the same budget, and enabling new types of cloud computing applications. By being able to analyze all of the public data at a reasonable cost, Johns Hopkins found new insights into how genes are spliced together, resulting in the formation of proteins and cells. They discovered evidence for over 58,000 new pieces to that enormous jigsaw puzzle, our body, all without ever having to worry about the size of the infrastructure. They just needed to ask their big research question and access pay-as-you-go infrastructure to answer it.
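As a rough sketch of what a transient, Spot-backed analysis cluster looks like in practice, the Python snippet below requests an Amazon EMR cluster whose worker nodes bid on Spot capacity. The cluster name, instance counts, bid price, and log bucket are illustrative assumptions, not details of the Johns Hopkins system:

```python
import boto3

emr = boto3.client("emr", region_name="us-east-1")

# Request a transient analysis cluster; worker nodes run on spare Spot
# capacity, which is what makes large-scale reprocessing affordable.
response = emr.run_job_flow(
    Name="rna-seq-reanalysis",       # hypothetical cluster name
    ReleaseLabel="emr-5.0.0",
    LogUri="s3://my-lab-logs/emr/",  # hypothetical log bucket
    JobFlowRole="EMR_EC2_DefaultRole",
    ServiceRole="EMR_DefaultRole",
    Instances={
        "InstanceGroups": [
            {"Name": "master", "InstanceRole": "MASTER",
             "InstanceType": "m4.xlarge", "InstanceCount": 1,
             "Market": "ON_DEMAND"},
            {"Name": "workers", "InstanceRole": "CORE",
             "InstanceType": "m4.xlarge", "InstanceCount": 20,
             "Market": "SPOT", "BidPrice": "0.10"},  # max bid, USD/hour
        ],
        # Shut the cluster down once the analysis steps finish.
        "KeepJobFlowAliveWhenNoSteps": False,
    },
)
print("Launched cluster:", response["JobFlowId"])
```

Because input reads and results live in Amazon S3, the cluster itself is disposable: when the job finishes, the compute goes away and the data remains.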

“There is a large consortium of data, because the human body is complex. But what we know about human biology is low hanging fruit. There is so much more out there, if we can share the data we have across different groups. The hope of the cloud model is to really start understanding human biology and make strides in research that impacts the world,” Angel said.

Learn more about AWS and genomics in this post by Angel and Jessica Beegle on How the Healthcare of Tomorrow is Being Delivered Today and visit the AWS Genomics in the Cloud page.

Going “All-In” on AWS: Lessons Learned from Cloud Pioneers

Increasingly, customers across the public sector are going “all-in” on the AWS Cloud. Instead of asking whether to go all-in, these pioneers asked how quickly they could do it. Balancing the need to improve efficiency, stay agile, and meet mandates, government and education customers are committing to going all-in on AWS, meaning they have declared that AWS is their strategic cloud platform.

At last year’s re:Invent, we were lucky enough to hear from Mike Chapple, Notre Dame; VJ Rao, National Democratic Institute; Eric Geiger, Federal Home Loan Bank of Chicago; and Roland Oberdorfer, Singapore Post eCommerce on a panel sharing insights into their decision-making about moving all-in or cloud-first, the value they’ve seen, and the impact to the mission.

Success can be contagious

All of these organizations are at various stages of their journey to the cloud. For example, Notre Dame is a year in with a third of their services migrated, whereas Federal Home Loan Bank is three years in and recently unplugged their last piece of on-premises infrastructure. No matter the stage, they all have similar experiences, lessons learned, and a shared goal—the cloud.

After initial successes with pilot projects, such as websites or testing environments, IT teams within these organizations saw the possibilities and savings with AWS and decided to migrate more of their infrastructure to AWS. Whether it was cost savings or scalability, these quick wins showed business value and a compelling case to bring other services to the cloud.

“Look for things that are as straightforward as possible to guarantee success,” advised Mike Chapple, Sr. Director for IT Service Delivery, Notre Dame.

The feeling of success can be contagious, and because of the initial success, each of these organizations wanted to do more and more. They took the time to carefully and thoughtfully design their infrastructure or “data center in the cloud” with an AWS Solutions Architect. Getting serious from the start paid off in the long run.

They may have begun the journey wanting to lower costs, but they continued it because of the possibilities the cloud opened up. No longer are they constrained by budget, scale, or compute.

Tidbits of advice on the journey

Since adopting the all-in strategy, these organizations are now realizing what is possible with the power of the cloud. But gaining buy-in was not always easy. The panel noted that they could prove security, encrypting data both in flight and at rest, but surprisingly, the biggest pushback came from their own staff.

With some universities and businesses, tradition runs deep, and that was the case with Notre Dame, a 175-year-old institution. So going all-in on AWS required more than just initial success with a few little projects. It required storytelling, training, and education. “One of the things we’ve learned along the way is the culture change that is needed to bring people along on that cloud journey and really transforming the organization, not only convincing that the technology is the right way to go, but winning over the hearts and minds of the team to completely change direction,” Mike Chapple said.

Change happens, and the cloud is the natural evolution of IT. These teams did a lot of storytelling, mapping out the move from a virtualized on-premises environment to a virtualized environment in the cloud as the next logical step. They planned early, trusted their instincts, and told the cloud story.

Watch this panel discussion and don’t miss out on the chance to hear from some other customers who have gone all-in with AWS at one of our upcoming Summits in Chicago, New York, Washington DC, and Santa Clara. Learn more about these events here and register for the AWS Public Sector Summit on June 20-21 here.

The Art of the Possible: The Power of Data and Analytics in Education

A guest post by Darren Catalano, CEO of HelioCampus and former vice president of analytics at University of Maryland University College (UMUC).

Higher education is at a tipping point. External factors, including an increased focus on student success and decreased revenue from previously reliable sources, are putting more pressure on institutions to become more efficient and show better results. Institutions cannot remain idle as the landscape changes. Instead, innovative universities are using analytics as a strategic enabler to transform the university and to change its culture. If you enjoy tackling big challenges, you are in the right place!

Whether we are Chief Data Officers, Institutional Researchers, or BI Developers, we must seize the opportunity to transform our role on campus. The onus is on us to make a compelling argument that we are part of the solution by highlighting our capabilities and showing the university “the art of the possible” when it comes to unlocking the value in our institutional data.

Transparency promotes accountability

In order to facilitate meaningful conversations and to elevate our role, we must be much more proactive and engage the university community in a significantly different way. Before, we would ask: What are your requirements? What do you want? And then we would build, test, and release. Those days are in the past. We can no longer show up to meetings with a blank sheet of paper. Now, we need to show what could be. To achieve this, we need to rapidly deploy models that can provide an immediate impact.

The cloud allows universities to cost-effectively and securely host, process, and deliver data analytics services. Institutions should focus on building a data platform that connects disparate data from sources across the university enterprise and transforms the information into flexible data models. These models accelerate the ability to prototype and quickly answer ad hoc requests. We must focus on providing easy access to the modeled data by delivering a service catalog that includes easy-to-understand dashboards, predictive applications, forecast models, and operational reporting.

But how do we get to this point where data is critical in all decisions around student enrollment, engagement, and success?

Key to overcoming this challenge is demonstrating the potential, the “art of the possible,” to university stakeholders so they see the benefits of having a unified data layer. For example, by combining data sets, we can analyze transfer, retention, and graduation rates in comparison with admissions data to see differences in profiles; combine prospective student and pre-enrollment data with retention data to spot variables that significantly impact retention; and look at first-term class registration patterns to determine the impact on course success.
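As a small illustration of the kind of analysis a unified data layer enables, the Python sketch below joins two hypothetical extracts, admissions and retention, and compares first-year retention rates across admission profiles. All file and column names are invented for the example:

```python
import pandas as pd

# Hypothetical extracts from the unified data layer described above.
admissions = pd.read_csv("admissions.csv")  # one row per admitted student
retention = pd.read_csv("retention.csv")    # enrollment status by term

# Join the two sources on a common student identifier.
students = admissions.merge(retention, on="student_id")

# Compare first-year retention rates across admission profiles,
# e.g. transfer students versus first-time freshmen.
rates = students.groupby("admit_type")["retained_year1"].mean()
print(rates.sort_values(ascending=False))
```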

Demonstrating the potential

Analytics in higher education has never been more important, and the institutions that thrive will use their data as a competitive advantage. Cultural change does not happen by accident; rather, it is the result of a consistent, intentional effort. In order to facilitate cultural change on campus, follow these five lessons learned:

  1. Invest in a solution
  2. Organize for performance
  3. Empower leaders to use data
  4. Embrace transparency
  5. Highlight success

Our job is to take the complexity out of the data and present it in an easily understood and consumable fashion. Data has the ability to make transformational changes within an institution. For a practical example of how a large university used data and analytics to improve their student success, read the blog post highlighting UMUC.

AWS Offers Data Egress Discount to Researchers

AWS Makes Cloud and HPC Budgeting More Predictable for Scientists

The pace of research is no longer limited by the availability of computing resources. Researchers are beginning to rely on cloud computing to drive breakthrough science at breakneck speeds, and AWS wants to fuel the pace of new discoveries by making it possible for all scientists to have their very own supercomputers in the cloud.

Today, AWS committed to making it easier for scientists to use its cloud storage, computing, and database services by waiving data egress fees for qualified researchers and academic customers; these are the fees associated with “data transfer out from AWS to the Internet.” The maximum discount is 15% of total monthly spending on AWS services, which is several times the egress usage we typically see among our research customers. For example, a lab spending $10,000 per month on AWS services could have up to $1,500 in data egress charges waived. However, there is no cost to upload data into AWS, or to move data between Amazon Simple Storage Service (Amazon S3) and Amazon Elastic Compute Cloud (Amazon EC2).

The agreement has been supported through ongoing discussions with Jisc in the UK, GÉANT in Europe, and DLT in the United States, which provide network infrastructure and supporting cloud services to education and research institutions around the world.

“Having predictability and stability in costs is one of the major challenges for researchers in adopting cloud services, so it’s welcome news that AWS is removing egress charges for academic customers. There’s a real opportunity here for cloud computing to become as ubiquitous to research as it is in the commercial market, and with it bring a massive boon to the sector, supporting more efficient, collaborative and innovative research outputs,” said Dan Perry, director of product and marketing at Jisc.

Professor Tony Hey, chief data scientist for the Science & Technology Facilities Council (STFC), said, “I am delighted that AWS is taking this step to remove uncertainty about egress charging for research use of their cloud infrastructure, following extensive discussions with Jisc and GÉANT. I often hear from researchers that the perception that they will receive large bills for data downloads has discouraged them from considering commercial cloud providers for their compute and data requirements. The cloud has a huge amount to offer in terms of agility and efficiency gains, and also unique capabilities in areas such as machine learning. This is a very welcome development from AWS, and I hope that other cloud providers will move swiftly to follow suit.”

By reducing data egress fees, AWS will help scientists launch their first computing machine in minutes, run data analysis pipelines, and store petabytes of data in the cloud, ultimately accelerating time to science.

AWS customers are eligible for waiver of egress charges under this program if they:

  • Work in academic or research institutions.
  • Run any research workloads or academic workloads. However, a few data-egress-as-a-service type applications are not allowed under this program, such as massive open online courses (MOOCs), media streaming services, and commercial, non-academic web hosting (web hosting that is part of the normal workings of a university is allowed, like departmental websites or enterprise workloads).
  • Route at least 80% of their data egress out of the AWS Cloud through an approved National Research and Education Network (NREN), such as Internet2, ESnet, GÉANT, Janet, SingAREN, SINET, AARNet, or CANARIE. Most research institutions use these government-funded, dedicated networks to connect to AWS, while realizing higher network performance, better bandwidth, and stability.
  • Use institutional e-mail addresses for AWS accounts.
  • Work in an approved AWS Region.

Request the AWS Data Egress Waiver: Contact your AWS Account Manager or use the “Contact Us” form and mention “Data Egress Waiver Request.”

Scientists can get started with a Free Tier Account: www.aws.amazon.com/free.

Learn more about Scientific Computing on AWS at: www.aws.amazon.com/scico.

Save the Date for the AWS Public Sector Summit in Washington, DC, June 20-21, 2016

We are excited to announce our seventh annual AWS Public Sector Summit scheduled for June 20-21, 2016 in Washington, DC.

Join us for one of the largest gatherings of government, education, and nonprofit technology leaders sharing their firsthand stories of innovation for the public good. Last year’s summit featured a star lineup including the CIOs of the US, UK, Canada, and Singapore, as well as IT leaders from agencies and organizations near and far.

This year, we are excited to welcome Andy Jassy, Amazon SVP and the visionary leader of Amazon Web Services, along with even more customers innovating in mobility, the Internet of Things, scientific computing, advanced security, open data, and more. The Summit includes over 80 breakout sessions, direct access to AWS technologists, and inspiring customer spotlights.

Check out this video to get a flavor for the event.

Register now for the AWS Public Sector Summit here!

AWS Educate Hosts Cloud in the Classroom Day

Over 50 AWS volunteers were excited to collaborate with the Clark County School District (CCSD) on our inaugural AWS Educate “Cloud in the Classroom Day” earlier this month. We worked with 68 students on sessions such as Gaming in the Cloud, Alexa Voice Skills, and Cloud Careers at Rancho High School and Cimarron-Memorial High School in order to support the workforce and economic development of the community around Las Vegas, NV. Following the workshops, the AWS Educate team met with CCSD faculty and staff to brainstorm ideas for bringing cloud into the high school curriculum. And to cap off the activities, Kylie Pratt, a high school senior and president of the globally ranked Cimarron Robotics Team, and her coach, Jenny Stensrud, joined Teresa Carlson and Helen Sun, VP at Motorola Solutions, to provide insights on girls, technology, and the high school curriculum.

Highlights from the Cloud in the Classroom

  • 30 students at Rancho High School customized a Jumpy Fish Game running on AWS by adding new graphics, hacking the leaderboard, and setting up new top score notifications using Amazon Simple Storage Service (Amazon S3), Amazon DynamoDB, and AWS Lambda.
  • 38 students at Cimarron-Memorial High School developed new voice skills for Alexa by creating a Lambda function in the AWS console and a new voice skill with the Alexa Skills Kit (ASK) in the Amazon Developer Portal.
  • After the technical workshops, students from both schools learned about careers in cloud and got a chance to quiz Amazonians on what they do and why they like working at Amazon.
  • AWS Educate facilitated a roundtable with 20 CCSD faculty and staff led by Gia Marie Moore, Director, Magnet Schools and Career and Technical Academies, CCSD, and Dr. Jesse Welsh, Academic Manager of Innovative Learning Environments, CCSD, for an initial discussion of the cloud, preparing students for the cloud workforce, and the best path to integrate the cloud into career and technical education (CTE) curricula for CCSD. This includes targeting existing classes, developing a multi-course curriculum, providing professional development, and integrating free cloud “sandboxes” for students into the project-based learning environment.

Rancho High School’s “Gaming in the Cloud” workshop

The workshop used a modified version of Jumpy Fish. The students logged in to the AWS console, used S3 to upload a new sprite for the game, used DynamoDB to hack the leaderboard and insert a new high score, and added a pre-configured function call in a Lambda function to send out a text message notification of the new high score (the sketch after this paragraph walks through the same three steps). The students customized the game images with clever graphics and inserted whimsy into the leaderboard. Two students worked in a separate exercise to “hack” the game. Following the workshop, students attended a “Cloud Careers” session.
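For readers who want to try something similar, here is a minimal Python sketch of those three steps. The bucket, table, and topic names are invented for illustration, and where the workshop routed the notification through a pre-configured Lambda function, this sketch publishes to SNS directly:

```python
import boto3

s3 = boto3.client("s3")
dynamodb = boto3.resource("dynamodb")
sns = boto3.client("sns")

# 1. Upload a new sprite image for the game to S3.
s3.upload_file("my_fish.png", "jumpy-fish-assets", "sprites/my_fish.png")

# 2. "Hack" the leaderboard by writing a new top score into DynamoDB.
table = dynamodb.Table("JumpyFishLeaderboard")
table.put_item(Item={"player": "rancho_student", "score": 999999})

# 3. Send out a text message notification of the new high score.
sns.publish(
    TopicArn="arn:aws:sns:us-east-1:123456789012:new-high-score",
    Message="New Jumpy Fish high score: 999999!",
)
```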

Cimarron-Memorial High School’s “Create Your Own Voice Skill” workshop

This workshop used Alexa and Lambda with an Amazon Echo. Students explored the AWS console, used S3 to download files for the labs, created a Lambda function, explored the Amazon Developer Portal, used the ASK to configure their skill and integrate it with the Lambda function, and tested their skills on an Echo; a bare-bones example of such a handler follows below. AWS volunteers circulated throughout the room to encourage, help, and cheer as each group got to test their working skills on the Echos in the room. Students were engaged and excited to create and test skills, including queries on the best gaming platform, favorite restaurant, favorite color, and worst NFL team in 2015. Following the workshop, students split into small groups, and Amazon volunteers shared their experiences, answered questions, and discussed career pathways.
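To give a flavor of what the students built, here is a bare-bones sketch of a Lambda handler for a custom Alexa skill. The intent name and replies are invented; the handler simply maps an incoming intent to a spoken response in the JSON envelope the Alexa service expects:

```python
def lambda_handler(event, context):
    """Answer a single hypothetical Alexa intent with a spoken reply."""
    request = event.get("request", {})
    intent = request.get("intent", {}).get("name", "")

    if intent == "FavoriteColorIntent":  # hypothetical intent name
        speech = "My favorite color is blue, obviously."
    else:
        speech = "Try asking me about my favorite color."

    # Minimal response envelope for a custom Alexa skill.
    return {
        "version": "1.0",
        "response": {
            "outputSpeech": {"type": "PlainText", "text": speech},
            "shouldEndSession": True,
        },
    }
```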

Learn more about AWS Educate and how to bring the cloud to your classroom here.

A Practical Guide to Cloud Migration

To achieve the full benefits of moving applications to the AWS platform, it is critical to design a cloud migration model that delivers optimal cost efficiency. This includes establishing a compelling business case, acquiring new skills within the IT organization, implementing new business processes, and defining the application migration methodology to transform your business model from a traditional on-premises computing platform to a cloud infrastructure.

The white paper A Practical Guide to Cloud Migration: Migrating Services to AWS, coauthored by AWS’s Blake Chism and Carina Veksler, provides a high-level overview of the cloud migration process and is a great first read for customers who are thinking about cloud adoption.

The path to the cloud is a journey to business results. AWS has helped hundreds of customers, such as the City of McKinney, TX, and Georgetown University, achieve their business goals at every stage of their journey. While every organization’s path is unique, there are common patterns, approaches, and best practices that can be implemented to streamline the process:

  1. Define your approach to cloud computing from business case to strategy and change management to technology.
  2. Build a solid foundation for your enterprise workloads on AWS by assessing and validating your application portfolio, and integrating your unique IT environment with solutions based on AWS cloud services.
  3. Design and optimize your business applications to be cloud-aware, taking direct advantage of the benefits of AWS services.
  4. Meet your internal and external compliance requirements by developing and implementing automated security policies and controls based on proven, validated designs.

Early planning, communication, and buy-in are essential. Understanding the forcing function (time, cost, or availability) is key, and it will be different for each organization. When defining the migration model, organizations must have a clear strategy, map out a realistic project timeline, and limit the number of variables and dependencies for transitioning on-premises applications to the cloud. Throughout the project, build momentum with key constituents through regular meetings to review the progress and status of the migration, keeping people enthused while also setting realistic expectations about the availability time frame.

Learn more about what it takes to migrate to the cloud in the full guide here.

Gibraltar Area Schools: The Leap to the Cloud Saves 50% in 5 Years

Gibraltar Area Schools, a rural school district serving 600 K-12 students on a single campus in Fish Creek, Wisconsin, turned to cloud computing five years ago rather than buying onsite servers. This early experimentation and innovation has them reaping the benefits of the cloud.

Time, money, and resources drive early innovation

What are the three main drivers for cloud for K-12 schools?

  1. Time: Schools no longer need to wait for the shipping, installation, licensing, housing, and maintenance of physical servers. With the cloud, Gibraltar was able to spin up a SharePoint server on AWS in 20 minutes, and the purchase can be terminated at any time if it no longer fits the IT strategy.
  2. Money: Most school IT directors and/or network administrators begin their leap to the cloud with a school credit card and a new connection to the cloud provider. “Buying cloud services can be heaven for an IT person. I have eight servers running for a few pennies per hour,” Steve Minten, District IT Director, Gibraltar Area Schools, said. Physical server hardware is expensive because it all has to be paid for up front; with AWS, Gibraltar Area Schools has spent less than half of what it would have spent on physical servers over the past five years.
  3. Resources: Gibraltar accesses prepackaged server configurations and software that it didn’t have before. Some of the newest network devices are available as virtual network devices on AWS. With cloud computing, it’s more like a school in a software box. For example, the Gibraltar schools are geographically remote, on a peninsula extending far into Lake Michigan, which means that service, repair, and technical assistance are more than 80 miles away. Cloud hosting places their schools inside the “technology box” rather than on the far fringe of the region’s technical world, where they have historically sat. “In a perfect world or in this case a new school, cloud deployment would be ideal. It would truly be a ‘High School-in-a-Box’ scenario. Just picture one fiber connection to a building and running through a switch and to wireless access points. That’s it. No domain servers, databases, communication, backup or web filtering. Nothing in house – just glass and wifi,” Minten said.

Cloud, now what?

Most schools need a backup and file system, a student management system, databases, and communication systems. Once a school has decided to host enterprise software in the cloud, it should determine the source and purpose of the traffic. Is it student-based? Does the traffic require internal access, external access, or both? Is there a concern for sensitive information? The cloud allows these services to run on a school’s network domain through Amazon Virtual Private Cloud (Amazon VPC) on AWS’s infrastructure. This first step will open the door for the IT team to experiment; a minimal setup sketch follows the quote below.

“Have the school’s IT person set up a VPC and start playing around. The light bulb will turn on right after they realize they can run anything they want – from domain servers to web filters in their cloud for pennies per hour,” Minten said.
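For the IT person ready to start playing around, here is a minimal Python sketch of that first step: creating a VPC with one subnet and an internet gateway. The CIDR blocks and region are placeholder assumptions; a real deployment would add route tables, security groups, and the servers themselves:

```python
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

# Carve out a private address space for the school's "data center in
# the cloud"; the CIDR blocks here are just examples.
vpc = ec2.create_vpc(CidrBlock="10.0.0.0/16")
vpc_id = vpc["Vpc"]["VpcId"]

# One subnet for the first few servers (domain server, web filter, ...).
ec2.create_subnet(VpcId=vpc_id, CidrBlock="10.0.1.0/24")

# An internet gateway so externally facing services, like the district
# website, are reachable, while internal systems stay private.
igw = ec2.create_internet_gateway()
ec2.attach_internet_gateway(
    InternetGatewayId=igw["InternetGateway"]["InternetGatewayId"],
    VpcId=vpc_id,
)
print("Created VPC:", vpc_id)
```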

With the AWS Cloud, schools, districts, and companies providing educational applications can access industry-shaping technology at an affordable cost, no matter what the scale.

Learn more.