Category: Nonprofit

Healthcare and the Cloud: Futureproofing the U.S. Healthcare System

From minor regulatory adjustments to landmark reforms, state and local Health and Human Services (HHS) agencies are constantly adapting to changing requirements to provide vital benefits for the citizens they serve.

Policy changes or initiatives to improve social and clinical outcomes often require a modernization of systems to manage eligibility determination, benefits enrollment, claims adjudication, and other mission-critical processes. Building these systems in the cloud affords organizations increased speed, agility, flexibility, and cost savings that free up budget to support innovation.

While perspectives on the best path to an optimized healthcare system may vary, there are important trends to consider as agencies continue to work to deliver a more personalized and digital experience for their citizens.

Important healthcare solutions may lie in disparate datasets.

At the core of these health-policy initiatives is data pulled from countless systems and organizations. Agencies that are able to unlock this data to share within and across organizations realize the most efficient use of resources. Big data and analytics allow HHS agencies to become increasingly collaborative and to generate insights that will identify trends in utilization, quality metrics, and incidents of redundancy. For example, the American Heart Association’s Precision Medicine Platform will include a vast array of curated datasets that are centrally stored, easily searched and accessible, and managed on the AWS Cloud. The platform enables researchers and clinicians to aggregate and analyze a rich breadth and depth of data and, in turn, uncover critical cardiovascular disease insights that translate into medical innovations and positively impact millions of lives.

One size does not fit all.

From prenatal disease screening to targeted cancer treatments, precision medicine has the potential to revolutionize healthcare by changing the roles of providers, patients, and payers. Rather than focusing primarily on treating chronic diseases, the unique genetic makeup of each individual will allow for personalized medicine.

Americans are getting older.

The size of our senior population is growing at a faster rate than ever before. Every day, ten thousand Americans celebrate their 65th birthday, and most of them are managing multiple chronic health conditions. While this pace is expected to continue into 2019 and beyond, the pool of healthcare providers is not expanding nearly as quickly. The role of technology will continue to increase as it is leveraged to facilitate care remotely, mitigate impacts of the healthcare provider shortage, and empower our older citizens to live independently in their homes and communities. Once again, agencies that can access and analyze datasets in the cloud gain invaluable intelligence about where their resources are most effectively applied.

To be prepared for the uncertainty of changing healthcare laws and regulations, health and human services programs at every level of government will continue to need modern and adaptable technologies to deliver effective services to citizens.

AWS works with HHS agencies to provide the flexibility and agility to remain futureproof. Learn more about AWS for Health and Human Services here.

The Intersection of Technology & Social Good: Nonprofits Dedicated to Helping Others

As we all try to keep focused on our New Year’s resolutions, here’s some inspiration from nonprofits that use the AWS Cloud for social good. These organizations focus their efforts year-round on the betterment of people around the world.

Whether they fight for the environment, work to achieve medical breakthroughs, preserve the arts, or build social good, AWS helps organizations pay for only the technology they use, freeing up IT spend to grow their capabilities for the long term. When that IT burden is lifted, a nonprofit organization is able to put its time and effort toward making the world a better place.

The four mission-driven organizations highlighted in this first entry in a series focus on helping global citizens, while we focus on supporting their IT.

Thorn – Digital Defenders of Children Dedicated to Driving Technology Innovation

Thorn’s mission is to drive technology innovation to fight human trafficking. They work to provide law enforcement with intelligence and leads about suspected human trafficking networks and individuals, with the ultimate goal of identifying victims and connecting them with resources. Spotlight, powered by Thorn and Digital Reasoning, processes and analyzes the data from 150,000 ads per day based on risk profiles provided by law enforcement. Since October 2014, they have analyzed 65 million ads and 400 million images in Spotlight. Over 3,200 law enforcement officers use Spotlight in all 50 states, and the tool is now available on mobile. Daily Spotlight users report a 60% time savings in human trafficking investigations.

“AWS has been critical to our ability to deliver a world-class investigations tool that has helped officers across the country identify thousands of trafficking victims faster than ever before. We are grateful for this support that allows us to not worry about stability and storage – but instead focus our energy on constant improvement and innovation that helps stop trafficking and abuse,” said Julie Cordua, CEO, Thorn.

Check out the video and SlideShare presentation from Thorn’s session at re:Invent 2016.

Federation for Internet Alerts (FIA) – Leveraging the Internet to maximize the effectiveness of emergency alerts

FIA is a nonprofit organization that began in 2013 to distribute official child abduction alerts and tornado warnings across devices throughout the United States to those in the alerting area. FIA launched a working “Alert Hub” on the AWS Cloud in late 2015 in an effort to further standardize this practice and expand the availability of these alerts. The reliability, speed, and reach of the Alert Hub are transforming the way vital information is spread through local communities. The Alert Hub transmits free, life-critical emergency hazard information in under 200 milliseconds across the world to subscribers, who can then use this data to disseminate emergency information to any affected area through the Common Alerting Protocol (CAP). This free public service is always on, and has securely ingested, validated, and distributed 720,000 official CAP alerts to subscribers.
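
To make CAP concrete, here is a minimal sketch (in Python) of how a subscriber might parse the fields it needs from a CAP 1.2 message. The sample alert is entirely hypothetical; real messages carry many more elements.

```python
# Minimal sketch: parsing a hypothetical Common Alerting Protocol (CAP) 1.2
# message of the kind the Alert Hub distributes to subscribers.
import xml.etree.ElementTree as ET

CAP_NS = "{urn:oasis:names:tc:emergency:cap:1.2}"

SAMPLE_ALERT = """<?xml version="1.0" encoding="UTF-8"?>
<alert xmlns="urn:oasis:names:tc:emergency:cap:1.2">
  <identifier>EX-2016-000123</identifier>
  <sender>alerts@example.org</sender>
  <sent>2016-12-01T14:30:00-06:00</sent>
  <status>Actual</status>
  <msgType>Alert</msgType>
  <scope>Public</scope>
  <info>
    <category>Met</category>
    <event>Tornado Warning</event>
    <urgency>Immediate</urgency>
    <severity>Extreme</severity>
    <certainty>Observed</certainty>
    <area>
      <areaDesc>Example County, IL</areaDesc>
    </area>
  </info>
</alert>"""

def parse_cap(xml_text):
    """Extract the fields a subscriber needs to route an alert."""
    root = ET.fromstring(xml_text)
    info = root.find(CAP_NS + "info")
    return {
        "identifier": root.findtext(CAP_NS + "identifier"),
        "status": root.findtext(CAP_NS + "status"),
        "event": info.findtext(CAP_NS + "event"),
        "severity": info.findtext(CAP_NS + "severity"),
        "area": info.find(CAP_NS + "area").findtext(CAP_NS + "areaDesc"),
    }

alert = parse_cap(SAMPLE_ALERT)
print(alert["event"], "-", alert["area"])
```

A subscriber would then use fields like `area` and `severity` to decide which devices in the alerting area should receive the message.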

“AWS is vital to FIA, providing cloud computing technology that aids in the transmission of official life-critical alerts,” says Jason Bier, FIA’s President.  “With the AWS Cloud, we are able to offer free subscriber access to the Alert Hub. We are encouraged by how many for-profit businesses and individual innovators are incorporating the Alert Hub into new and existing technologies to help save lives.”

Learn more here.

The World Bank Group – Working for a World Free of Poverty

The World Bank Group has set two goals for the world to achieve by 2030: end extreme poverty by decreasing the percentage of people living on less than $1.90 a day to no more than 3%, and promote shared prosperity by fostering the income growth of the bottom 40% in every country.

In their work to eradicate extreme poverty, the World Bank Group needed modern, fit-for-purpose technology. They now use the AWS Cloud to disseminate information to their staff anytime, anywhere, and on any device.

“We have spent the last three and a half years executing on that plan. We have ripped out and upgraded almost every single application and platform in the Bank. We have completely changed the technology landscape,” said Stephanie von Friedeburg, Group Chief Information Officer and Vice President, Information and Technology Solutions, The World Bank Group.

Watch this video of Stephanie von Friedeburg talking about the organization’s cloud-first strategy and their transition to the cloud.

UNICEF – Imagine

UNICEF’s Imagine project brings together content from around the world as people upload videos of themselves singing John Lennon’s “Imagine.” At our AWS Symposium in 2015, David Ohana, Chief of Brand, UNICEF, talked about running workloads in the cloud and enabling better services for UNICEF.

“To make sure that the world we imagine for children is heard loud and clear by key decision makers, the UNICEF Imagine team is hugely grateful to the AWS team for their continued support of this project. We chose AWS for the scalability, availability, and global reach. And as a result, we now have some peace of mind that this idea may work,” said David.

Watch this video where David shares how UNICEF is reaching a global audience in a more agile and cost effective way.


Inspiring work from our nonprofits showing the intersection of technology and social good! Check back in with our upcoming posts where we will share more about how our customers use the cloud to achieve their missions, from saving the environment to achieving medical breakthroughs. And visit our “Nonprofits and NGOs in the Cloud” page here.

Amazon Web Services and the National Science Foundation Spur Innovation in Big Data Research

The AWS Research Initiative (ARI) brings Amazon Web Services (AWS) and the National Science Foundation (NSF) together to spur innovation in Big Data research. Under the program on Critical Techniques, Technologies and Methodologies for Advancing Foundations and Applications of Big Data Sciences and Engineering (BIGDATA), a total of $26.5 million will be funded by NSF and the Office of Financial Research (OFR), in addition to $3 million in AWS promotional credits, for a period of 3-4 years.

The program seeks novel approaches in computer science, statistics, computational science, and mathematics, along with innovative applications in domain science, including social and behavioral sciences, education, biology, engineering, and the physical sciences that lead to the further development of interdisciplinary data science.

Under the ARI program, AWS and NSF will support and collaborate on groundbreaking research from all qualified scientists, engineers, and educators. Now techniques and technologies like cloud-based Artificial Intelligence, Machine Learning, Big Data analytics, and High Performance Computing (HPC) will help researchers maximize the value of their NSF grants to accelerate the pace of innovation.

“BIGDATA research provides a paradigm shift by putting smart in everything we do today including smart home, smart city, smart cars, smart health, and more. We are excited to collaborate with the NSF to foster innovations in the field,” said Sanjay Padhi, Ph.D, AWS Representative to the NSF.

There are two categories of proposals:

  • Foundations (F): those developing or studying fundamental theories, techniques, methodologies, and technologies of broad applicability to big data problems, motivated by specific data challenges and requirements.
  • Innovative Applications (IA): those engaged in translational activities that employ new big data techniques, methodologies, and technologies to address and solve problems in specific application domains. Projects in this category must be collaborative, involving researchers from domain disciplines and one or more methodological disciplines (computer science, statistics, mathematics, simulation and modeling, and more).

The AWS Research Initiative with NSF provides up to $3M in AWS promotional credits over a period of up to four years, or for the duration of the initiative. AWS will offer many services through ARI grants, including compute and data services. NSF will be responsible for selecting grant awardees.

“In today’s era of data-driven science and engineering, we are pleased to work with the AWS Research Initiative via the NSF BIGDATA program to provide cloud resources for our Nation’s researchers to foster and accelerate discovery and innovation,”  said Dr. Jim Kurose, Assistant Director of the National Science Foundation (NSF) for Computer and Information Science and Engineering Directorate (CISE).

To get started on your application, the program site offers cloud resources and tools for grant applicants. To see how to apply, who qualifies, and more, learn about the program here.

Bringing Girls Who Code to re:Invent 2016

AWS was proud to help sponsor the 2016 Girls Who Code (GWC) Summer Immersion Programs for 1,500 high-school aged girls. To enable their work, AWS built a custom curriculum for the Girls Who Code teams to learn and build their projects in the cloud. At the conclusion of the seven-week program, students formed small teams and built web-based projects using the skills they gained during the summer.

A team of AWS experts reviewed the projects of the student teams who incorporated AWS into their projects. Two project teams were then selected to come to AWS re:Invent 2016 in Las Vegas. This provided students a chance to learn more about cloud computing, experience a large-scale tech conference, and share with other cloud enthusiasts their AWS-powered projects and passion for programming. The two selected projects were:

  • The Mercer of Durham, Seattle Girls Who Code Summer Camp (2 students): The Mercer of Durham is a “choose-your-own-adventure” game played on the Amazon Echo through the cloud-based Alexa voice service. The game is played entirely by voice, similar to role playing, and was inspired by the text-based “choose-your-own-adventure” game the group created in class in Python.
  • Kokua, Boston Girls Who Code Summer Camp (5 students): Kokua is a website that is used as a “cold caller” to select random students and as a random group generator to allow students the opportunity to work with different peers. In addition, the team created a bar graph displaying the statistics of how many times a student’s name was called. Kokua differs from traditional cold-calling devices because it organizes multiple functions into one tool that is easy for teachers to use. Coded using JavaScript, HTML, PHP, and CSS, Kokua saves teachers from worrying about who to call on next or keeping track of who is not participating.
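
Kokua itself was built with JavaScript, HTML, PHP, and CSS; as an illustration of the idea, here is a minimal Python sketch of a least-called-first cold caller with call-count tracking and a random group generator. The class design and student names are invented for the example.

```python
import random
from collections import Counter

class ColdCaller:
    """Sketch of Kokua's core idea: pick a random student, track how
    often each name has been called, and form random work groups."""

    def __init__(self, students, seed=None):
        self.students = list(students)
        self.call_counts = Counter({s: 0 for s in self.students})
        self.rng = random.Random(seed)

    def call_on(self):
        # Favor students called least often so participation stays balanced.
        fewest = min(self.call_counts.values())
        candidates = [s for s in self.students if self.call_counts[s] == fewest]
        pick = self.rng.choice(candidates)
        self.call_counts[pick] += 1
        return pick

    def random_groups(self, size):
        # Shuffle, then slice into groups of `size` (last group may be smaller).
        pool = self.students[:]
        self.rng.shuffle(pool)
        return [pool[i:i + size] for i in range(0, len(pool), size)]

caller = ColdCaller(["Ada", "Grace", "Katherine", "Radia"], seed=7)
for _ in range(8):
    caller.call_on()
print(dict(caller.call_counts))   # every student called exactly twice
print(caller.random_groups(2))
```

Because the picker always chooses among the least-called students, eight calls across four students leave everyone with exactly two calls, which is the balance the bar graph in Kokua makes visible to teachers.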

“Attending AWS re:Invent gave us the opportunity to interact with world-class programmers and engineers and a chance to share our final project from Girls Who Code. Thank you AWS and GWC for sponsoring us!” shared the team from Kokua.

AWS is committed to helping build the pipeline of women and underrepresented communities in tech. As a part of this effort, we held a Diversify Tech panel at re:Invent. In the panel, experts in the field of diversity, equality, equity, inclusion, and innovation discussed actionable steps we can take, both individually and as companies, to improve diversity in tech. The Girls Who Code teams also presented their projects at the end of this panel, and received an opportunity to get to know Girls Who Code VP of Strategy and Innovation, Leah Gilliam, who moderated the discussion. You can watch the full panel here.

Spotlight on London: Londoners Use the AWS Cloud for their Daily Life and Work

Amazon Web Services has a strong commitment to the needs of our customers across sectors in the UK. That’s the driving reason why we recently launched a new Region in the London area. Learn more about the new Region here.

Cities like London are quickly embracing innovation and developing new ways to engage and serve citizens. From transportation to planning to utilities, cities are using cloud computing to transform the way they interact with citizens and think about their future. Both government and commercial organizations are using the cloud to provide information and deliver services to their customers and citizens. Learn more about the organizations you know that are already working to bring you smarter, more flexible services in and around London. Read more public sector case studies here.

AWS works with organisations around London to serve citizens more effectively and reach broader constituents. Learn more below.

Register now to get started on your digital journey to future government

When it comes to digital government projects, where do you get started? How do you train your staff and align your technology strategy with the ever-increasing pace of citizen requirements?

To answer these questions (and learn even more), join us the week of January 23rd and the week of March 6th, 2017, at the Urban Innovation Center, where AWS and Future Cities Catapult will offer discussions, roundtables, and workshops as part of the London Innovation Series. Customize your own itinerary and learn how to build citizen services in a new and fresh way. Learn more and register now.

Continue to learn about how AWS is helping Londoners every day here and check out the “Webminster” station in the photos below.

Also in London, AWS launched AWS re:Start, a training and job placement program for the UK to educate young adults as well as those leaving the Armed Forces, Reservists, Veterans, and their spouses on the latest software development and cloud computing technologies. Learn more here.


Calling All Data Scientists to Help Improve Cancer Screening Technology

Two out of every five people in the U.S. will be diagnosed with cancer during their lifetimes and the number of new cancer cases will rise to 22 million globally within the next two decades, according to the National Cancer Institute (NCI). And as research organizations work to find a cure, the same technology behind improved voice assistants and credit card fraud detection—artificial intelligence—could also help improve cancer screening and save lives.

Through Machine Learning and Artificial Intelligence, participants in the third annual Data Science Bowl have the chance to improve lung cancer screening technology that can reduce lung cancer deaths by 20 percent.

The Data Science Bowl competition was created by Booz Allen Hamilton in partnership with Kaggle. Amazon Web Services is proud to sponsor the 2017 Data Science Bowl, which aims to inspire everyday citizens, data scientists, and medical communities around the world to work together and improve the success rate of low-dose CT scanning, using training and test datasets directly provided or facilitated through the National Cancer Institute.

This year, the 90-day Data Science Bowl competition will award winners with over $1 million in prizes, including AWS cloud computing credits. The funds for the prize purse will be provided by the Laura and John Arnold Foundation. To learn more and participate, visit the Data Science Bowl website.

Last year’s Data Science Bowl was related to heart health. Learn more about it here.

AWS Public Datasets

Today, qualified researchers can access two of the world’s largest collections of cancer genome data as AWS Public Datasets.

And, in order to help data scientists work with unique datasets, we built the AWS Research Cloud Program. The program was built by researchers, for researchers, in order to enable easy use of AWS resources by the scientific community around the globe. It’s free to join the program, and you can download the guide here to get started.

Key Resources for Researchers and Scientists

Additionally, below are some key resource links to help researchers in the Data Science Bowl.

How Does the Cloud Help Cure Cancer?

The cloud can fuel cancer breakthroughs at rapid speed, and we are looking forward to seeing what the participants of the Data Science Bowl are able to achieve using the cloud. For example, the Algorithms, Machines, and People (AMP) Lab at the University of California, Berkeley builds scalable machine learning and data analysis technologies that turn raw data into actionable research insights, shared globally.

Among the many experiments run by the AMP Lab, one area of concentration is the field of genomics and cancer research. Due to the vast amount of data that genome sequencing produces, the AMP Lab leverages AWS cloud-based compute power to quickly scale the compute resources needed to run the algorithms used in genomics work. As a result, researchers are able to use many machines in the cloud simultaneously to process genome data faster and more cost effectively than ever before.
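
As a toy, single-machine illustration of that fan-out pattern, the sketch below splits a made-up DNA sequence into chunks, analyzes the chunks concurrently, and combines the per-chunk results. On AWS the same shape of work is spread across many machines rather than local threads; the GC-content metric here is just a simple stand-in for a real genomics algorithm.

```python
from concurrent.futures import ThreadPoolExecutor

def gc_content(chunk):
    """Fraction of G/C bases in one chunk of a DNA sequence."""
    if not chunk:
        return 0.0
    return sum(base in "GC" for base in chunk) / len(chunk)

def parallel_gc(sequence, chunk_size=4, workers=4):
    # Split the sequence into chunks and analyze them concurrently,
    # mirroring (in miniature) fanning work out across cloud machines.
    chunks = [sequence[i:i + chunk_size] for i in range(0, len(sequence), chunk_size)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        per_chunk = list(pool.map(gc_content, chunks))
    # A length-weighted average of chunk results recovers the
    # whole-sequence figure exactly.
    total = sum(f * len(c) for f, c in zip(per_chunk, chunks))
    return total / len(sequence)

print(parallel_gc("ATGCGGCCAATTGCGC"))  # 0.625
```

The key property, as in the cluster setting, is that the per-chunk work is independent, so adding workers shortens wall-clock time without changing the answer.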

Learn how more customers, like the American Heart Association, the National Institutes of Health, and Harvard Medical School, use the AWS Cloud to revolutionize our understanding of disease and develop novel approaches to diagnosis and treatment.

Good luck to all participants!

2016: A Year of Innovations (powered by the cloud)

Two thousand sixteen saw global moments impacted by the cloud: presidential elections, virus outbreak response, and even traffic management after the Chicago Cubs won the World Series! As the headlines took note of how these events impacted the lives of people across the globe, cloud computing was working behind the scenes to keep these technology services always on, accessible, and easy to use. Whether for downloading images of the surface of Mars or plotting out the best route to work, the AWS Cloud helped global governments, educational institutions, and nonprofits innovate to deliver better services to citizens and students.

Take a look back at 2016 and how the AWS Cloud powered innovations in policing, health, smart cities, education, and more.

Cloud-Powered Policing

Law enforcement agencies depend on AWS for solutions across the AWS Partner Network (APN) to connect their communities and improve public safety. These solutions provide first responders with real-time data, making often difficult situations as transparent as possible before police arrive on the scene.

For example, sensor technology alerts police officers when gunshots are detected, providing visibility that improves officer safety in the field. Read more about the future of policing in these blogs.

Cloud-Powered Health

Public Health officials use the AWS Cloud to build healthier communities. The cloud aids initiatives, like monitoring air and water quality or epidemic management, with the data needed to protect citizens. Smaller, more citizen-engaged projects, like assisted living, elderly care, and wearable health devices, help medical personnel deliver the best care to their patients.

Learn more about cloud-enabled innovation in personalized medical treatment in this e-book.

Cloud-Powered Smart Cities

While a “smart city” can mean many things, what makes a city smart remains the same: data. In a mobile-driven world, AWS can help cities of all sizes gather, store, and distribute data in the AWS Cloud. Cities can then make data-driven decisions, modernizing programs that deliver measurable results for citizens.

For example, through open data and cloud technology, Transport for London (TfL) was able to deliver new services to the public, impacting the 24 million journeys taken daily across the Tube, buses, roads, trams, and freight network, and leading to improvements in reliability and customer experience as well as significant cost savings.

Cloud-Powered Education

The AWS Cloud impacts all corners of a campus and beyond. The cloud sparks education innovation by helping to reduce costs, improve service delivery, and increase student access to education. Explore the Campus on a Cloud map to learn how and where universities use the cloud every day.

AWS has over 7,000 education customers globally using the cloud to solve challenges, including disaster preparedness, scaling web applications during peak loads like enrollment or graduation, speeding time to research results, creating a cloud-ready next-generation workforce with AWS Educate, and improving student outcomes and persistence through learning analytics and big data analysis.

Beginning its cloud journey by moving its web environment to AWS, the University of Maryland, College Park focused on becoming a campus with no data centers. Watch this video to learn how the university uses Amazon WorkSpaces to give students and faculty access to software anytime, anywhere, from any device.

As we ring in 2017, we look forward to the innovations our customers will deliver in the New Year.

Check out more of our customer case studies here.

Top Five Blog Posts from the AWS Government, Education, & Nonprofit Blog for 2016

Thank you all for reading our blog this past year! From veterans and researchers to educators and engineers, our customers are changing the game in the public sector with the cloud.

To end the year and ring in 2017, we have compiled the top five most-read blog posts from 2016.

  1. Cloud Transformation Maturity Model: Guidelines to Develop Effective Strategies for Your Cloud Adoption Journey – The Cloud Transformation Maturity Model offers a guideline to help organizations develop an effective strategy for their cloud adoption journey. This model defines characteristics that determine the stage of maturity, transformation activities within each stage that must be completed to move to the next stage, and outcomes that are achieved across four stages of organizational maturity, including project, foundation, migration, and optimization. Where are you on your journey? Read the post to learn more.
  2. AWS Educate Now Available to U.S. Veterans – U.S.-based veterans, transitioning military personnel, and their spouses are eligible to create an AWS Educate account to get access to the resources needed to accelerate cloud-related learning endeavors to help power civilian career success.  From the frontline to the classroom, AWS is committed to prepping the next generation of IT and cloud professionals. Get started today.
  3. A Practical Guide to Cloud Migration – To achieve full benefits of moving applications to the AWS platform, it is critical to design a cloud migration model that delivers optimal cost efficiency. This includes establishing a compelling business case, acquiring new skills within the IT organization, implementing new business processes, and defining the application migration methodology to transform your business model from a traditional on-premises computing platform to a cloud infrastructure. A Practical Guide to Cloud Migration: Migrating Services to AWS provides a high-level overview of the cloud migration process based on the AWS Cloud Adoption Framework (CAF) and is a great first read for customers who are thinking about cloud adoption.
  4. AWS Offers Data Egress Discount to Researchers – The pace of research is no longer limited by the availability of computing resources. Researchers are beginning to rely on cloud computing to drive breakthrough science at breakneck speeds, and AWS wants to fuel the pace of new discoveries by making it possible for all scientists to have their very own supercomputers in the cloud. AWS has committed to making it easier for scientists to use its cloud storage, computing, and database services by waiving data egress fees for qualified researchers and academic customers; these are fees associated with “data transfer out from AWS to the Internet.”
  5. IRS 990 Filing Data Now Available as an AWS Public Dataset – We announced that over one million electronic IRS 990 filings are available via Amazon Simple Storage Service (Amazon S3). Filings from 2011 to the present are currently available and the IRS will add new 990 filing data each month. Collaborating with the IRS allows us to improve access to this valuable data.

See you all in the New Year!

What’s New for AWS Storage & Ingestion Services from re:Invent 2016

We hope you have had a chance to catch up on the security and compute services announced at re:Invent. Next up, we have the re:Invent updates on storage and ingestion that will benefit our public sector customers.

AWS Snowball Edge – Petabyte-scale Data Transfer with On-Board Compute

AWS Snowball Edge is our newest 100TB data transfer device, offering highly secure, on-board storage and in-flight compute capabilities with AWS Greengrass. Organizations can use AWS Snowball Edge to move massive amounts of data into and out of the AWS Cloud, use the device as a temporary storage tier for large local datasets, or seamlessly support edge workloads in remote or offline locations.

Snowball Edge connects to your organization’s existing applications and infrastructure using standard storage interfaces, streamlining the data transfer process, minimizing setup and integration, and helping ensure that the applications continue to run even when they are not able to access the cloud.

How does Snowball Edge accelerate data transfer to the cloud?

It has four times the network speed of the original AWS Snowball, built-in WiFi and cellular wireless communication, a Network File System (NFS) interface, and an Amazon S3-compatible endpoint. The device automatically encrypts all data stored. Encryption keys are managed with the AWS Key Management Service (KMS) and never stored on the device, ensuring that your most sensitive data is secure on site and in transit to AWS.

The AWS Snowball Edge device also comes with AWS Greengrass embedded, so you can execute AWS Lambda functions and process data locally, making it possible to collect and analyze sensor data streams, transcode multimedia content, compress images in real time, or run a local Amazon S3-compatible file server.
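
As an illustration of that local processing, here is a sketch of a Lambda-style handler of the kind Greengrass can run on the device: it summarizes a batch of sensor readings at the edge so only a compact aggregate needs to travel back to the cloud. The handler signature follows AWS Lambda's Python convention, but the event shape (a "readings" list) and the alert threshold are hypothetical.

```python
import json

def handler(event, context=None):
    """Lambda-style sketch: aggregate sensor readings locally at the edge.
    The event shape ({"readings": [{"sensor", "value"}, ...]}) is invented
    for this example."""
    readings = event["readings"]
    values = [r["value"] for r in readings]
    summary = {
        "count": len(values),
        "min": min(values),
        "max": max(values),
        "mean": sum(values) / len(values),
        # Flag out-of-range readings locally instead of shipping raw data.
        "alerts": [r["sensor"] for r in readings if r["value"] > 80.0],
    }
    return {"statusCode": 200, "body": json.dumps(summary)}

event = {"readings": [
    {"sensor": "temp-1", "value": 71.5},
    {"sensor": "temp-2", "value": 84.2},
    {"sensor": "temp-3", "value": 68.3},
]}
print(handler(event)["body"])
```

The payoff in a remote or offline location is the ratio: three raw readings in, one small summary out, with the anomaly ("temp-2") already flagged before any data leaves the site.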

How it works

Jobs are created right from the AWS Management Console. Once a job is created, AWS automatically ships a Snowball Edge device to you. When you receive the device, simply attach it to your local network and then connect your applications. Once the device is ready to be returned, the E Ink shipping label will automatically update to the correct AWS facility, and the job status can be tracked via Amazon SNS-generated text or email messages, or directly in the console.

AWS Snowball & HIPAA Compliance

AWS has expanded its HIPAA compliance program to include AWS Snowball, allowing you to transfer large amounts of data, including Protected Health Information (PHI), into and out of AWS securely and cost-effectively. Read the HIPAA Compliance whitepaper.

AWS Snowmobile – Move Exabytes of Data to the Cloud in Weeks, Not Years

Even with high-end connections, moving petabyte and exabyte-scale data to the cloud is challenging. Now migrating financial and regulatory records, scientific archives, and satellite imagery to the cloud won’t take years or decades. The AWS Snowmobile secure data truck stores up to 100 PB of data so customers can migrate data to the AWS Cloud in weeks.

Tamper-resistant AWS Snowmobile shipping containers attach to your network and appear as a local, NFS-mounted volume. Each AWS Snowmobile consumes about 350 kW of AC power and includes a network cable connected to a high-speed switch, capable of supporting 1 Tb/second of data transfer spread across multiple 40 Gb/second connections.
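
A quick back-of-the-envelope calculation shows why that bandwidth matters: even at the ideal sustained line rate (ignoring real-world overhead, which would stretch this out), filling a 100 PB Snowmobile takes on the order of days rather than the years a typical internet connection would need.

```python
# How long would filling a 100 PB Snowmobile take at the full 1 Tb/s?
# (Ideal sustained rate; actual transfers would be slower.)
PETABYTE_BITS = 8 * 10**15           # 1 PB = 10^15 bytes = 8 * 10^15 bits
capacity_bits = 100 * PETABYTE_BITS  # 100 PB payload
line_rate_bps = 10**12               # 1 Tb/second

seconds = capacity_bits / line_rate_bps
days = seconds / 86_400
print(f"{days:.1f} days")  # about 9.3 days at the ideal line rate
```

For comparison, the same 100 PB over a dedicated 1 Gb/s internet link would take roughly a thousand times longer, i.e. on the order of 25 years, which is the gap Snowmobile is built to close.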

Snowmobile also incorporates multiple layers of logical and physical protection, including chain-of-custody tracking, 24/7 video surveillance, and GPS tracking with cellular or satellite connectivity back to AWS. AWS Snowmobile uses 256-bit encryption with keys managed through the AWS Key Management Service (KMS). We can even arrange for a security vehicle escort when the AWS Snowmobile is in transit and dedicated security guards while your AWS Snowmobile is on-premises.

New Amazon S3 Features

  • Amazon S3 CloudWatch Metrics – Understand and improve the performance of your applications that use Amazon S3 by monitoring and alarming on 13 new Amazon S3 CloudWatch metrics. For web and mobile applications that depend on cloud storage, these metrics allow you to quickly identify and act on operational issues. You can receive one-minute Amazon S3 CloudWatch Metrics, set CloudWatch alarms, and access CloudWatch dashboards to view real-time operations and performance.
  • Amazon S3 Object Tagging – With S3 Object Tagging, you can manage and control access for Amazon S3 objects. Amazon S3 Object Tags are key-value pairs applied to Amazon S3 objects which can be created, updated, or deleted at any time during the lifetime of the object. You’ll also have the ability to create Identity and Access Management (IAM) policies, set up Amazon S3 Lifecycle policies, and customize storage metrics.
  • Amazon S3 Analytics, Storage Class Analysis – With storage class analysis, you can analyze and visualize storage access patterns and transition the right data to the right storage class, optimizing costs. You can configure a storage class analysis policy to monitor an entire bucket, a prefix, or an object tag. This new Amazon S3 Analytics feature automatically identifies the optimal lifecycle policy to help you transition less frequently accessed storage to Standard - Infrequent Access (Standard-IA) and save.
  • Amazon S3 Inventory – You can simplify and speed up business workflows and big data jobs using Amazon S3 Inventory, which provides a scheduled alternative to Amazon S3’s synchronous List API. Amazon S3 Inventory provides a CSV (Comma Separated Values) flat-file output of your objects and their corresponding metadata on a daily or weekly basis for an Amazon S3 bucket or a shared prefix.
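To make the tagging and lifecycle features concrete, here is a minimal sketch of the request payloads they use. The tag keys, rule ID, and values are hypothetical; the dict shapes match what boto3's put_object_tagging and put_bucket_lifecycle_configuration calls accept.

```python
# Key-value tags to apply to an S3 object (hypothetical tag names).
tagging = {
    "TagSet": [
        {"Key": "project", "Value": "benefits-portal"},
        {"Key": "phi", "Value": "false"},
    ]
}

# Lifecycle rule that transitions matching objects to
# Standard-Infrequent Access after 30 days, filtered by tag.
lifecycle = {
    "Rules": [
        {
            "ID": "archive-cold-data",
            "Status": "Enabled",
            "Filter": {"Tag": {"Key": "phi", "Value": "false"}},
            "Transitions": [
                {"Days": 30, "StorageClass": "STANDARD_IA"},
            ],
        }
    ]
}

def validate_tag_set(tag_set):
    """Local sanity check: S3 allows at most 10 tags per object."""
    assert len(tag_set) <= 10, "S3 allows up to 10 tags per object"
    return all("Key" in t and "Value" in t for t in tag_set)

print(validate_tag_set(tagging["TagSet"]))
```

Building and validating these structures locally lets you review access and retention policy before sending anything to AWS.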


New Amazon EFS (Elastic File System) Features

Amazon EFS (Elastic File System) offers file storage for use with Amazon EC2 instances and allows you to access file data from on-premises datacenters. You can now migrate file data between on-premises datacenters and Amazon EFS, supporting cloud-bursting workloads and backups to the cloud, when connected to your Amazon VPC with AWS Direct Connect.

New AWS Storage Gateway Provides File Interface to Objects in Amazon S3 Buckets

AWS Storage Gateway now provides a virtual on-premises file server, which enables you to store and retrieve Amazon S3 objects through standard file storage protocols. With file gateway, existing applications or devices can use secure and durable cloud storage without modification. File gateway simplifies moving data into Amazon S3 for in-cloud workloads, provides cost-effective storage for backup and archive workloads, or expands your on-premises storage into the cloud.

File gateway is available as a virtual machine image which you download from the AWS Management Console. To start using the new AWS Storage Gateway, click here.

Contact us to get started today with these new services.

What’s New for AWS Compute Services from re:Invent 2016

We recently recapped the security and compliance updates announced at this year’s re:Invent that are important to our public sector customers. AWS also expanded upon its core foundational services – compute and storage – by announcing new game-changing services and special features.

Check out the compute updates below and our follow-up post covering the storage services that every government, education, and nonprofit organization should know about, helping them focus more of their time and resources on their core missions.

Amazon Lightsail – The Easiest Way to Get Started

Amazon Lightsail brings you the power of the AWS Cloud with the simplicity of a virtual private server (VPS). AWS components, such as servers, storage, and IP addresses, are automatically assembled in just a few clicks. You can choose a configuration from a menu and launch a virtual machine preconfigured with SSD-based storage, DNS management, and a static IP address. Launch your favorite operating system, developer stack, or application with flat-rate pricing starting at $5 per month. As your organization’s needs grow or change, simply connect to additional AWS database, messaging, and content distribution services without disruption.

Amazon EC2 Compute Services

In addition to the expansion of high I/O, compute-optimized, memory-optimized, and burstable EC2 compute instances, AWS now offers hardware acceleration with FPGA-based computing and elastic, add-on Graphical Processing Units (GPUs) that enhance EC2 instance capabilities.

  • Elastic GPUs – Add high performance graphics acceleration to existing EC2 instance types at a fraction of the cost of stand-alone graphics instances, with your choice of 1 GiB to 8 GiB of GPU memory and compute power to match. The Amazon-optimized OpenGL library automatically detects and makes use of Elastic GPUs, which are ideal if you need a small amount of GPU for graphics acceleration or have applications that could benefit from some GPU but also require high amounts of compute, memory, or storage.
  • F1 Instances – F1 instances give you access to programmable hardware known as a Field-Programmable Gate Array (FPGA) which can speed up many compute-intensive workloads by up to 30 times (e.g. HPC, genomics, encryption, and risk analysis workloads). F1 instances are easy to program and come with everything you need to develop, simulate, debug, and compile your hardware acceleration code.
  • R4 Instances – The next generation of R4 instances is designed for memory-intensive Business Intelligence and database applications and offers up to 488 GiB of memory. Instances are available in six sizes, with up to 64 vCPUs.
  • New T2 Instance Sizes – T2 instances offer great price performance for general purpose workloads, such as application servers, web servers, development environments, and small databases, or where you need to use the full CPU on a consistent basis. We’ve added the t2.xlarge (16 GiB of memory) and the t2.2xlarge (32 GiB of memory).
  • Coming Soon: C5 Instances – C5 instances will be based on Intel’s brand new Xeon “Skylake” processor, running faster than the processors in any other EC2 instance. As the successor to Broadwell, Skylake supports AVX-512 for machine learning inference, multimedia, scientific, and video processing, which require superior support for floating point calculations. Instances will be available in six sizes, with up to 72 vCPUs and 144 GiB of memory. Coming in early 2017.
  • Coming Soon: I3 Instances – I3 instances are equipped with fast, low-latency, and Non Volatile Memory Express (NVMe)-based Solid State Drives, to meet the needs of the most demanding I/O intensive relational and NoSQL databases, transactional, and data analytics workloads. They’ll deliver up to 3.3 million random IOPS at a 4 KB block size and up to 16 GB/second of disk throughput. Available in 2017.
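The instance lineup above can be summarized as a simple starting-point lookup. The workload categories and family picks below are an illustrative simplification for demonstration, not official AWS sizing guidance:

```python
# Illustrative mapping from workload profile to the EC2 instance
# families described above (a simplification, not a sizing guide).
INSTANCE_FAMILY_FOR_WORKLOAD = {
    "memory_intensive_database": "r4",   # up to 488 GiB of memory
    "general_purpose_web": "t2",         # burstable; t2.xlarge / t2.2xlarge
    "fpga_acceleration": "f1",           # programmable hardware (FPGA)
    "graphics_light": "elastic_gpu",     # attach 1-8 GiB of GPU memory
    "io_intensive_nosql": "i3",          # NVMe SSDs, millions of IOPS
    "compute_intensive": "c5",           # Skylake with AVX-512
}

def suggest_family(workload: str) -> str:
    """Return a starting-point instance family for a workload profile."""
    try:
        return INSTANCE_FAMILY_FOR_WORKLOAD[workload]
    except KeyError:
        raise ValueError(f"unknown workload profile: {workload}")

print(suggest_family("memory_intensive_database"))
```

In practice you would benchmark candidate families against your actual workload before committing.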

IPv6 Support for Amazon EC2

Amazon EC2 instances in Amazon Virtual Private Cloud (VPC) now offer native support for the IPv6 protocol. Internet Protocol Version 6 (IPv6) is a new version of the Internet Protocol that uses a larger address space than its predecessor, IPv4. IPv6 support allows your organization to meet mandated requirements and removes the need for IPv6 to IPv4 translation software or systems.

With IPv6 enabled in a VPC, applications can be secured in the same easy manner available today through security groups, network ACLs, and route tables. VPCs can now operate in a dual-stack mode with the ability to assign both IPv4 and IPv6 addresses on EC2 instances.
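A quick way to see why the larger IPv6 address space removes the need for translation is to compare the two spaces with Python's standard ipaddress module (the VPC CIDR shown is a hypothetical example; VPCs receive a /56 IPv6 block and each subnet a /64):

```python
import ipaddress

# The entire IPv4 and IPv6 address spaces.
ipv4_space = ipaddress.ip_network("0.0.0.0/0")   # 2**32 addresses
ipv6_space = ipaddress.ip_network("::/0")        # 2**128 addresses
print(ipv4_space.num_addresses)
print(ipv6_space.num_addresses)

# Hypothetical example: a /64 subnet carved from a VPC's /56 block.
vpc_block = ipaddress.ip_network("2600:1f16:abc::/56")
subnet = ipaddress.ip_network("2600:1f16:abc::/64")
print(subnet.subnet_of(vpc_block))
```

With that much address space, every instance can hold a globally unique IPv6 address alongside its IPv4 address in dual-stack mode.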

AWS Batch for ‘Big Compute’ Workloads

Researchers, scientists, and developers with parallel compute-intensive workloads can now avoid the challenge of buying and building clusters or waiting in job queues on-premises. AWS Batch offers fully managed batch compute capabilities with usage-based pricing, enabling Big Compute and HPC jobs to dynamically scale up and down in response to changing needs—without the heavy lift and costs of provisioning, managing, and maintaining clusters.
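As a sketch of what AWS Batch manages for you, here is the kind of container job definition it schedules. The job name, image, and command are hypothetical placeholders; the dict shape mirrors what boto3's batch.register_job_definition accepts.

```python
# Hypothetical AWS Batch job definition; with boto3 this dict would be
# passed as batch.register_job_definition(**job_definition).
job_definition = {
    "jobDefinitionName": "genomics-alignment",   # hypothetical job name
    "type": "container",
    "containerProperties": {
        "image": "example/aligner:latest",       # hypothetical image
        "vcpus": 4,
        "memory": 8192,                          # MiB
        "command": ["align", "--input", "s3://example-bucket/reads.fastq"],
    },
}
print(job_definition["jobDefinitionName"])
```

AWS Batch then queues submitted jobs against this definition and scales the underlying compute up and down on your behalf.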

EC2 Systems Manager

Amazon EC2 Systems Manager is a management service that makes it simple and seamless for customers to manage their cloud and hybrid cloud environments by extending the elasticity and agility of the cloud into on-premises data centers. EC2 Systems Manager collects software inventory, applies OS patches, creates system images, configures Windows and Linux operating systems, and performs remote administration across your Amazon EC2 and on-premises systems. And it lets you record and govern your EC2 instance’s software configuration with AWS Config.
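For example, remote administration with Run Command boils down to parameters like the following (the instance ID is a hypothetical placeholder; with boto3 this dict would be passed as ssm.send_command(**params)):

```python
# Hypothetical Run Command request: patch a single instance with the
# built-in AWS-RunShellScript document.
params = {
    "InstanceIds": ["i-0123456789abcdef0"],      # hypothetical instance ID
    "DocumentName": "AWS-RunShellScript",
    "Parameters": {"commands": ["yum -y update"]},
}
print(params["DocumentName"])
```

The same document-based model works against on-premises machines once they are registered with Systems Manager.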

Blox – Open Source Schedulers for Amazon ECS

Blox is a collection of open source software that enables customers to build custom schedulers and integrate third-party schedulers on top of Amazon ECS (Elastic Container Service), which helps you build, run, and scale Docker-based applications, with the scheduler assigning tasks to Amazon EC2 instances. Blox consumes the Amazon ECS event stream, uses it to track the state of the cluster, and makes the state accessible via a set of REST APIs.
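Conceptually, a Blox-style scheduler consumes ECS task state-change events and maintains a local view of the cluster. The tracker below is an illustrative simplification; the event field names mirror the ECS event stream's detail fields (taskArn, lastStatus), but the task ARNs shown are hypothetical.

```python
# Minimal sketch of tracking cluster state from an ECS-style event stream.
class ClusterState:
    def __init__(self):
        self.tasks = {}   # taskArn -> lastStatus

    def handle_event(self, event: dict) -> None:
        """Update the local view from one task state-change event."""
        detail = event["detail"]
        arn, status = detail["taskArn"], detail["lastStatus"]
        if status == "STOPPED":
            self.tasks.pop(arn, None)    # task left the cluster
        else:
            self.tasks[arn] = status

    def running_tasks(self):
        return [arn for arn, s in self.tasks.items() if s == "RUNNING"]

state = ClusterState()
state.handle_event({"detail": {"taskArn": "arn:task/1", "lastStatus": "RUNNING"}})
state.handle_event({"detail": {"taskArn": "arn:task/2", "lastStatus": "PENDING"}})
state.handle_event({"detail": {"taskArn": "arn:task/1", "lastStatus": "STOPPED"}})
print(state.running_tasks())
```

A custom scheduler would query this state through Blox's REST APIs to decide where to place the next task.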

Check back soon for our storage announcement recap post.