Category: government


Announcing the 2017 City on a Cloud Finalists

This year’s City on a Cloud Innovation Challenge saw a record number of applications from 15 different countries around the globe. Nominations included cities, counties, municipalities, and, for the first time, school districts using the cloud to improve the lives of citizens.

Click here to learn more about the finalists and their projects, including real-time transportation applications, public safety services, and GIS projects, all using the AWS Cloud. Thank you to all who applied!

We will announce the winners at the AWS Public Sector Summit, June 12-14 in Washington, DC! Join us there as winners are named in the Best Practices, Dream Big, and Partners in Innovation categories.

Congratulations to all of the finalists!

Best Practices: The Best Practices Award will be granted to a local or regional government leader, or a public or private school or district, that has deployed an innovative solution to solve a government challenge.

  • Alameda County
  • Benton Police
  • Boulder County
  • Caltrans (with Connected Corridors and UC Berkeley)
  • Charlottesville City Public Schools
  • City of Frederick, MD
  • City of San Diego
  • Douglas County GIS
  • Douglas Omaha Technology Commission
  • Intermediate School District 287
  • Lawrence Police Department
  • Liverpool City Council
  • New York Public Library
  • Region of Waterloo – Grand River Transit
  • Solodev and Seminole Public Schools
  • South Australia Government
  • Transport for London

Dream Big: The Dream Big Award will be granted to a city or school that has a great idea it would like to implement.

  • Alton Central Schools
  • Beyond Machine to Machine Communications
  • City of Florence/ pdensity
  • City of Iowa City
  • City of Las Vegas
  • City of Ottawa
  • City of Plano
  • City of Salinas
  • City of Upper Arlington
  • City of West Sacramento
  • Louisville Metro Government
  • Marmion Academy
  • Santa Ana Unified School District
  • Seattle Public Schools
  • Tulsa Public Schools
  • WSIPC

Partners in Innovation: The Partners in Innovation Award will be granted to a technology partner that has deployed an innovative solution to solve a government or teaching and learning challenge.

  • Acivilate
  • Anthemis Technologies
  • Bang the Table
  • BeeToBit
  • Blue Spurs
  • CityGrows
  • FastFit
  • LearnZillion
  • See.Sense
  • Stroud Water Research Center
  • Tolemi
  • Universidad de Sevilla
  • Visual Labs
  • Xaqt

Learn more about the City on a Cloud Innovation Challenge here.


Join Intel and AWS at the 2017 AWS Public Sector Summit for over 100 learning opportunities. Register today!

New AWS Training Bootcamps to Help You Build Technical Skills at the AWS Public Sector Summit

New to the AWS Public Sector Summit this year in Washington, DC: you can choose from four full-day bootcamps offered on Monday, June 12th.

AWS Training Bootcamps are full-day training sessions that offer you a chance to learn about AWS services and solutions through immersive exercises and hands-on labs. Delivered by experienced AWS Instructors and Solution Architects, these bootcamps allow you to work directly with AWS knowledge experts to get your questions answered.

Choose one of the four below:

  • AWS Technical Essentials – Audience Level: Introductory – AWS Technical Essentials is a one-day, introductory-level bootcamp that introduces you to AWS products, services, and common solutions. It provides you with fundamentals to become more proficient in identifying AWS services so that you can make informed decisions about IT solutions based on your business requirements and get started working on AWS. Learn more.
  • Secrets to Successful Cloud Transformations – Audience Level: Introductory – Secrets to Successful Cloud Transformations is a one-day, introductory-level bootcamp that teaches you how to select the right strategy, people, migration plan, and financial management methodology needed when moving your workloads to the cloud. This course provides guidance on how to build a holistic cloud adoption plan and how to hire people who will execute that plan. You will learn best practices for choosing workloads to migrate from your on-premises environment to AWS. In addition, you will learn best practices for managing your AWS expenses and dealing with internal chargebacks. Learn more. Note: This course focuses on the business, rather than the technical, aspects of cloud transformation.
  • Building a Serverless Data Lake – Audience Level: Advanced – Building a Serverless Data Lake is a one-day, advanced-level bootcamp designed to teach you how to design, build, and operate a serverless data lake solution with AWS services. The bootcamp will include topics such as ingesting data from any data source at large scale, storing the data securely and durably, enabling the capability to use the right tool to process large volumes of data, and understanding the options available for analyzing the data in near-real time. Learn more.
  • Running Container-Enabled Microservices on AWS – Audience Level: Expert – Running Container-Enabled Microservices on AWS is a one-day, expert-level bootcamp that provides an in-depth, hands-on introduction to managing and scaling container-enabled applications. This full-day bootcamp provides an overview of container and microservice architectures. You will learn how to containerize an example application and architect it according to microservices best practices. Hands-on labs that feature the AWS container-focused services show you how to schedule long-running applications and services, set up a software delivery pipeline for your microservices application, and implement elastic scaling of the application based on customer load. Learn more.

All students must bring their own devices (a dual-core processor with 4 GB of RAM is required). Each bootcamp is $600 and must be reserved in advance. Enter the code PSBOOT100 to get $100 off your ticket. Space is limited, so save your spot!

How to Achieve AWS Cloud Compliance with AWS, Allgress, and CloudCheckr

Assessing and measuring compliance requirements can be a full-time job. To mitigate risks, organizations must plan for cloud-based risk treatments, reporting and alerts, and automated responses to maintain security and compliance, as well as modernize their governance at scale.

AWS and its AWS Partner Network (APN) security partners are developing security and compliance tools to enable customer security capabilities and architecture approaches for meeting and implementing advanced security competencies on AWS. For example, Allgress and CloudCheckr are working together to solve security and compliance challenges and provide greater transparency into which tool, service, and partner solutions should be used to manage security, continuously treat risk, and automate cloud services.

The Regulatory Product Mapping Tool (RPM) was developed to reduce complexity, increase speed, and shorten the timeframe to develop compliant architectures on AWS. The RPM tool interactively maps FedRAMP (NIST 800-53) controls to AWS services and APN solutions, presenting a visual representation of all the FedRAMP R4 Moderate controls: the inner ring displays the domains and the outer ring displays the sub-domains. By clicking on the slices within the interactive RPM tool, customers can review the AWS inherited, shared, and associated Technology and Consulting Partner controls. Try it using the guest login here.

You can also map and align AWS Technology Partner solutions to controls and provide detailed control treatments. This can be used to document, configure, and help automate security and compliance management. Additionally, partner solutions are directly linked to the AWS Marketplace.

AC-3 Access Enforcement – Control Treatment: CloudCheckr allows you to tag AWS accounts and create groups of AWS accounts. These groups are known in CloudCheckr as Multi-Account Views. You can also create a Multi-Account View that covers all of your AWS accounts. Follow the steps here to get your Multi-Account Views up and running. Once that is completed, best practice checks are pulled from all of the tagged AWS accounts into a single best practices report.

AU-5 Response to Audit Processing Failures – Control Treatment: AWS CloudTrail provides activity monitoring capability for the AWS management plane. CloudTrail records every call into the AWS API. Any activity in AWS is recorded into the CloudTrail logs. CloudTrail logs are written into an S3 bucket as JSON files. A separate file is written every five minutes. Additionally, a different file is created for each AWS account and each region. The CloudTrail UI provides basic functionality to look up events for up to seven days. One of the easiest ways to keep track of your CloudTrail configuration is by using the CloudCheckr best practice checks.
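If you prefer to script that audit check rather than click through the console, the same seven-day event lookup is exposed through the CloudTrail API. Below is a minimal boto3 sketch; the region and the ConsoleLogin event filter are illustrative assumptions, not part of the CloudCheckr workflow described above.

```python
import boto3

# CloudTrail's LookupEvents API covers the same ~7-day window as the console.
cloudtrail = boto3.client("cloudtrail", region_name="us-east-1")  # region is an assumption

response = cloudtrail.lookup_events(
    LookupAttributes=[
        # Example filter: console sign-in events; swap in any recorded API call name.
        {"AttributeKey": "EventName", "AttributeValue": "ConsoleLogin"}
    ],
    MaxResults=10,
)

for event in response["Events"]:
    print(event["EventTime"], event["EventName"], event.get("Username", "-"))
```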

View the recorded webinar with AWS, Allgress, and CloudCheckr to learn how to achieve and demonstrate compliance in the cloud to satisfy the auditors, streamline reporting of technical and non-technical controls, and improve workflow across your key stakeholders.

A Guide to Backup and Recovery in the DoD

As the growth of Department of Defense (DoD) data accelerates, the task of protecting it becomes more challenging. Questions about the durability and scalability of backup methods are commonplace, including this one: How does the cloud help meet my backup and archival needs?

The mission-critical nature of data within the DoD means business continuity planning must ensure that tech infrastructure and systems continue to operate, or recover quickly, despite serious disasters. Currently, defense agencies may be backing up to tape, sending data to a base or contractor site, or sending it to a third party to distribute and store, with little control and significant expense. Then, when it is time to restore, it can take weeks to recover petabytes of data.

With the AWS Cloud, those weeks to recover the data can be reduced to hours by using Amazon Simple Storage Service (Amazon S3) or Amazon Glacier for long-term backup. DoD backup data can reside in any AWS Region in the US, not only reducing costs but also reducing the requirements for backup connectivity.
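As a rough illustration of that long-term backup pattern, the boto3 sketch below uploads a backup archive to S3 and attaches a lifecycle rule that transitions objects to Amazon Glacier after 30 days. The bucket name, prefix, and 30-day window are assumptions for the example, not DoD guidance.

```python
import boto3

s3 = boto3.client("s3")  # in practice, point this at the appropriate region, e.g. AWS GovCloud (US)

BUCKET = "example-dod-backups"  # hypothetical bucket name
s3.upload_file("nightly-backup.tar.gz", BUCKET, "backups/nightly-backup.tar.gz")

# Transition everything under the backups/ prefix to Glacier after 30 days.
s3.put_bucket_lifecycle_configuration(
    Bucket=BUCKET,
    LifecycleConfiguration={
        "Rules": [
            {
                "ID": "archive-to-glacier",
                "Filter": {"Prefix": "backups/"},
                "Status": "Enabled",
                "Transitions": [{"Days": 30, "StorageClass": "GLACIER"}],
            }
        ]
    },
)
```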

Public sector organizations are using the AWS Cloud to enable faster DR of their critical IT systems without incurring the infrastructure expense of a second physical site. The AWS Cloud supports many popular DR architectures from “pilot light” environments that are ready to scale up at a moment’s notice to “hot standby” environments that enable rapid failover. Learn more about how to rapidly recover mission-critical systems in a disaster here.

Where to start?

When you develop a comprehensive strategy for backing up and restoring data, you must first identify the failure or disaster situations that may occur and their potential mission impact. Within the DoD, you must also consider regulatory requirements for data security, privacy, and records retention.

Read below for steps to get started with disaster recovery:

  • Start somewhere and scale up: Choose what needs to fail over and what does not. Some things may be more important than others, and some may still be working. A hybrid architecture approach can be an option based on who the mission owner is, the application, connectivity, and the Impact Level. Depending on the backup solution, you could archive to AWS while maintaining recent backups on-premises.
  • Increase your security posture in the cloud: AWS provides a number of options for access control and encrypting data in transit and at rest.
  • Meet compliance requirements: Data custody and integrity must be maintained. The DoD Cloud Computing Security Requirements Guide (CC SRG) lays out the framework for data classification and how cloud providers and DoD agencies must work to control access. The AWS Cloud meets Impact Level 2 (IL-2) for all CONUS regions and has a PATO for IL-4, with waivers for IL-5, in the AWS GovCloud (US) Region. This allows DoD mission owners to continue to leverage AWS for their mission-critical production applications.
  • Test the system: DR plans often go untested until a major change to the system forces documentation updates. With AWS, you can test whether a backup succeeded by spinning up the backed-up data, validating it, and comparing it to the existing on-premises environment; a minimal validation sketch follows this list.
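Here is the minimal validation sketch referenced in the last step, assuming a hypothetical bucket, key, and local source path: pull the backed-up object down and compare its SHA-256 digest against the on-premises original.

```python
import boto3
import hashlib

def sha256_of(path):
    """Compute the SHA-256 digest of a local file in streaming fashion."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1024 * 1024), b""):
            digest.update(chunk)
    return digest.hexdigest()

s3 = boto3.client("s3")
# Hypothetical bucket/key from the backup sketch above.
s3.download_file("example-dod-backups", "backups/nightly-backup.tar.gz", "/tmp/restored.tar.gz")

# The restore "test" is simply: does the restored copy match the source?
assert sha256_of("/tmp/restored.tar.gz") == sha256_of("/data/nightly-backup.tar.gz"), "backup mismatch"
print("Backup validated: restored copy matches the on-premises original.")
```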

In the field, backing up to Amazon S3

AWS works with many of the industry-leading backup and recovery solution providers and backup storage manufacturers. This makes backing up to the cloud even easier by providing direct targeted access via API calls to AWS Cloud storage solutions. Many of these solutions can also help to instantiate backup data tests or entire DR environments in minutes.

For example, defense teams are leveraging CommVault media servers that point to a NetApp AltaVault appliance as an on-premises caching mechanism. The AltaVault uses S3 API calls to push the backups to S3 buckets in the AWS GovCloud (US) Region. One customer's media servers were able to target multiple storage solutions to test the best-case scenario, pushing backups to their existing tape library and to the AltaVault appliance and S3 simultaneously. S3 was determined to be the lowest-cost solution for long-term data storage. This approach eliminated the need for a tape library hardware refresh, as well as the need for off-site tape set rotations, resulting in cost savings and operational improvements.

Download our “Backup and Recovery Approaches Using AWS” whitepaper here for the technical steps agencies take to get started today.


Whether you are interested in backup and recovery, security, or DevOps, there is something for everyone at the AWS Public Sector Summit June 12-14 in Washington, DC. Join Telos and AWS, and register today!

AWS Lambda Is Now Available in the AWS GovCloud (US) Region

Serverless Computing Tailored for Regulated IT Workloads and Sensitive Controlled Unclassified Information (CUI) Data

AWS Lambda, a serverless compute service, is now available in the AWS GovCloud (US) Region, Amazon’s isolated cloud region built for sensitive data and regulated workloads.

Lambda now enables developers to run code in AWS GovCloud (US) without provisioning or managing servers. Lambda executes your code only when needed and scales automatically, from a few requests per day to thousands per second. You pay only for the compute time you consume; there is no charge when your code is not running. With Lambda, you can run code for virtually any type of application or backend service – all with zero administration.

You can use Lambda to run your code in response to events, such as:

  • Changes to data in an Amazon S3 bucket or an Amazon DynamoDB table.
  • Direct invocation of your code through API calls made with the AWS SDKs.

With these capabilities, you can use Lambda to easily build data processing triggers for AWS services such as S3 and DynamoDB, process streaming data stored in Amazon Kinesis, or create your own backend that operates at AWS scale, performance, and security.
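To make the S3 trigger concrete, below is a minimal Python handler of the kind you could deploy to Lambda. The processing inside the loop is a placeholder; the event structure is what S3 delivers for object-created notifications.

```python
import urllib.parse

def lambda_handler(event, context):
    """Invoked by S3 for each object-created notification on the configured bucket."""
    for record in event["Records"]:
        bucket = record["s3"]["bucket"]["name"]
        # S3 URL-encodes object keys in event payloads, so decode before use.
        key = urllib.parse.unquote_plus(record["s3"]["object"]["key"])
        # Placeholder processing step: replace with validation, transformation, indexing, etc.
        print("Processing new object: s3://{}/{}".format(bucket, key))
    return len(event["Records"])
```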

How does Lambda work?

Upload your code as Lambda functions and Lambda takes care of everything required to run and scale your code with high availability. Lambda seamlessly deploys your code, runs your code on a high-availability compute infrastructure, and performs all of the administration of the compute resources. This includes server and operating system maintenance, capacity provisioning and automatic scaling, and code monitoring and logging through Amazon CloudWatch. All you need to do is supply your code in one of the languages that Lambda supports (currently Node.js, Java, C#, and Python).

Lambda allows your code to access other AWS services securely through its built-in AWS SDK and integration with AWS Identity and Access Management (IAM).

By default, Lambda runs your code within a secure Virtual Private Cloud (VPC) managed by the Lambda service. You can optionally configure Lambda to access resources within your own VPC, allowing you to leverage custom security groups and network access control lists to give your Lambda functions access to resources inside that VPC.

Examples of Lambda in Use – NASA’s Jet Propulsion Laboratory (JPL) & The Financial Industry Regulatory Authority (FINRA)

JPL is a known innovator in space exploration, and much of the data JPL uses is sensitive. Running Lambda in AWS GovCloud (US) will allow JPL to use AWS IoT and serverless computing for numerous mission workloads. This will enable JPL to significantly reduce costs and run effortlessly at huge scale as it searches for answers to the big questions: life in space, finding Earth 2.0, protecting Earth, and more.

FINRA used AWS Lambda to build a serverless data processing solution that enables them to perform half a trillion data validations on 37 billion stock market events daily. “We found that Lambda was going to provide us the best solution, for this serverless cloud solution. With Lambda, the system was faster, cheaper, and more scalable. So at the end of the day, we’ve reduced our costs by over 50% … and we can track it daily, even hourly,” said Tim Griesbach, Senior Director, FINRA, in his 2016 re:Invent talk. Regardless of data volume, any file is available in under one minute. And, they have less infrastructure to manage now.

How to get started

  1. Create an AWS account and select AWS GovCloud (US) as your region in the AWS Management Console.
  2. Choose Lambda in the AWS Management Console, and create your function by uploading your code (or building it right in the Lambda console) and choosing the memory, timeout, and IAM role (a scripted alternative is sketched after these steps).
  3. Specify the AWS resource and event to trigger the function, such as a particular S3 bucket, DynamoDB table, or Kinesis stream.
  4. When the resource generates the appropriate event, Lambda runs your function and manages the necessary computing resources to keep up with incoming requests.
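These steps can also be scripted. Below is a hedged boto3 sketch that creates a function in the AWS GovCloud (US) region; the function name, role ARN, zip file, and runtime are placeholders to substitute with your own values.

```python
import boto3

# AWS GovCloud (US) uses its own region name and ARN partition (aws-us-gov).
lam = boto3.client("lambda", region_name="us-gov-west-1")

with open("function.zip", "rb") as f:
    code_bytes = f.read()

response = lam.create_function(
    FunctionName="example-govcloud-function",  # hypothetical name
    Runtime="python3.6",                       # one of the supported runtimes
    Role="arn:aws-us-gov:iam::123456789012:role/lambda-exec",  # placeholder role ARN
    Handler="app.lambda_handler",
    Code={"ZipFile": code_bytes},
    MemorySize=128,
    Timeout=30,
)
print(response["FunctionArn"])
```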

Learn more about AWS GovCloud (US) here or contact the AWS GovCloud (US) team with any questions.

The Importance and Necessity of Modernizing Government

Federal agencies are faced with increasing pressure to modernize their aging information technology systems and data centers. From global cybersecurity attacks to the constant pressure to do more with less, agencies must act to migrate to secure and modern IT systems. Today, the U.S. House of Representatives took an important step in providing agencies the necessary funding tools to modernize outdated federal systems by passing the Modernizing Government Technology (MGT) Act.

“Technology is evolving at a rapid pace and our citizens deserve a digital experience that keeps pace with innovation. We are pleased to see the bipartisan and bicameral Modernizing Government Technology Act, which enables the federal government to take advantage of commercial cloud services to lower IT costs, strengthen cybersecurity, and provide quick access to tremendous computing power and innovation,” said Teresa Carlson, Vice President, Worldwide Public Sector, AWS.

Cloud computing allows agencies to focus more on what matters most: their mission. Whether reducing wait times for veterans receiving healthcare, encouraging scientific research and space exploration, or providing first-rate education to the next generation, government has the ability to leverage technology to deliver better, faster, and more secure services to citizens.

“We applaud the numerous sponsors of the MGT Act for their leadership on this important piece of U.S. legislation and hope this will enable federal agencies to get the full benefit of commercial cloud services and emerging technologies,” said Teresa.

Learn more about how the cloud paves the way for innovation and supports world-changing projects in government here.

In Pursuit of a 1 Hour, $10 Genome Annotation

There are hundreds of scientists at the Smithsonian Institution who study just about every kind of life on earth, from animals and plants to fungi and bacteria. Since the initial publication of the human genome project in 2001, DNA sequencing technology has become more efficient and cost-effective, making it possible for individual biodiversity scientists to generate genome resources for their organisms of interest. These genomes can be the gateway to new research questions that were previously unanswerable.

Biodiversity genomics scientists face special challenges because they seek to understand genomes that range dramatically in size and complexity (e.g., some plant genomes are more than 10 times larger than the human genome). These scientists need agile software and hardware solutions that can be frequently updated to reflect the ever-increasing data behind algorithms and models.

To tackle these challenges, the Smithsonian’s Office of the Chief Information Officer recently established a Data Science Team, including Dr. Rebecca Dikow and Dr. Paul Frandsen. Part of their mission is to implement solutions that will accelerate science and lower the barrier to entry to genomics research, not only for Smithsonian scientists but for biodiversity researchers in general. Although many large institutions make computing resources available to their researchers, there are queue limits and significant costs to operating a high-performance computing cluster, and many smaller research institutions and universities may not have access to such resources at all.

Dikow and Frandsen are collaborating with AWS and Intel to improve a critical part of the genome analysis pipeline – annotation. Genome annotation is the process of identifying the locations of genes and other genomic features and determining their function, the first step in downstream applications of genomic data.

“Cloud technologies are a natural choice for annotation because different parts of a genome assembly (contigs or scaffolds) can be annotated in parallel, with the results being knitted together in a final step,” said Dikow. “The ability to scale up to many instances for brief periods will make annotation fast while remaining inexpensive.”

The Smithsonian’s Data Science Team is implementing existing annotation pipelines, such as MAKER (Cantarel et al., 2008) and WQ_MAKER (Thrasher et al., 2012), as well as developing their own using the workflow engine Toil. Toil supports the Common Workflow Language (CWL), which will allow the tools developed to be modular, portable, and scalable across thousands of AWS instances.

What makes these pipelines complex is the need to process each genome scaffold with multiple software tools in turn and to keep track of thousands of intermediate files and any failed tasks. The team has successfully implemented the first step in the annotation pipeline in Toil, which includes masking genome repeats with RepeatMasker (Smit et al., 2015), across 10 c3.xlarge instances. As they continue to make progress in the coming months, their code will be available on GitHub.
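As a rough sketch of what that parallel masking step can look like in Toil's Python API (the file names and RepeatMasker invocation are illustrative; the team's actual pipeline is the version to be published on GitHub):

```python
import subprocess
from toil.common import Toil
from toil.job import Job

def mask_scaffold(job, scaffold_path):
    """Run RepeatMasker on a single scaffold FASTA file (illustrative invocation)."""
    subprocess.check_call(["RepeatMasker", scaffold_path])
    return scaffold_path + ".masked"

def mask_all(job, scaffold_paths):
    """Fan out one child job per scaffold so Toil can schedule them in parallel."""
    for path in scaffold_paths:
        job.addChildJobFn(mask_scaffold, path)

if __name__ == "__main__":
    parser = Job.Runner.getDefaultArgumentParser()
    options = parser.parse_args()
    scaffolds = ["scaffold_001.fa", "scaffold_002.fa"]  # hypothetical inputs
    with Toil(options) as toil:
        toil.start(Job.wrapJobFn(mask_all, scaffolds))
```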

Rebecca Dikow presented the team’s progress at the first Global Biodiversity Genomics “BioGenomics” conference, held in Washington, DC, and hosted by the Smithsonian Institution. The conference gathered more than 300 genome and biodiversity scientists and focused on the methods and analysis of biodiversity genome data. There was great interest in the annotation pipeline currently under development. Check out the “Improving genome annotation strategies for biodiverse species using cloud technologies” slide deck presented at the conference here.

Achieve Total Cost of Operation Benefits Using Cloud

A core reason organizations adopt a cloud IT infrastructure is to save money. The traditional approach of analyzing Total Cost of Ownership no longer applies when you move to the cloud. Cloud services provide the opportunity to use only what you need and pay only for what you use. We refer to this new paradigm as the Total Cost of Operation in our latest whitepaper, “Maximizing Value with AWS.” You can use Total Cost of Operation (TCO) analysis methodologies to compare the costs of owning a traditional data center with the costs of operating your environment using AWS cloud services.

Get started with these cost-saving tips and download the whitepaper for more details:

  1. Create a culture of cost management: All teams can help manage costs, and cost optimization should be everyone’s responsibility. There are many variables that affect cost, with different levers that can be pulled to drive operational excellence.
  2. Start with an understanding of current costs: Having a clear understanding of your existing infrastructure and migration costs and then projecting your savings will help you calculate payback time, estimate ROI, and maximize the value your organization gains from migrating to AWS.
  3. Select the right plan for specific workloads: Moving business applications to the AWS Cloud helps organizations simplify infrastructure management, deploy new services faster, provide greater availability, and lower costs.
  4. Employ best practices: AWS delivers a robust set of services specifically designed for the unique security, compliance, privacy, and governance requirements of large organizations.

With a technology platform that is both broad and deep, professional services and support organizations, training programs, and an ecosystem that is tens of thousands of partners strong, AWS can help you move faster and do more.

Download the whitepaper to learn more.


Learn more about how to save money in the cloud and join CloudCheckr and AWS at the AWS Public Sector Summit June 12-14, 2017 in Washington, DC. Register today!

Announcing USAspending.gov on an Amazon RDS Snapshot

The Digital Accountability and Transparency Act of 2014 (DATA Act) aims to make government agency spending more transparent to citizens by making financial data easily accessible and by establishing common standards for the data all government agencies collect and share on the government website, USAspending.gov.

We are pleased to announce that the USAspending.gov database is now available for anyone to access via Amazon RDS. It includes data on all spending by the federal government, including contracts, grants, loans, employee salaries, and more.

The data is available via a PostgreSQL snapshot, which provides bulk access to the entire USAspending.gov database and is updated nightly. At this time, the database includes all USAspending.gov data for the second quarter of fiscal year 2017; data going back to the year 2000 will be added over the summer. You can learn more about the database and how to access it on the AWS Public Dataset landing page.

Now that this data is available as a public snapshot on Amazon RDS, anyone can get a copy of USAspending.gov’s entire production database for their own use within minutes. Researchers and businesses who want to work with real data about US Government spending can quickly combine it with their own data or other data resources.
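Launching your own copy amounts to restoring a new RDS instance from the shared public snapshot. A minimal boto3 sketch follows; the snapshot identifier, region, and instance class are placeholders, and the real snapshot name is listed on the AWS Public Dataset page.

```python
import boto3

rds = boto3.client("rds", region_name="us-east-1")  # region is an assumption

rds.restore_db_instance_from_db_snapshot(
    DBInstanceIdentifier="my-usaspending-copy",
    # Placeholder: use the actual public snapshot identifier from the dataset page.
    DBSnapshotIdentifier="arn:aws:rds:us-east-1:123456789012:snapshot:usaspending-db",
    DBInstanceClass="db.m4.large",
)

# Poll until the new instance is ready to accept PostgreSQL connections.
waiter = rds.get_waiter("db_instance_available")
waiter.wait(DBInstanceIdentifier="my-usaspending-copy")
print("Restored database is available.")
```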

When data is made publicly available on AWS, anyone can analyze any volume of data without needing to download or store it themselves, enabling more innovation, more quickly. They can use this data with the entire suite of AWS data analytics products and easily collaborate with other AWS users.

Learn more about how to launch your copy of the snapshot and how Amazon RDS can be used to share an entire relational database quickly and easily in the blog post here.

Last Week for City on a Cloud Applications: Deadline is May 12

The 2017 City on a Cloud Innovation Challenge will remain open for entries until Friday, May 12, so there is still time to apply!

Through the competition, AWS helps local and regional governments, schools, and districts innovate by simplifying the IT workloads that they struggle with and depend on every day, such as Geographic Information Systems (GIS), Content Management Systems (CMS), Open Data portals, Learning Management Systems (LMS), and more.

Winners are given up to $50,000 in AWS Promotional Credits to support their cloud initiatives and encourage innovation and research.

Here are some examples of how cities are leveraging the AWS Cloud to solve challenges facing their communities:

  • 311 mobile applications allow citizens better access to vital information and open lines of communication.
  • Open Data initiatives are allowing researchers to connect disparate datasets, drawing new insights with pre-existing data.
  • Justice and Public Safety strategies like wearable cameras and real-time information sharing are keeping first responders as informed as possible.

Looking for ideas before you submit your application? Get advice from a previous City on a Cloud Innovation Challenge Winner, Appriss Safety.

Apply today!