AWS Public Sector Blog

Category: Technical How-to

How to build secure data lakes to accelerate your nonprofit’s mission

With a data lake, nonprofits can use their data to influence strategy and inform decisions that produce value and impact. In this post, learn how to build a data lake, ingest data from a PostgreSQL server, grant users permissions to consume the data with AWS Lake Formation, and access and analyze the data using Amazon Athena.
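Once Lake Formation permissions are in place, querying the lake from code is a short boto3 call. The sketch below is illustrative only: the database name, table name, and results bucket are hypothetical placeholders, not values from the post.

```python
"""Sketch: preview a Lake Formation-governed table with Amazon Athena.

The database and output bucket below are hypothetical examples;
substitute the names from your own data lake.
"""

DATABASE = "nonprofit_datalake"            # hypothetical Glue database
OUTPUT = "s3://example-athena-results/"    # hypothetical results bucket


def build_query(table: str, limit: int = 10) -> str:
    """Build a simple preview query for a data lake table."""
    return f"SELECT * FROM {table} LIMIT {limit}"


def run_query(sql: str) -> str:
    """Start an Athena query and return its execution ID."""
    import boto3  # imported lazily; only needed when a query actually runs

    athena = boto3.client("athena")
    resp = athena.start_query_execution(
        QueryString=sql,
        QueryExecutionContext={"Database": DATABASE},
        ResultConfiguration={"OutputLocation": OUTPUT},
    )
    return resp["QueryExecutionId"]
```

Athena runs the query asynchronously; you would poll `get_query_execution` with the returned ID before fetching results.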

How to accelerate CMMC compliance with the new AWS Compliant Framework

The AWS Compliant Framework is an automated solution designed to help customers reduce the time needed to set up an environment for running secure and scalable workloads while implementing an initial security baseline that meets US federal government standards. The solution was designed to address the requirements for deploying environments compliant with DoD Cybersecurity Maturity Model Certification (CMMC) and the DoD Cloud Computing Security Requirements Guide.

COVID-19 vaccination scheduling: Scaling REDCap with AWS

Vaccine demand brought unprecedented load to the launch of Texas A&M Health’s vaccination sign-up site. The Texas A&M Health team used AWS to develop a solution to reduce outages and errors, and scale REDCap to get vaccines to Texans.

Driving innovation in single-cell analysis on AWS

Computational biology is undergoing a revolution. However, the analysis of single cells is a hard problem to solve. Standard statistical techniques used in genomic analysis fail to capture the complexity present in single-cell datasets. Open Problems in Single-Cell Analysis is a community-driven effort using AWS to drive the development of novel methods that leverage the power of single-cell data.

UT Austin connects students with answers faster using Amazon Connect

The College of Liberal Arts at the University of Texas at Austin wanted to make it simple for students, faculty, and staff to contact support agents. This is how they built and scaled a contact center solution on AWS with Amazon Connect that reduced call wait time, cut costs, and more easily resolved technical issues — all while call volume more than quadrupled.

Automated Earth observation using AWS Ground Station Amazon S3 data delivery

With AWS Ground Station, you can now deliver data directly into Amazon S3 buckets. This simplifies downlinking because you no longer need to run an Amazon EC2 receiver instance. It also reduces costs and simplifies the creation of automated processing pipelines like the one shown in this post. With an automated Earth observation (EO) pipeline, you can reduce your staff's operating burden: after scheduling a contact, everything is handled automatically, and you receive a notification when the processed data is available. Read on to learn how to create an automated EO pipeline that receives and processes data from the NOAA-20 (JPSS-1) satellite, using this new AWS Ground Station feature.
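With data landing directly in S3, a downstream step in the pipeline can simply list the delivered objects. The sketch below is a minimal illustration; the bucket name and key-prefix layout are assumptions for this example, since the actual prefix depends on how your mission profile is configured.

```python
"""Sketch: list raw downlink objects that AWS Ground Station delivered to S3.

The key-prefix scheme here is a hypothetical example; check your mission
profile's dataflow configuration for the real object layout.
"""


def downlink_prefix(satellite: str, contact_id: str) -> str:
    """Compose an example key prefix for one contact's delivered data."""
    return f"data/{satellite}/{contact_id}/"


def list_downlinked(bucket: str, prefix: str) -> list[str]:
    """Return the keys of all objects under the given prefix."""
    import boto3  # imported lazily so the pure helper needs no AWS SDK

    s3 = boto3.client("s3")
    keys: list[str] = []
    paginator = s3.get_paginator("list_objects_v2")
    for page in paginator.paginate(Bucket=bucket, Prefix=prefix):
        keys.extend(obj["Key"] for obj in page.get("Contents", []))
    return keys
```

In a fully automated pipeline, an S3 event notification on the bucket would trigger this processing step instead of polling.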

Using machine learning to help nonprofits with fundraising activities

Nonprofits can leverage the cloud to reduce the burden associated with their fundraising activities. With machine learning (ML), nonprofits can identify the individuals most likely to engage with and donate to their cause. Read on to learn how to put these solutions into action for your nonprofit's fundraising efforts. In this post, discover how to use Amazon Personalize to build an ML model that supports a wide range of personalization experiences — without prior ML experience.
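Once a Personalize campaign is deployed, retrieving per-donor recommendations is a single runtime call. The sketch below assumes a trained campaign already exists; the ARN and user ID are placeholders, not values from the post.

```python
"""Sketch: fetch personalized recommendations from a deployed
Amazon Personalize campaign. The campaign ARN is a placeholder."""


def build_args(campaign_arn: str, user_id: str, num_results: int = 5) -> dict:
    """Assemble the request parameters for get_recommendations."""
    return {
        "campaignArn": campaign_arn,
        "userId": user_id,
        "numResults": num_results,
    }


def recommend(campaign_arn: str, user_id: str) -> list[str]:
    """Return recommended item IDs for one user."""
    import boto3  # imported lazily; only needed for the live call

    runtime = boto3.client("personalize-runtime")
    resp = runtime.get_recommendations(**build_args(campaign_arn, user_id))
    return [item["itemId"] for item in resp["itemList"]]
```

For a fundraising use case, the "items" could be campaigns or appeals, and the returned ranking would drive which appeal each supporter sees first.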

How to manage Amazon SageMaker code with AWS CodeCommit

To help protect investments in ML, government organizations can securely store ML source code. Storing Amazon SageMaker Studio code in an AWS CodeCommit repository lets you keep it as standalone, versioned files to reuse in the future. SageMaker Studio provides a single, web-based visual interface where you can perform all the ML development steps required to prepare data and build, train, and deploy models. Read on to learn the steps to configure a Git-based repository on CodeCommit to manage ML code developed with SageMaker.
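Creating the CodeCommit repository itself can be scripted before you wire it into SageMaker Studio. The sketch below is a minimal illustration; the repository name is a placeholder, and the `codecommit::` clone URL format shown in the helper is the one used by the git-remote-codecommit tool.

```python
"""Sketch: create an AWS CodeCommit repository for SageMaker Studio code.

The repository name below is a hypothetical example.
"""


def grc_url(region: str, repo_name: str) -> str:
    """Clone URL in git-remote-codecommit (GRC) format."""
    return f"codecommit::{region}://{repo_name}"


def create_repo(name: str, description: str = "") -> str:
    """Create the repository and return its HTTPS clone URL."""
    import boto3  # imported lazily; only needed for the live call

    codecommit = boto3.client("codecommit")
    resp = codecommit.create_repository(
        repositoryName=name,
        repositoryDescription=description,
    )
    return resp["repositoryMetadata"]["cloneUrlHttp"]
```

From a SageMaker Studio terminal you would then `git clone` the repository (for example via its GRC URL) and commit notebooks and scripts as you work.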

Accelerating genome assembly with AWS Graviton2

One of the biggest scientific achievements of the twenty-first century was the completion of the Human Genome Project and the publication of a draft human genome. The project took over 13 years to complete and remains one of the largest public-private international collaborations ever. Subsequent advances in sequencing technologies, computational hardware, and novel algorithms have reduced the time it takes to produce a human genome assembly to only a few days, at a fraction of the cost. This has made using the human genome draft for precision and personalized medicine more achievable. In this blog, we demonstrate how to do a genome assembly in the cloud in a cost-efficient manner using Arm-based AWS Graviton2 instances.

Purpose-built databases: The model for building applications in the cloud

The era of the cloud has accelerated the push to microservices as organizations adopt new, distributed models for building applications to drive agility, innovation, and efficiency. The AWS portfolio of purpose-built databases can help with this movement: it is a broad and deep set of databases that support diverse data models and let customers build data-driven, highly scalable, distributed applications. This allows you to pick the best database for a specific problem and break away from restrictive commercial databases to focus on building applications that meet the needs of your organization.