AWS Public Sector Blog

Category: Technical How-to

How KHUH built a long-term storage solution for medical image data with AWS

King Hamad University Hospital (KHUH) and Bahrain Oncology Center is a 600-bed hospital in Bahrain. Over the years, KHUH faced constraints from the exponential growth of its on-premises storage needs, particularly for the medical images stored by its picture archiving and communication system (PACS). KHUH turned to AWS to develop a cost- and time-effective long-term storage solution that reduced storage costs by 40% without requiring changes to its existing PACS.

Getting started with healthcare data lakes: Using microservices

Data lakes can help hospitals and healthcare organizations turn data into insights and maintain business continuity, while preserving patient privacy. This post is part of a larger series about setting up a healthcare data lake. In it, I detail how the solution has evolved at a foundational level over the series to include microservices, describe the design decisions I’ve made, and cover the additional features used. You can access code samples for this solution through a GitHub repo for reference.

How public sector agencies can identify improper payments with machine learning

To mitigate synthetic fraud, government agencies should consider complementing their rules-based improper payment detection systems with machine learning (ML) techniques. By using ML on a large number of disparate but related data sources, including social media, agencies can formulate a more comprehensive risk score for each individual or transaction to help investigators identify improper payments efficiently. In this blog post, we provide a foundational reference architecture for an ML-powered improper payment detection solution using AWS ML services.
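The idea of fusing disparate signals into one comprehensive risk score can be illustrated with a minimal sketch. The feature names, weights, and logistic scoring function below are hypothetical placeholders for illustration only; the post's actual reference architecture uses AWS ML services trained on real data.

```python
import math

# Hypothetical fraud signals and weights -- illustrative only; a real
# system would learn these from labeled data with an ML service.
WEIGHTS = {"identity_mismatch": 2.0, "new_account": 1.2, "tx_velocity": 0.8}
BIAS = -3.0  # keeps the baseline score low when no signals fire

def risk_score(features):
    """Combine per-source signals into a 0-1 risk score (logistic model)."""
    z = BIAS + sum(WEIGHTS[name] * value for name, value in features.items())
    return 1.0 / (1.0 + math.exp(-z))

low = risk_score({"identity_mismatch": 0, "new_account": 0, "tx_velocity": 0})
high = risk_score({"identity_mismatch": 1, "new_account": 1, "tx_velocity": 1})
print(round(low, 3), round(high, 3))  # 0.047 0.731
```

Investigators could then triage cases above a chosen score threshold, with the rules-based system still handling known fraud patterns.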

Enhance operational agility and decision advantage with AWS Snowball Edge

In a data-dependent world, success belongs to the side with decision advantage: the ability to acquire data, make sense of a complex and adaptive environment, and act smarter and faster than the competition. Understanding global environments requires more than just more data: it requires live two- and three-dimensional maps, new support tools, improved processes, seamless connectivity, and better collaboration that can scale to the needs of the environment. This blog post explores how deploying AWS Snowball Edge devices at the edge can address the challenges of big data and accelerate time to insight with machine learning.

Move data in and out of AWS GovCloud (US) with Amazon S3

Increasingly, AWS customers are operating workloads both in AWS GovCloud (US) and standard AWS Regions. Dependencies between workloads, changing data controls, or enrichment of data across multiple data levels are examples of business needs that may require moving data in and out of AWS GovCloud (US). In this blog post, I explain how to move data between Amazon Simple Storage Service (Amazon S3) buckets in the AWS GovCloud (US) and standard partitions.
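At its simplest, a cross-partition move reads an object with a client in one partition and writes it with a client in the other, since a single set of credentials cannot span both partitions and S3's server-side CopyObject cannot cross them. The sketch below is a minimal illustration under that assumption; bucket and key names are placeholders, and the clients would normally come from per-partition boto3 sessions as noted in the comments.

```python
# Minimal cross-partition copy sketch. Because AWS GovCloud (US) and the
# standard partition use separate credentials, we read with a client for
# the source partition and write with a client for the destination.
# The clients would typically be created with boto3, for example:
#   src_s3 = boto3.Session(profile_name="govcloud").client("s3")
#   dst_s3 = boto3.Session(profile_name="standard").client("s3")

def copy_across_partitions(src_s3, dst_s3, src_bucket, dst_bucket, key):
    """Copy one object between partitions; returns bytes transferred."""
    obj = src_s3.get_object(Bucket=src_bucket, Key=key)
    body = obj["Body"].read()  # large objects would need multipart upload
    dst_s3.put_object(Bucket=dst_bucket, Key=key, Body=body)
    return len(body)
```

Note that the data transits whatever host runs this code, which is why the post's managed approaches matter for large datasets.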

Move file data in and out of AWS GovCloud (US) with AWS DataSync

As public sector customers increasingly need to move data between the AWS GovCloud (US) partition and the standard partition, they need tools that lower their operational burden. In this blog post, I walk through how to use AWS DataSync to move data on network file system (NFS) shares between the two partitions.
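As a rough sketch of the moving parts, DataSync needs an NFS location on each side and a task linking them. The helpers below only assemble request parameters for boto3's `create_location_nfs` and `create_task` calls; the hostname, path, and ARNs are placeholders, and no API call is made here. Agent deployment and task execution are covered in the walkthrough itself.

```python
# Assemble request parameters for AWS DataSync's CreateLocationNfs and
# CreateTask APIs (as exposed by boto3's datasync client). All values
# shown are placeholders.

def nfs_location_params(server_hostname, subdirectory, agent_arns):
    return {
        "ServerHostname": server_hostname,
        "Subdirectory": subdirectory,
        "OnPremConfig": {"AgentArns": agent_arns},
    }

def task_params(source_arn, destination_arn, name):
    return {
        "SourceLocationArn": source_arn,
        "DestinationLocationArn": destination_arn,
        "Name": name,
    }

src = nfs_location_params(
    "nfs.example.internal", "/exports/data",
    ["arn:aws-us-gov:datasync:us-gov-west-1:111111111111:agent/agent-0abc"],
)
# A real run would then call, with a boto3 datasync client:
#   datasync.create_location_nfs(**src)
#   datasync.create_task(**task_params(src_arn, dst_arn, "gov-to-standard"))
```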

Virtualizing the satellite ground segment with AWS

As the number of spacecraft and spacecraft missions grows rapidly, moving aerospace and satellite operations to the cloud through digital transformation, including virtualizing the ground segment, is key to economic viability. In this blog post, we explain the benefits of virtualizing the ground segment in the cloud and present the core components of a reference architecture that uses AWS to support several stages of a comprehensive ground segment virtualization. Then, working from this model, we present additional reference architectures for virtualizing the ground segment that can accommodate various requirements and usage scenarios.

Create a common operating picture for search and rescue at the edge with AWS

In a recent disaster response field testing exercise (FTX), the AWS Global Social Impact Solutions (GSI) team developed a prototype cloud architecture and tested it in a search and rescue (SAR) scenario simulating a missing responder crisis. This blog post walks through the SAR simulation and its results, and provides an overview of the AWS services and technical architecture components the GSI team used to provide a hybrid edge/cloud common operating picture (COP) solution that helped locate the missing team member in the simulated scenario within 20 minutes.

Create a secure and fast DevSecOps pipeline with CircleCI

In this blog post, we explain how government agencies can accelerate their development workflows while maintaining strict application and operational security using the principles of continuous integration and continuous delivery (CI/CD) and DevSecOps. We also provide a solution that walks you through quickly setting up your own DevSecOps pipeline, incorporating AWS and third-party security tools for a fast, flexible, and secure software delivery process.

How to deliver performant GIS desktop applications with Amazon AppStream 2.0

Geospatial datasets are increasingly large, reaching terabyte and even petabyte scale, which can cause many challenges for geospatial analysts and educators, but Amazon AppStream 2.0 can provide some solutions. In this blog post, we walk through how to deploy QGIS, a no-cost, open-source geographic information system (GIS) application used by geospatial analysts, in Amazon AppStream 2.0. We also load an example dataset to demonstrate how desktop GIS application users can access large, cloud-hosted geospatial datasets with high performance by keeping the data and compute components together in the cloud and streaming the desktop application instead of downloading the data itself.