AWS Public Sector Blog

Category: Artificial Intelligence

Extracting, analyzing, and interpreting information from Medicaid forms with AWS

What if paper forms could be processed at the same speed as digital forms? What if their contents could be automatically entered in the same database as the digital forms? Medicaid agencies could analyze data in near real time and drive actionable insights on a single dashboard. By using artificial intelligence (AI) and machine learning (ML) services from AWS, Medicaid agencies can create this streamlined solution. In this walkthrough, learn how to extract, analyze, and interpret relevant information from paper-based Medicaid claims forms.
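The extraction step this walkthrough describes typically relies on Amazon Textract's form analysis, which returns linked KEY and VALUE blocks rather than plain text. The sketch below is illustrative, not the walkthrough's exact code: the `sample` response is hand-built to mimic the AnalyzeDocument output shape, and in practice it would come from a boto3 `analyze_document` call with `FeatureTypes=["FORMS"]`.

```python
# Hypothetical sketch: pairing KEY and VALUE blocks from an Amazon Textract
# AnalyzeDocument response to turn a scanned form into structured fields.

def extract_form_fields(response):
    """Map each KEY block's text to the text of its linked VALUE block."""
    blocks = {b["Id"]: b for b in response["Blocks"]}

    def text_of(block):
        # Concatenate the WORD blocks that a KEY or VALUE block points to.
        words = []
        for rel in block.get("Relationships", []):
            if rel["Type"] == "CHILD":
                words += [blocks[i]["Text"] for i in rel["Ids"]]
        return " ".join(words)

    fields = {}
    for block in response["Blocks"]:
        if block["BlockType"] == "KEY_VALUE_SET" and "KEY" in block.get("EntityTypes", []):
            for rel in block.get("Relationships", []):
                if rel["Type"] == "VALUE":
                    for value_id in rel["Ids"]:
                        fields[text_of(block)] = text_of(blocks[value_id])
    return fields

# Illustrative mini-response with one key-value pair from a claims form.
sample = {"Blocks": [
    {"Id": "k1", "BlockType": "KEY_VALUE_SET", "EntityTypes": ["KEY"],
     "Relationships": [{"Type": "VALUE", "Ids": ["v1"]},
                       {"Type": "CHILD", "Ids": ["w1", "w2"]}]},
    {"Id": "v1", "BlockType": "KEY_VALUE_SET", "EntityTypes": ["VALUE"],
     "Relationships": [{"Type": "CHILD", "Ids": ["w3"]}]},
    {"Id": "w1", "BlockType": "WORD", "Text": "Member"},
    {"Id": "w2", "BlockType": "WORD", "Text": "ID"},
    {"Id": "w3", "BlockType": "WORD", "Text": "12345"},
]}

print(extract_form_fields(sample))  # {'Member ID': '12345'}
```

Once fields are structured like this, writing them to the same database as digitally submitted forms is a straightforward insert, which is what makes the near real-time dashboard possible.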

How one council used machine learning to cut translation costs by 99.96%

Swindon Borough Council serves a multicultural community of 230,000 citizens in the south of England. The council continuously reviews emerging and evolving technologies, seeking to use them to reimagine and improve services, lower costs, and enhance efficiency. The success of a recent translation project that harnesses the power of artificial intelligence (AI) and cloud services is a prime example of this approach.

Sarah Storelli and Dominic Delmolino from the AWS Behind the Cloud video series.

Advancing generative AI and technology for good: Insights from AWS executive Dominic Delmolino

In the second episode of the AWS Behind the Cloud series, Dominic Delmolino, vice president of field technology and engineering for AWS worldwide public sector (WWPS), speaks on the big opportunities for generative AI, the importance of customer obsession in building long-term solutions for the public sector, and why it’s important, above all else, to make sure technology is used for good.

36 new or updated datasets on the Registry of Open Data: AI analysis-ready datasets and more

This quarter, AWS released 36 new or updated datasets. As July 16 is Artificial Intelligence (AI) Appreciation Day, the AWS Open Data team is highlighting three unique datasets that are analysis-ready for AI. What will you build with these datasets?

A framework to mitigate bias and improve outcomes in the new age of AI

Artificial intelligence (AI) and machine learning (ML) technologies are transforming many industries. But although public sector organizations are realizing the benefits of these technologies, remaining challenges, including bias and a lack of transparency, limit wider adoption and prevent AI and ML from reaching their full potential. In this post, learn a high-level framework for how AWS can help you address these challenges and provide better outcomes for constituents.

Largest metastatic cancer dataset now available at no cost to researchers worldwide

The NYUMets team, led by Dr. Eric Oermann at NYU Langone Medical Center, is collaborating with AWS Open Data, NVIDIA, and the Medical Open Network for Artificial Intelligence (MONAI) to develop an open science approach that supports researchers in helping as many patients with metastatic cancer as possible. With support from the AWS Open Data Sponsorship Program, the NYUMets: Brain dataset is now openly available at no cost to researchers around the world.

Optimizing your nonprofit mission impact with AWS Glue and Amazon Redshift ML

Nonprofit organizations focus on a specific mission to impact their members, communities, and the world. In the nonprofit space, where resources are limited, it’s important to optimize the impact of your efforts. Learn how you can apply machine learning with Amazon Redshift ML to public datasets to support data-driven decisions that optimize your impact. This walkthrough focuses on using open data to support food security programming, but the solution can be applied to many other initiatives in the nonprofit space.

Decrease geospatial query latency from minutes to seconds using Zarr on Amazon S3

Geospatial data, including many climate and weather datasets, are often released by government and nonprofit organizations in compressed file formats such as the Network Common Data Form (NetCDF) or GRIdded Binary (GRIB). As the complexity and size of geospatial datasets continue to grow, it is more time- and cost-efficient to leave the files in one place, virtually query the data, and download only the subset that is needed locally. Unlike legacy file formats, the cloud-native Zarr format is designed for virtual and efficient access to compressed chunks of data saved in a central location such as Amazon S3. In this walkthrough, learn how to convert NetCDF datasets to Zarr using an Amazon SageMaker notebook and an AWS Fargate cluster and query the resulting Zarr store, reducing the time required for time series queries from minutes to seconds.

Using machine learning to customize your nonprofit’s direct mailings

Many organizations send direct mailings to support fundraising or to assist with other efforts that help further the organization’s mission. Direct mailing workflows range from a Microsoft Word mail merge to a third-party mailing provider. By using the power of the cloud, organizations can take advantage of capabilities that might otherwise be out of reach, like personalization at scale. In this walkthrough, learn how organizations can use machine learning (ML) personalization techniques with AWS to help drive better outcomes from their direct mailing efforts.

Supporting health equity with data insights and visualizations using AWS

In this guest post, Ajay K. Gupta, co-founder and chief executive officer (CEO) of HSR.health, explains how healthcare technology (HealthTech) nonprofit HSR.health uses geospatial artificial intelligence and AWS to develop solutions that support improvements in healthcare and health equity around the world.