AWS Public Sector Blog
Tag: customer story
How healthcare organizations use generative AI on AWS to turn data into better patient outcomes
Healthcare organizations invest heavily in technology and data. Generative artificial intelligence (AI) empowers them to build on those investments in robust data foundations, improve the patient experience through innovative interactive technologies, boost productivity to help address workforce challenges, and surface new insights that accelerate research. This post highlights three examples of how generative AI on Amazon Web Services (AWS) is being used in healthcare and discusses ways to apply this technology in a responsible, safe way.
ASTERRA helps build a more sustainable Earth by identifying and mitigating ‘lost water’ using AWS
ASTERRA, an Israel-based geospatial and Earth observation company, uses Amazon Web Services (AWS) to help water utilities and other industries identify and mitigate pipeline leaks. ASTERRA uses AWS to derive intelligence and insights from beneath the surface of its customers' largest installations, bypassing the need to break ground and dig for leaks. Read this post to learn how AWS has helped ASTERRA overcome traditional on-premises infrastructure limitations and accelerate the development of solutions for anticipating and mitigating failures, saving water and energy and avoiding carbon dioxide (CO2) emissions as a result.
Renaissance Learning improves operations and scalability by migrating to AWS
Renaissance Learning is a global leader in the pre-K–12 education technology (EdTech) space. Renaissance delivers its products to 16 million students across the world, and its digital solutions are used in more than one-third of US schools and in more than 100 countries. As part of its cloud consolidation strategy, Renaissance migrated DnA, its standards-based assessment creation and administration solution, from Google Cloud Platform (GCP) to Amazon Web Services (AWS). After the migration, Renaissance realized performance, scalability, and security improvements, all while reducing cloud spend by 30 percent.
UC Davis Health Cloud Innovation Center, powered by AWS, uses generative AI to fight health misinformation
The University of Pittsburgh, the University of Illinois Urbana-Champaign (UIUC), the University of California Davis Health Cloud Innovation Center (UCDH CIC)—powered by Amazon Web Services (AWS)—and the AWS Digital Innovation (DI) team have built a prototype that uses machine learning (ML) and generative artificial intelligence (AI) to transform the public health communications landscape by giving officials the tools they need to fight medical misinformation, disinformation, and malinformation.
Why Fugaku, Japan’s fastest supercomputer, went virtual on AWS
Japan’s Mount Fuji is famous for its size, yet thanks to the nation’s efforts to make it accessible, even novice hikers with limited time can reach it. Now, the researchers behind Fugaku, one of the world’s fastest supercomputers and the bearer of another name for Mount Fuji, are working to make the supercomputer just as accessible on the Amazon Web Services (AWS) Cloud. Read this post to learn more.
Disaster response and risk management using PNNL’s Aether framework on AWS
The Pacific Northwest National Laboratory (PNNL) developed Aether as a reusable framework for sharing data and analytics with sponsors and stakeholders. Aether is a mature cloud-centered framework designed using Amazon Web Services (AWS) serverless services to provide a cost-effective and reliable environment for a dozen projects currently deployed with the framework. Read this post to learn more about how Aether’s serverless-first approach is enabling disaster response and risk management.
UK Biobank enables medical research worldwide through vast database powered by AWS
UK Biobank, the world’s most comprehensive source of health data used for research, needed a purpose-built data platform with compute and data-storage capabilities that provided analysis tools in a centralized environment and the flexibility to manage increasing quantities of data. This led to the establishment and launch of the secure, cloud-based UK Biobank Research Analysis Platform (RAP), which is hosted on Amazon Web Services (AWS). Read this post to learn more about UK Biobank’s journey to becoming a globally accessible dataset for health researchers.
University of Nebraska-Omaha’s ITD Lab migrates to Amazon Aurora with Babelfish, reducing database costs
The IT for Development (ITD) Lab at the University of Nebraska at Omaha previously migrated its ETeams application to Amazon Web Services (AWS) because it did not want to manage the underlying infrastructure. With a limited staff, the team could focus more on building new functionality in the ETeams application and less on performing routine maintenance tasks. While the application was performing well, the ITD Lab’s director wanted to lower the monthly costs. This blog post explores an alternative to a SQL Server database that is just as performant but at a lower cost.
How Pearson improves its resilience with AWS Fault Injection Service
Chaos engineering, often misunderstood as intentionally breaking the production environment, in fact aligns with the reliability pillar of the Amazon Web Services (AWS) Well-Architected Framework. Its purpose is to methodically simulate real-world disruptions in a controlled manner, spanning service providers, infrastructure, workloads, and individual components. In this blog post, we show how Pearson PLC, an AWS education technology (EdTech) customer, successfully implemented resilient architectures through chaos engineering using AWS Fault Injection Service (FIS).
ASPPH scales data curation for members with a data lake on AWS
The Association of Schools and Programs of Public Health (ASPPH) — a nonprofit association with a vision for improved health and well-being for everyone, everywhere — partnered with Amazon Web Services (AWS) Professional Services (AWS ProServe) to move their curated data to a managed data lake on AWS. In this blog post, we share how ASPPH and AWS designed and built the data lake and the results of moving to a modern, scalable data architecture.