AWS Public Sector Blog
Category: Amazon Simple Storage Service (S3)
UC Davis CWEE accelerates water conservation research with secure, compliant data storage on AWS
To solve some of the most pressing water and energy challenges, scientists and engineers need access to robust, reliable data that is often sensitive and protected. Data providers, researchers, and host institutions need to adhere to strict requirements for protecting and securing this data. The Center for Water-Energy Efficiency (CWEE) at the University of California, Davis (UC Davis) used AWS to create a centralized, secure data repository that streamlines data sharing.
How to put a supercomputer in the hands of every scientist
The AWS Cloud gives you access to virtually unlimited infrastructure suitable for high performance computing (HPC) workloads. With HPC in the cloud, you can eliminate long queues and wait times, so you don't have to trade availability for performance. In this technical guide, learn how to use AWS ParallelCluster to set up and manage an HPC cluster in a flexible, elastic, and repeatable way.
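As a sketch of what ParallelCluster manages for you, here is a minimal cluster configuration for AWS ParallelCluster 3 with the Slurm scheduler. The instance types, subnet ID, and key name are placeholders, not values from the guide; substitute your own.

```yaml
Region: us-east-1
Image:
  Os: alinux2
HeadNode:
  InstanceType: t3.medium                  # placeholder
  Networking:
    SubnetId: subnet-0123456789abcdef0     # placeholder
  Ssh:
    KeyName: my-hpc-key                    # placeholder
Scheduling:
  Scheduler: slurm
  SlurmQueues:
    - Name: compute
      ComputeResources:
        - Name: compute-nodes
          InstanceType: c5n.18xlarge       # placeholder
          MinCount: 0                      # scale to zero when idle
          MaxCount: 16                     # elastic upper bound
      Networking:
        SubnetIds:
          - subnet-0123456789abcdef0       # placeholder
```

With `MinCount: 0`, compute nodes launch only when jobs are queued and terminate when idle, which is what removes the fixed-capacity queue of an on-premises cluster; `pcluster create-cluster --cluster-name hpc --cluster-configuration config.yaml` provisions it.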
How the cloud can help educational institutions with grading, assessments, and admissions
During the COVID-19 pandemic, educational institutions that operated on an in-person model shifted many of their traditionally in-person operations and activities—including grading, assessments and testing, and admissions—to a virtual format, often for the first time. Educational technology (EdTech) companies around the world used the cloud to quickly create and scale solutions for these academic institutions while maintaining a consistent, smooth student experience.
How to build secure data lakes to accelerate your nonprofit’s mission
With a data lake, nonprofits can put their data to work, shaping strategy and informing decisions that produce value and impact. In this post, learn how to build a data lake, ingest data from a PostgreSQL server, grant users permission to consume the data with AWS Lake Formation, and access and analyze the data using Amazon Athena.
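To make the last two steps concrete, here is a minimal Python sketch that builds the request parameters for a Lake Formation SELECT grant and an Athena query. The database, table, role ARN, and results bucket are illustrative placeholders, not names from the post; the boto3 calls that would apply them need AWS credentials, so they are shown commented out.

```python
def lake_formation_select_grant(principal_arn, database, table):
    """Parameters for lakeformation.grant_permissions(**...):
    give one principal read (SELECT) access to one table."""
    return {
        "Principal": {"DataLakePrincipalIdentifier": principal_arn},
        "Resource": {"Table": {"DatabaseName": database, "Name": table}},
        "Permissions": ["SELECT"],
    }

def athena_query_request(database, sql, results_s3_uri):
    """Parameters for athena.start_query_execution(**...)."""
    return {
        "QueryString": sql,
        "QueryExecutionContext": {"Database": database},
        "ResultConfiguration": {"OutputLocation": results_s3_uri},
    }

# Illustrative placeholders -- substitute your own names.
grant = lake_formation_select_grant(
    "arn:aws:iam::111122223333:role/AnalystRole", "nonprofit_db", "donations")
query = athena_query_request(
    "nonprofit_db", "SELECT COUNT(*) FROM donations", "s3://my-athena-results/")

# To apply (requires boto3 and AWS credentials):
# import boto3
# boto3.client("lakeformation").grant_permissions(**grant)
# boto3.client("athena").start_query_execution(**query)
```

Separating the grant from the query mirrors the Lake Formation model: permissions are managed centrally once, and analysts then query through Athena without needing direct S3 access.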
Library and Archives Canada helps better preserve Canadian history by embracing the cloud
Canada’s history is rich, but not without its scars. The need for documentation and analysis has never been greater. Library and Archives Canada (LAC) is the custodian of Canada’s distant past and recent history—and Amazon Web Services (AWS) is helping expand its reach.
Edunation scales up to 32 times activity by boosting infrastructure with AWS
Using AWS, Edunation seamlessly responded to increasing demand during the COVID-19 pandemic. Edunation collaborates with top educational institutions across the Middle East and North Africa (MENA) region and provides all-in-one learning and school management solutions. Today, the EdTech is on a mission to push learning management systems (LMS) beyond virtual classrooms.
Beth Israel Lahey Health builds COVID-19 vaccine deployment system in two weeks with AWS
13 hospitals, over 100 primary care and ambulatory sites, over 30,000 staff members — and only two weeks to get them all prioritized and scheduled for vaccination in time for when vaccine doses became available. Discover how Beth Israel Lahey Health built and launched a COVID-19 vaccine deployment solution for healthcare workers in just two weeks with Amazon Web Services.
Automated Earth observation using AWS Ground Station's Amazon S3 data delivery
With AWS Ground Station, you can now deliver data directly into Amazon S3 buckets. This simplifies downlinking because you no longer need to run an Amazon EC2 receiver instance; it also reduces cost and makes it easier to build automated processing pipelines like the one shown in this blog. An automated Earth observation (EO) pipeline reduces your staff's operating burden: after you schedule a contact, everything is handled automatically, and you receive a notification when the processed data is available. Read on to learn how to create an automated EO pipeline that receives and processes data from the NOAA-20 (JPSS-1) satellite using this new AWS Ground Station feature.
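As one hedged sketch of the automation glue, the function below shows the shape of an AWS Lambda handler that an S3 event notification could invoke when Ground Station delivers downlinked data to the bucket. The handler name, bucket name, and key are assumptions for illustration, not the post's exact implementation.

```python
import urllib.parse

def handle_downlink_event(event, context=None):
    """Hypothetical Lambda entry point for S3 "ObjectCreated" notifications
    fired when AWS Ground Station delivers raw downlink data to a bucket.
    Returns the (bucket, key) pairs so processing can be started per object."""
    delivered = []
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        # S3 URL-encodes object keys in event payloads (spaces become '+').
        key = urllib.parse.unquote_plus(record["s3"]["object"]["key"])
        delivered.append((bucket, key))
        # In a real pipeline you would start processing here, for example
        # an AWS Batch job or Step Functions execution per delivered object.
    return delivered

# Minimal example event of the shape S3 sends to Lambda (placeholder names):
sample_event = {
    "Records": [
        {"s3": {"bucket": {"name": "gs-downlink-bucket"},
                "object": {"key": "data/NOAA-20/pass-0001.bin"}}}
    ]
}
```

This event-driven shape is what makes the pipeline hands-off: once the contact is scheduled, each delivered object triggers its own processing run with no operator action.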
How to meet business data resiliency with Amazon S3 cross-Region replication
Even though Amazon S3 provides regional data resiliency, customers often have compliance and business requirements to replicate their data to a second Region hundreds (or even thousands) of miles away from their primary location. Amazon S3 replication provides an automatic mechanism for making identical copies of your objects in a destination Region of your choice, copying objects across S3 buckets asynchronously. Learn how to configure S3 Cross-Region Replication with the S3 Replication Time Control (RTC) feature, then walk through configuring event notifications for S3 replication events and Amazon CloudWatch alarms for the replication metrics.
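As a concrete sketch, the dictionary below has the shape that `s3.put_bucket_replication` expects for a rule with RTC and replication metrics enabled; the role ARN and bucket names are placeholders. RTC's supported threshold is 15 minutes, which is why that value appears twice.

```python
def replication_config(role_arn, destination_bucket_arn):
    """Parameters for s3.put_bucket_replication(ReplicationConfiguration=...)
    with S3 Replication Time Control (RTC) and metrics enabled."""
    return {
        "Role": role_arn,  # IAM role S3 assumes to replicate objects
        "Rules": [{
            "ID": "crr-with-rtc",
            "Status": "Enabled",
            "Priority": 1,
            "Filter": {},  # empty filter: replicate the whole bucket
            "DeleteMarkerReplication": {"Status": "Disabled"},
            "Destination": {
                "Bucket": destination_bucket_arn,
                # RTC: replicate most objects within 15 minutes and
                # emit the replication metrics the CloudWatch alarms use.
                "ReplicationTime": {"Status": "Enabled",
                                    "Time": {"Minutes": 15}},
                "Metrics": {"Status": "Enabled",
                            "EventThreshold": {"Minutes": 15}},
            },
        }],
    }

# Placeholders -- substitute your own role and destination bucket.
config = replication_config(
    "arn:aws:iam::111122223333:role/s3-replication-role",
    "arn:aws:s3:::my-dr-bucket")

# To apply (requires boto3, credentials, and versioning on both buckets):
# import boto3
# boto3.client("s3").put_bucket_replication(
#     Bucket="my-source-bucket", ReplicationConfiguration=config)
```

Enabling `Metrics` alongside `ReplicationTime` is what surfaces the replication latency and pending-bytes numbers that the CloudWatch alarms in the walkthrough are built on.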
Serverless GIS with Amazon S3, open data, and ArcGIS
If you host an ArcGIS web app today, you are probably running it on a Windows or Linux server using traditional web server software such as IIS or Apache. With the static website hosting capability of Amazon S3, you can remove the need to run those servers, along with the maintenance, management, and monitoring overhead that comes with them. Serverless services like Amazon S3 scale automatically, and getting up and running can be as simple as copying over your website assets. This blog focuses on web app implementations using the ArcGIS API for JavaScript (other ArcGIS web apps have additional considerations).
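To illustrate how little configuration this takes, the sketch below builds the website configuration that `s3.put_bucket_website` accepts. The bucket and document names are placeholders; in practice you would also attach a bucket policy allowing public reads, or front the bucket with Amazon CloudFront.

```python
def website_config(index_doc="index.html", error_doc="error.html"):
    """Parameters for s3.put_bucket_website(WebsiteConfiguration=...):
    serve index_doc at the root and error_doc for missing keys."""
    return {
        "IndexDocument": {"Suffix": index_doc},
        "ErrorDocument": {"Key": error_doc},
    }

config = website_config()

# To apply and upload the ArcGIS web app assets (placeholder bucket name):
# import boto3
# boto3.client("s3").put_bucket_website(
#     Bucket="my-gis-app-bucket", WebsiteConfiguration=config)
# Then copy the built app over, e.g.:
#   aws s3 sync ./dist s3://my-gis-app-bucket/
```

Because the ArcGIS API for JavaScript runs entirely in the browser, static hosting like this is sufficient: there is no server-side code for IIS or Apache to execute.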