AWS HPC Blog
Tag: Research Computing
The benefits of computational chemistry for the circular economy
In this blog post, we’ll explore the benefits of computational chemistry for the circular economy, explain how it can help reduce waste, and describe its potential for creating innovative new materials.
Install optimized software with Spack configs for AWS ParallelCluster
Today, we’re announcing the availability of Spack configs for AWS ParallelCluster. You can use these configurations to install optimized HPC applications quickly and easily on your AWS-powered HPC clusters.
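The post walks through the setup in detail; as a rough sketch of the general pattern (not the exact steps or repository layout the post uses), installing an optimized HPC code with Spack on a cluster head node looks something like this, with the AWS-provided configs placed into one of Spack's configuration scopes:

    # On the cluster head node: get Spack and load its shell integration
    git clone https://github.com/spack/spack.git
    . spack/share/spack/setup-env.sh

    # Drop the ParallelCluster-tuned configs (compilers, MPI, packages)
    # into a Spack configuration scope such as ~/.spack/ -- the exact
    # location and contents here are illustrative, not the post's layout.

    # Then build an optimized application, for example:
    spack install gromacs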
Deploying Open OnDemand with AWS ParallelCluster
In this post, we describe an integration of Open OnDemand with AWS ParallelCluster, so administrators can provide web-based access to HPC resources beyond what they have on site, using the AWS Cloud to add new capabilities and extend capacity.
AWS ParallelCluster 3.3.0 now supports On-Demand Capacity Reservations
With AWS ParallelCluster 3.3, you can now easily take advantage of Amazon EC2 On-Demand Capacity Reservations to help ensure your jobs have the capacity they need, when they need it. This post describes the new feature and how you can benefit from it.
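As a hedged illustration of the idea (the post has the authoritative details), a capacity reservation can be referenced from a ParallelCluster 3.3 cluster configuration along these lines; the queue name, instance type, and reservation ID below are placeholders:

    Scheduling:
      Scheduler: slurm
      SlurmQueues:
        - Name: compute
          ComputeResources:
            - Name: odcr-nodes
              InstanceType: c6i.32xlarge        # placeholder instance type
              MinCount: 0
              MaxCount: 16
              CapacityReservationTarget:
                CapacityReservationId: cr-0123456789abcdef0   # placeholder ODCR ID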
Building deep learning models for geoscience using MATLAB and NVIDIA GPUs on Amazon EC2 (Part 2 of 2)
This is the second of a two-part post. Part 1 discussed the workflow for developing AI models using MATLAB for seismic interpretation. Today, we discuss the compute resources from AWS and NVIDIA we used to develop the models.
Building deep learning models for geoscience using MATLAB and NVIDIA GPUs on Amazon EC2 (Part 1 of 2)
In this blog post, we discuss how geoscientists can use shallow RNN-based algorithms with MATLAB to automatically recognize distinct geologic features in seismic images. We cover the workflow for developing the AI models using MATLAB for seismic interpretation. A second post will introduce the compute resources from AWS and NVIDIA used to develop the models.
Bridging research and HPC to tackle grand challenges
Today we announced the AWS Impact Computing Project at the Harvard Data Science Initiative (HDSI) to identify potential solutions that can improve the lives of humans, other species, and natural ecosystems. Deb Goldfarb describes its goals and our joint vision.
Analyzing Genomic Data using Amazon Genomics CLI and Amazon SageMaker
In this blog post, we demonstrate how to use the Amazon Genomics CLI and Amazon SageMaker to analyze large-scale exome sequences and derive meaningful insights. We use the bioinformatics workflow manager Nextflow, its open-source library of pipelines, NF-Core, and AWS Batch.
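For context, a generic Nextflow-on-AWS-Batch setup (not necessarily the configuration the post or the Amazon Genomics CLI generates for you) looks roughly like this; the queue name, region, and S3 bucket are placeholders:

    // nextflow.config -- minimal sketch for running Nextflow on AWS Batch
    process.executor = 'awsbatch'
    process.queue    = 'my-batch-queue'            // placeholder AWS Batch job queue
    aws.region       = 'us-east-1'                 // placeholder region
    workDir          = 's3://my-bucket/nf-work'    // placeholder S3 work directory

With a configuration like this in place, an NF-Core pipeline could then be launched with nextflow run nf-core/<pipeline>, where the pipeline name is whatever suits your analysis.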
How Thermo Fisher Scientific Accelerated Cryo-EM using AWS ParallelCluster
In this blog post, we’ll walk you through the process of building a successful Cryo-EM benchmarking pilot using AWS ParallelCluster, Amazon FSx for Lustre, and cryoSPARC (from Structura Biotechnology) and explain some of our design decisions along the way.
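As a rough sketch of one piece of such a setup (not the post's exact configuration), an Amazon FSx for Lustre scratch filesystem can be attached to an AWS ParallelCluster cluster with a SharedStorage section like this; the name, mount path, and capacity are placeholders:

    SharedStorage:
      - Name: cryoem-scratch              # placeholder name
        StorageType: FsxLustre
        MountDir: /fsx                    # placeholder mount point
        FsxLustreSettings:
          DeploymentType: SCRATCH_2
          StorageCapacity: 4800           # GiB; placeholder capacity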
Call for participation: PRACE Winter School
The Inter University Computing Centre (IUCC) in Israel and AWS have joined forces to train researchers and Research Software Engineers (RSEs) in the use of AWS for High Performance Computing (HPC) at the PRACE Winter School, 7-9 December 2021, and we’re calling for interested groups to sign up and join us.