AWS for Industries

Tag: hpc

Optimizing HPC deployments with EC2 Fleet and IBM Spectrum LSF

High performance computing (HPC) workloads are becoming increasingly complex with the advent of big data, advanced-node electronic design automation (EDA) for chip design, and high-precision verification. Enterprises are adopting Amazon Web Services (AWS) to meet the constantly growing compute demands of HPC. The Worldwide HPC in the Cloud Forecast 2020–2026 from Hyperion Research (June 2022) […]

Recap of AWS re:Invent 2022 for the Automotive Industry

At AWS re:Invent 2022, held in Las Vegas, Nevada, from November 28 to December 2, AWS made several announcements relevant to the automotive industry. This blog post summarizes the services and features from those announcements that matter most to automotive customers. We will highlight AWS automotive and manufacturing industry […]

Evaluating SARS-CoV-2 binding affinities using Yasara simulations on AWS

Blog guest authored by Dr. Vedat Durmaz of Innophore. In 2021, Innophore, a structural bioinformatics startup focused on computational enzyme and drug discovery, started the virus.watch project in cooperation with the AWS Diagnostic Development Initiative. The principal goal of this project is to implement a monitoring and evaluation system for emerging drug and […]

Economics of EDA on AWS: License Cost Optimization

Electronic Design Automation (EDA) workloads have traditionally run on-premises on a mix of current- and older-generation compute servers. The performance penalty of running EDA on older-generation hardware is often neglected in discussions and Total Cost of Ownership (TCO) models. With EDA license costs greatly exceeding IT spend in silicon […]

On-demand seismic processing on AWS using GeoTomo’s technology

Seismic methods are among the most common and effective ways to image the subsurface and to delineate and characterize oil and gas reservoirs. Seismic data are acquired in the field by deploying a seismic source (vibrator) that radiates elastic waves into the subsurface. The waves travel through the formations and are reflected back due to the variation […]

Using cloud-based, data-informed, power system models to engineer utility reliability

Grid simulations power various applications across the utility engineering value chain, from planning to operations. Driven by decarbonization, decentralization, and digitization goals, utilities need these model-based simulations to be more detailed, more accurate, and faster than ever, while leveraging an increasing number of high-resolution grid measurements. This blog introduces some of the challenges associated with data-informed, […]

Building Elastic HPC Clusters for EDA with Altair on AWS

Chip design is not only compute and memory intensive; it also requires varying amounts of resources in different Integrated Circuit (IC) design phases: some frontend tools are single-threaded and CPU-bound, while backend tools rely on high-performance storage and large memory. Fixed-size on-premises compute farms result in jobs waiting in queue for […]

BayerCLAW – Open-Source, Serverless Orchestrator for Scientific Workflows on AWS

Guest blog authored by Jack Tabaska and Ian Davis from the Bayer Crop Science team. At Bayer Crop Science we are applying modern genomic and data science methods to the challenges of global food production. Our research routinely produces enormous volumes of raw data that must be processed quickly and cost-effectively. Automated analysis pipelines (also […]

Deploying multi-physics simulations for biopharma process development on AWS

This blog was co-authored by Fabrice Schlegel, Senior Manager of Data Sciences at Amgen; Joao Alberto de Faria, Senior Associate Software Engineer at Amgen; Ammar Latif, Senior Solutions Architect at AWS; and Pierre-Yves Aquilanti, Principal HPC Specialist Solutions Architect at AWS. Amgen relies on computational modeling to gain insight into its biopharma processes. Modeling improves product designs through […]

Running GATK workflows on AWS: a user-friendly solution

This post was co-authored by Michael DeRan, Scientific Consultant at Diamond Age Data Science; Chris Friedline, Scientific Consultant at Diamond Age Data Science; Netsanet Gebremedhin, Scientific Consultant at Diamond Age Data Science (Computational Biologist at Parexel at time of publication); Jenna Lang, Specialist Solutions Architect at AWS; and Lee Pang, Principal Bioinformatics Architect at AWS.  […]