AWS Public Sector Blog

Tag: research

Data egress waiver available for eligible researchers and institutions

The Global Data Egress Waiver (GDEW) program helps eligible researchers and academic institutions use AWS cloud storage, computing, and database services by waiving data egress fees. GDEW can be a valuable tool that gives eligible researchers and institutions a more predictable budget, which in turn allows them more direct access to the cloud than they might otherwise have. Find out if your team is eligible to take advantage of the data egress waiver program.

Read More

Driving innovation in single-cell analysis on AWS

Computational biology is undergoing a revolution. However, the analysis of single cells is a hard problem to solve. Standard statistical techniques used in genomic analysis fail to capture the complexity present in single-cell datasets. Open Problems in Single-Cell Analysis is a community-driven effort using AWS to drive the development of novel methods that leverage the power of single-cell data.

Read More

Paris-Saclay University uses AWS to advance data science through collaborative challenges

This is a guest post by Maria Teleńczuk, research engineer at the Paris-Saclay Center for Data Science (CDS), and Alexandre Gramfort, senior research scientist at INRIA, the French National Institute for Research in Digital Science and Technology. Maria and Alexandre explain how they adapted their open source data challenge platform RAMP to train the models submitted by student challenge participants using Amazon Elastic Compute Cloud (Amazon EC2) Spot instances, and how they leveraged AWS to support three student challenges.

Read More

Assessing the ocean’s health by monitoring shark populations

OCEARCH is a data-centric organization built to help scientists collect previously unattainable data about the ocean. Their mission is to accelerate the ocean’s return to balance and abundance, through innovation in scientific research, education, outreach, and policy, using unique collaborations of individuals and organizations in the US and abroad. As part of the Amazon Sustainability Data Initiative (ASDI), we invited Fernanda Ubatuba, president and COO at OCEARCH, to share how her organization is making strides in helping ocean conservation and how AWS is supporting her mission.

Read More

Using big data to help governments make better policy decisions

In Europe, government agencies and policy makers see the value in using new technology to unlock digital transformation and deliver better, more innovative citizen services. Data-for-statistics initiatives, including open data, can help researchers produce innovative products and tools, such as visualisations, that inform government officials before they make policy decisions affecting their citizens. When it comes to big data, policy makers need to collaborate with researchers to address the issues and challenges of using these new data sources. To work toward this goal, Eurostat, the statistical office of the EU, hosted its biennial European Big Data Hackathon.

Read More

Accelerating genome assembly with AWS Graviton2

One of the biggest scientific achievements of the twenty-first century was the completion of the Human Genome Project and the publication of a draft human genome. The project took over 13 years to complete and remains one of the largest public-private international collaborations ever. Subsequent advances in sequencing technologies, computational hardware, and novel algorithms have reduced the time it takes to produce a human genome assembly to only a few days, at a fraction of the cost. This has made using the human genome draft for precision and personalized medicine more achievable. In this blog, we demonstrate how to do a genome assembly in the cloud in a cost-efficient manner using ARM-based AWS Graviton2 instances.

Read More

Modeling clouds in the cloud for air pollution planning: 3 tips from LADCO on using HPC

In the spring of 2019, environmental modelers at the Lake Michigan Air Directors Consortium (LADCO) had a new problem to solve. Emerging research on air pollution along the shores of the Great Lakes in the United States showed that properly simulating pollution episodes in the region required applying their models at a finer spatial granularity than the computational capacity of their in-house HPC cluster could handle. The LADCO modelers turned to AWS ParallelCluster to access the HPC resources needed to run this modeling faster and scale for their member states.

Read More

Inside a self-service cloud research computing platform: How RONIN is built on AWS

RONIN is an AWS Partner solution that empowers researchers with a simple interface to create and control computing resources, set and monitor budgets, and forecast spend. RONIN is designed and architected to advance research institutions’ missions, by providing a research platform that manages the most common research use cases, and is also compatible with advanced cloud computing services from AWS. Learn what powers RONIN underneath the user-friendly interface.

Read More

Building the first quantum computing applications lab in India

The Ministry of Electronics and Information Technology (MeitY) in India is establishing a Quantum Computing Applications Lab, in collaboration with AWS, to accelerate quantum computing-led research and development and enable new scientific discoveries. This is MeitY’s first initiative in the country to provide scientific, academic, and developer communities with access to a quantum computing development environment in the cloud. This is also the first quantum computing applications lab on AWS Cloud to support a government’s science and technology mission at a country level.

Read More

Sharing SAS data with Athena and ODBC

If you share data with other researchers, especially if they are using a different tool, you can quickly run into version issues, not knowing which file is the most current. Rather than sending data files everywhere, AWS offers a simple way to store your data in one central location so that you can read your data into SAS and still share it with other colleagues. In this blog post, I will explain how to export your data, store it in AWS, and query the data using SAS.

Read More