AWS Public Sector Blog
Tag: big data
Preventing the next pandemic: How researchers analyze millions of genomic datasets with AWS
How do we avoid the next global pandemic? For researchers collaborating with the University of British Columbia Cloud Innovation Center (UBC CIC), the answer to that question lies in a massive library of genetic sequencing data. But there is a problem: the library is so vast that traditional computing can’t comprehensively analyze or process it. So the UBC CIC team collaborated with computational virologists to create Serratus, an open-science viral discovery platform to transform the field of genomics—built on the massive computational power of the Amazon Web Services (AWS) Cloud.
Analyze terabyte-scale geospatial datasets with Dask and Jupyter on AWS
Terabytes of Earth Observation (EO) data are collected each day, quickly leading to petabyte-scale datasets. By bringing these datasets to the cloud, users can take advantage of cloud compute and analytics resources that reliably scale with growing needs. In this post, we show you how to set up a Pangeo solution with Kubernetes, Dask, and Jupyter notebooks step-by-step on Amazon Web Services (AWS), to automatically scale cloud compute resources and parallelize workloads across multiple Dask worker nodes.
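The parallelization idea behind the Pangeo setup can be sketched in a few lines of Dask. This is a minimal illustration, not the post's actual EO workflow: a large array is split into chunks, and Dask computes each chunk independently (across worker nodes in a real cluster, or local threads here), then combines the partial results.

```python
import dask.array as da

# Lazily define a large array split into 100 chunks; nothing is
# computed yet — Dask only builds a task graph.
x = da.random.random((10_000, 10_000), chunks=(1_000, 1_000))

# Each chunk's partial mean is computed in parallel, then the
# partial results are combined into a single value.
result = x.mean().compute()

# Uniform random values in [0, 1) have an expected mean of 0.5.
print(result)
```

On a cluster, pointing a `dask.distributed.Client` at the scheduler before calling `.compute()` is enough to spread the same task graph across the worker nodes, which is what the Kubernetes setup in the post automates.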
How MTI tracks social distancing efforts with the AWS Cloud and big data
Maryland Transportation Institute (MTI), an interdisciplinary research and education organization based at the University of Maryland, focuses on solving complex transportation problems. When COVID-19 hit, MTI was presented with an urgent new problem: the organization was tasked with gathering, processing, and reporting daily transportation data covering nearly 65% of the US population. To keep the public safe, they needed more computing power, and quickly. They turned to the AWS Cloud.
Using big data to help governments make better policy decisions
In Europe, government agencies and policy makers see the value in using new technology to unlock digital transformation and deliver better, more innovative citizen services. Data-driven statistics initiatives, including open data, can help researchers produce innovative products and visualisation tools that inform government officials before they make policy decisions that impact their citizens. When it comes to big data, policy makers need to collaborate with researchers to address the issues and challenges of using these new data sources. To work toward this goal, Eurostat, the statistical office of the EU, hosted its bi-annual European Big Data Hackathon.
Open data helps recovery in the aftermath of devastating weather events
Severe and extreme weather events not only wreak havoc on lives, property, and the economy, but the extent of the destruction and devastation left behind can be difficult to map and quantify. Having high-resolution imagery of areas devastated by weather events (hurricanes, tornadoes, floods, etc.) helps to characterize impacts, formulate needed recovery and response activities, support emergency managers in saving lives, and restart the flow of commerce. NOAA’s data plays a critical role in this process. As part of the Amazon Sustainability Data Initiative (ASDI), we invited Jena Kent from NOAA’s Big Data Program to share how AWS is helping with disaster response by providing access to aerial data and imagery through open data initiatives.
BDP now supports securing, monitoring, and reporting on AWS GovCloud (US)
Enlighten IT Consulting, an AWS Managed Service Partner, is the prime integrator for the Department of Defense (DoD) Big Data Platform (BDP). BDP is a fully accredited government-owned platform used for storing, querying, and analyzing cyber data that runs at petabyte scale in AWS GovCloud (US). The BDP has been adopted by multiple defense and federal civilian agencies as their primary security information and event management (SIEM) solution due to its ability to support petabyte-scale data ingest, processing, storage, and visualization without the burden of licensing costs or lock-in.
September 2018 Top Blog Roundup
September was monumental for the blog. Thanks to the continuous innovation of our customers, the blog topped 500 posts since launch! Here are the top September stories that helped us get there.
Analytics Without Limits: FINRA’s Scalable and Secure Big Data Architecture – Part 1
A guest post by John Brady, CISSP, VP Cyber Security/CISO, Financial Industry Regulatory Authority The Financial Industry Regulatory Authority (FINRA) oversees more than 3,900 securities firms with approximately 640,000 brokers. Every day, we watch over nearly 6 billion shares traded in U.S. equities markets—using technology powerful enough to help detect fraud, abuse, and insider trading. In […]
Data Lakes for HHS: Unlocking Data to Gain New Insight
Health and Human Services (HHS) agencies collect vast amounts of data that, however abundant, often fail to paint the whole picture. Now, through the use of data lakes, healthcare agencies across the world are connecting disparate datasets and drawing new insights from years of data. HHS is in the midst of a digital […]
Amazon Web Services and the National Science Foundation Spur Innovation in Big Data Research
The AWS Research Initiative (ARI) brings Amazon Web Services (AWS) and the National Science Foundation (NSF) together to spur innovation in big data research. Under the program on Critical Techniques, Technologies and Methodologies for Advancing Foundations and Applications of Big Data Sciences and Engineering (BIGDATA), NSF will fund a total of $26.5 million […]