AWS Government, Education, & Nonprofits Blog

Tag: open data

Keeping a SpatioTemporal Asset Catalog (STAC) Up To Date with SNS/SQS

The SpatioTemporal Asset Catalog (STAC) specification aims to standardize the way geospatial assets are exposed online and queried. The China-Brazil Earth Resources Satellites (CBERS) are the result of a cooperation agreement between the Brazilian and Chinese space agencies (INPE and CAST, respectively) that started in 1988. Since then, five satellites have been launched (CBERS-1/2/2B/3/4). The mission generates images of Earth with characteristics similar to those of USGS’ Landsat and ESA’s Sentinel-2 missions. In 2004, INPE announced that all CBERS-2 images would be available to the public at no charge, the first time this distribution model was used for medium-resolution satellite imagery. The same model is now used for all CBERS satellite images.
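
For readers who want a sense of how such a pipeline fits together, here is a minimal sketch of the pattern in the post's title: an SQS queue subscribed to an SNS topic that announces new scenes, polled by a small worker that turns each notification into a STAC item. The queue URL, notification fields, and item layout below are illustrative assumptions, not the CBERS project's actual schema.

```python
# Minimal sketch: poll an SQS queue that is subscribed to an SNS topic
# announcing new scenes, and turn each notification into a STAC item stub.
# Queue URL and payload fields are assumptions for illustration only.
import json
import boto3

QUEUE_URL = "https://sqs.us-east-1.amazonaws.com/123456789012/new-scenes"  # hypothetical

sqs = boto3.client("sqs", region_name="us-east-1")

def poll_once():
    resp = sqs.receive_message(
        QueueUrl=QUEUE_URL,
        MaxNumberOfMessages=10,
        WaitTimeSeconds=20,  # long polling
    )
    for msg in resp.get("Messages", []):
        envelope = json.loads(msg["Body"])       # SNS-to-SQS delivery wraps the payload
        scene = json.loads(envelope["Message"])  # assumed: JSON scene metadata
        item = {
            "type": "Feature",
            "stac_version": "1.0.0",
            "id": scene.get("scene_id", "unknown"),
            "properties": {"datetime": scene.get("acquisition_date")},
            "geometry": scene.get("geometry"),
            "assets": {},  # links to the scene's files would go here
        }
        print("Would upsert STAC item:", item["id"])
        # Once the catalog is updated, remove the message from the queue.
        sqs.delete_message(QueueUrl=QUEUE_URL, ReceiptHandle=msg["ReceiptHandle"])

if __name__ == "__main__":
    poll_once()
```

Because SNS fans out to any number of SQS subscribers, several independent catalogs or processing pipelines can react to the same new-scene notification without coordinating with one another.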

Read More

Best Practices in Ethical Data-Sharing: An Interview with Natalie Evans Harris

The AWS Institute interviewed Natalie Evans Harris, co-founder and CEO of BrightHive and former senior policy advisor to the US Chief Technology Officer in the Obama administration. Harris founded the Data Cabinet, a federal data science community of practice with over 200 active members across more than 40 federal agencies, co-led a cohort of federal, nonprofit, and for-profit organizations to develop data-driven tools through the Opportunity Project, and established the Open Skills Community through the Workforce Data Initiative. She also led an analytics development center for the National Security Agency (NSA) that served as the foundation for NSA’s Enterprise Data Science Development program and became a model for other intelligence community agencies.

Read More

The Amazon Sustainability Data Initiative: Driving Sustainability Innovation with Open Data and Cloud Technology

Amazon today announced the Amazon Sustainability Data Initiative to promote sustainability research, innovation, and problem solving by making key data easily accessible and even more widely available. The Amazon Sustainability Data Initiative leverages Amazon Web Services’ technology and scalable infrastructure to stage, analyze, and distribute data, and is a joint effort between the AWS Open Data and Amazon Sustainability teams.

Read More

StormSense: Automated Flood Alerts Using Integrated Real-Time IoT Sensors

Coastal communities in the Southern United States are frequently impacted by flooding from storm surge, rain, and tides. To help monitor flooding and enhance emergency preparedness, the Virginia Institute of Marine Science (VIMS) at the College of William & Mary has been providing tidal forecasts since 2012 for a dozen locations in the lower Chesapeake Bay through its VIMS TideWatch Network. To expand and enhance these capabilities along Virginia’s seaside Eastern Shore, VIMS developed StormSense. The StormSense project works closely with local coastal governments, leveraging a network of Internet of Things (IoT)-enabled water level sensors, VIMS’s hydrodynamic flood modeling and forecasting capabilities, and the VIMS TideWatch Network to improve flood resilience in the region.
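
As an illustration of the ingestion side of a sensor network like this, the sketch below publishes a single water-level reading to an AWS IoT Core MQTT topic. The topic name, payload fields, and station identifier are assumptions for illustration and not the StormSense schema.

```python
# Minimal sketch: publish one water-level reading to an AWS IoT Core MQTT topic.
# Topic name, payload fields, and station ID are illustrative assumptions.
import json
from datetime import datetime, timezone

import boto3

iot = boto3.client("iot-data")  # may need endpoint_url set to your account's IoT data endpoint

def publish_reading(station_id: str, water_level_m: float) -> None:
    payload = {
        "station_id": station_id,
        "water_level_m": water_level_m,
        "observed_at": datetime.now(timezone.utc).isoformat(),
    }
    iot.publish(
        topic=f"stormsense/{station_id}/water-level",  # hypothetical topic
        qos=1,
        payload=json.dumps(payload),
    )

publish_reading("vims-001", 1.42)
```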

Read More

How to Share Data (Hint: “Thoughtfully”)

Sharing data requires more than just making it available for download or creating an API to access it. In many ways, sharing data is similar to shipping a software product. Just like software, data is made up of digital information; it requires documentation; it will be used by groups of users who may require support; and it may become vital to those users’ work. Another common characteristic of software is that it often gets updated over time as developers learn from their users and adapt to new technologies.

Read More

Why Share Data?

As open data policies become commonplace, it is worth examining the history and value of open data and discussing why we share it in the cloud. The idea of sharing data dates back at least to the 1950s, when the International Council of Scientific Unions established World Data Centers to facilitate sharing of data among scientists. In recent years, governments have created open data policies that require government agencies to share data with the public.

Read More

Estimating Hurricane Wind Speeds with Machine Learning

Better estimates of hurricane wind speeds can lead to better decisions around evacuations and general hurricane response planning, saving both lives and property. Hurricane wind speed estimates are currently made using the manual Dvorak technique and are released by the National Hurricane Center every three to six hours. Artificial intelligence (AI) experts with the IMPACT team at NASA’s Marshall Space Flight Center and Development Seed created the Deep Learning-Based Hurricane Intensity Estimator to automate this process.
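
To make the idea concrete, below is a purely illustrative sketch of a convolutional regression network that maps a single-channel satellite image patch to a wind speed value. It is not the IMPACT/Development Seed model; the architecture, input size, and units are assumptions.

```python
# Illustrative sketch only: a small convolutional network that regresses a
# wind speed estimate from a single-channel satellite image patch.
import torch
import torch.nn as nn

class IntensityRegressor(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(32, 1)  # predicted wind speed (e.g., in knots)

    def forward(self, x):
        x = self.features(x)
        return self.head(x.flatten(1))

model = IntensityRegressor()
fake_ir_patch = torch.randn(1, 1, 128, 128)  # placeholder for an infrared image patch
print(model(fake_ir_patch).item())
```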

Read More

The ERA5 Reanalysis Dataset Provides a Sharper View of Past Weather

Reanalysis is the term for using modern-day technology to analyze weather data from the past. By doing so, meteorologists and climatologists can produce a more accurate analysis of previous weather conditions, which is important for climate change research. The European Centre for Medium-Range Weather Forecasts (ECMWF) is producing its latest reanalysis dataset, called ERA5. Recently, Chris Kalima and his team at Intertrust, in conjunction with the AWS Public Datasets Program, have been working to bring the ERA5 data to AWS.
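
As a rough illustration of what working with the hosted dataset can look like, the sketch below opens a single NetCDF object directly from S3 with xarray. The bucket name and key layout are assumptions and should be checked against the dataset's documentation.

```python
# Minimal sketch: read one ERA5 NetCDF object straight from S3 with xarray.
# Bucket name and key layout are assumptions, not the documented structure.
import s3fs
import xarray as xr

fs = s3fs.S3FileSystem(anon=True)  # public bucket, so anonymous access
key = "era5-pds/2008/01/data/air_temperature_at_2_metres.nc"  # assumed layout

with fs.open(key, "rb") as f:
    ds = xr.open_dataset(f, engine="h5netcdf")  # assumes NetCDF-4/HDF5 files
    print(ds)  # dimensions, coordinates, and variables
    first_var = next(iter(ds.data_vars))
    print(ds[first_var].mean().values)  # quick sanity check on the data
```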

Read More