AWS Public Sector Blog
Tag: geospatial data
39 new or updated datasets available on the Registry of Open Data on AWS
The AWS Open Data Sponsorship Program makes high-value, cloud-optimized datasets publicly available on Amazon Web Services (AWS). AWS works with data providers to democratize access to data by making it available to the public for analysis on AWS, to develop new cloud-based techniques, formats, and tools that lower the cost of working with data, and to encourage the development of communities that benefit from access to shared datasets. Through this program, customers are making more than 100 petabytes (PB) of high-value, cloud-optimized data available for public use. This quarter, AWS released 39 new or updated datasets.
21 new or updated datasets available on the Registry of Open Data on AWS
The Amazon Web Services (AWS) Open Data Sponsorship Program makes high-value, cloud-optimized datasets publicly available on AWS. Through this program, customers are making more than 100 petabytes (PB) of high-value, cloud-optimized data available for public use. This past quarter, AWS released 21 new or updated datasets. What will you build with these datasets?
34 new or updated datasets available on the Registry of Open Data on AWS
The Amazon Web Services (AWS) Open Data Sponsorship Program makes high-value, cloud-optimized datasets publicly available on AWS. Through this program, customers are making more than 100 petabytes (PB) of high-value, cloud-optimized data available for public use. Read this blog post to learn about the 34 new or updated datasets that were released in the first quarter.
SeloVerde uses geospatial big data and AI/ML to monitor deforestation in supply chains, powered by AWS
Open source geospatial artificial intelligence (AI) and machine learning (ML) analyses, together with Internet of Things (IoT)-connected sensors, can power near real-time data built on the cloud and assist in decision-making. Read this blog post to learn how Amazon Web Services (AWS) is supporting the Government of Pará, Brazil, in designing and deploying SeloVerde (Green Seal), a cutting-edge tool to address climate change challenges and traceability in deforestation-risk supply chains.
34 new or updated datasets available on the Registry of Open Data on AWS
This quarter, AWS released 34 new or updated datasets on the Registry of Open Data on AWS. What will you build with these datasets? Read through this blog post for inspiration.
Alteia and the World Bank assess and enhance road infrastructure data quality at scale using AWS
Read this blog post to learn how the World Bank assesses road infrastructure faster and at lower cost by using Alteia data analytics powered by Amazon Web Services (AWS), along with geospatial and satellite imagery available on the Registry of Open Data on AWS.
How Government of Canada customers can use AWS to securely migrate data
Learn how AWS Snowcone and Amazon S3 can help Government of Canada (GC) organizations securely transfer and store their data, and how two GC organizations have already used these services to migrate their data. Find out how these AWS services address data security, privacy, and compliance with regulatory requirements specific to GC customers.
How to store historical geospatial data in AWS for quick retrieval
Learn how to store historical geospatial data, such as weather data, on AWS using Amazon DynamoDB. This approach allows for virtually unlimited data storage combined with query performance fast enough to support an interactive UI. It also supports filtering by date or by location and enables time- and cost-efficient querying.
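As a rough illustration of that pattern (not the post's exact schema), the sketch below assumes a hypothetical DynamoDB table named WeatherHistory with a geohash partition key and an ISO 8601 timestamp sort key, so a single query can pull one location's history for a date range:

```python
import boto3
from boto3.dynamodb.conditions import Key

# Assumed (hypothetical) table layout: partition key "geohash" (string),
# sort key "timestamp" (ISO 8601 string). Keying on a geohash cell lets a
# single Query return one location's readings, filtered by date range.
dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table("WeatherHistory")  # hypothetical table name

def query_location_history(geohash, start_iso, end_iso):
    """Return all readings for one geohash cell within a date range."""
    response = table.query(
        KeyConditionExpression=(
            Key("geohash").eq(geohash)
            & Key("timestamp").between(start_iso, end_iso)
        )
    )
    return response["Items"]

# Example: one week of readings for a cell near San Francisco.
items = query_location_history("9q8yy", "2023-01-01T00:00:00Z", "2023-01-08T00:00:00Z")
```

Because the date range is expressed in the sort key condition, DynamoDB reads only the matching items rather than scanning the table, which is what keeps queries fast and inexpensive as the history grows.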
36 new or updated datasets on the Registry of Open Data: AI analysis-ready datasets and more
This quarter, AWS released 36 new or updated datasets. As July 16 is Artificial Intelligence (AI) Appreciation Day, the AWS Open Data team is highlighting three unique datasets that are analysis-ready for AI. What will you build with these datasets?
Decrease geospatial query latency from minutes to seconds using Zarr on Amazon S3
Geospatial data, including many climate and weather datasets, are often released by government and nonprofit organizations in compressed file formats such as the Network Common Data Form (NetCDF) or GRIdded Binary (GRIB). As the complexity and size of geospatial datasets continue to grow, it is more time- and cost-efficient to leave the files in one place, query the data virtually, and download locally only the subset that is needed. Unlike legacy file formats, the cloud-native Zarr format is designed for virtual and efficient access to compressed chunks of data saved in a central location such as Amazon S3. In this walkthrough, learn how to convert NetCDF datasets to Zarr using an Amazon SageMaker notebook and an AWS Fargate cluster, and then query the resulting Zarr store, reducing the time required for time series queries from minutes to seconds.
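To give a feel for the access pattern the walkthrough describes, here is a minimal sketch (not the post's exact code) that assumes a hypothetical public Zarr store at s3://example-bucket/weather.zarr containing a temperature variable with time, lat, and lon dimensions; it uses the xarray, zarr, and s3fs libraries:

```python
import s3fs
import xarray as xr

# Open a (hypothetical) public Zarr store directly on S3, without downloading it.
fs = s3fs.S3FileSystem(anon=True)  # anonymous access for a public bucket
store = s3fs.S3Map(root="example-bucket/weather.zarr", s3=fs)

# Lazily open the dataset; only metadata is read at this point.
ds = xr.open_zarr(store, consolidated=True)

# Pull a single-point time series: only the compressed chunks covering this
# lat/lon are fetched from S3, which is what turns minutes into seconds.
series = (
    ds["temperature"]
    .sel(lat=40.0, lon=-105.0, method="nearest")
    .sel(time=slice("2020-01-01", "2020-12-31"))
    .load()
)
print(series.values)
```

The NetCDF-to-Zarr conversion step uses the same libraries: open the source file with xarray (xr.open_dataset) and write it out with Dataset.to_zarr to an S3-backed store, choosing chunk sizes that match the expected query pattern.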