Tag: technical how-to
Data lakes are becoming increasingly common across many different workloads, and geospatial is no exception. In 2021, Amazon Web Services (AWS) announced geography and geohash support in Amazon Redshift, so geospatial analysts can quickly and efficiently query geohashed vector data in Amazon Simple Storage Service (Amazon S3). In this blog post, I walk through how to use geohashing with Amazon Redshift partitioning for quick and efficient geospatial data access, analysis, and transformation in your data lake.
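To make the partitioning idea concrete, here is a minimal pure-Python sketch of standard geohash encoding and of deriving an S3 partition prefix from the hash; the `geohash=` key layout and prefix length are illustrative assumptions, not taken from the post, and in practice Amazon Redshift's built-in spatial functions can compute the hash for you.

```python
# Minimal geohash encoder (standard base32 geohash alphabet), plus a sketch
# of using the hash prefix as an S3 partition key. The partition-key layout
# below is a hypothetical example, not the blog post's exact scheme.
BASE32 = "0123456789bcdefghjkmnpqrstuvwxyz"

def geohash_encode(lat: float, lon: float, precision: int = 6) -> str:
    """Interleave longitude/latitude bits, then map each 5-bit group to base32."""
    lat_lo, lat_hi = -90.0, 90.0
    lon_lo, lon_hi = -180.0, 180.0
    bits, even = [], True  # a geohash starts with a longitude bit
    while len(bits) < precision * 5:
        if even:
            mid = (lon_lo + lon_hi) / 2
            if lon >= mid:
                bits.append(1)
                lon_lo = mid
            else:
                bits.append(0)
                lon_hi = mid
        else:
            mid = (lat_lo + lat_hi) / 2
            if lat >= mid:
                bits.append(1)
                lat_lo = mid
            else:
                bits.append(0)
                lat_hi = mid
        even = not even
    return "".join(
        BASE32[int("".join(map(str, bits[i:i + 5])), 2)]
        for i in range(0, precision * 5, 5)
    )

def partition_key(lat: float, lon: float, prefix_len: int = 4) -> str:
    """Derive a coarse partition prefix so nearby points share an S3 prefix."""
    return f"geohash={geohash_encode(lat, lon)[:prefix_len]}/"

# Example: the classic geohash test coordinate (a point near Jutland, Denmark).
print(geohash_encode(57.64911, 10.40744))  # -> "u4pruy"
print(partition_key(57.64911, 10.40744))   # -> "geohash=u4pr/"
```

Because a geohash is prefix-stable, truncating it to a few characters yields a coarser cell that still contains the point, which is what makes it a natural S3 partition key.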
To mitigate synthetic fraud, government agencies should consider complementing their rules-based improper payment detection systems with machine learning (ML) techniques. By using ML on a large number of disparate but related data sources, including social media, agencies can formulate a more comprehensive risk score for each individual or transaction to help investigators identify improper payments efficiently. In this blog post, we provide a foundational reference architecture for an ML-powered improper payment detection solution using AWS ML services.
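As a minimal sketch of the scoring idea, the snippet below combines several normalized risk signals into one composite score with a logistic squash. The signal names, weights, and bias are hypothetical placeholders; a production solution would learn them from data with AWS ML services rather than hand-code them.

```python
import math

# Illustrative only: fuse 0-1 risk signals from disparate sources into a
# single score. Feature names and weights are hypothetical, not from the post.
WEIGHTS = {
    "identity_mismatch": 2.0,     # e.g., SSN issued after reported birth year
    "address_velocity": 1.2,      # many claimants sharing one address
    "social_graph_anomaly": 0.8,  # e.g., unusual social media linkage
}
BIAS = -3.0  # keeps the baseline score low when no signals fire

def risk_score(signals: dict) -> float:
    """Weighted sum of signals, squashed to a 0-1 risk score via a sigmoid."""
    z = BIAS + sum(WEIGHTS[name] * signals.get(name, 0.0) for name in WEIGHTS)
    return 1.0 / (1.0 + math.exp(-z))

clean = risk_score({})  # no signals fired
suspect = risk_score({"identity_mismatch": 1.0, "address_velocity": 1.0})
print(f"clean={clean:.3f} suspect={suspect:.3f}")
```

Investigators could then triage by score, reviewing only transactions above a chosen threshold.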
In a data-dependent world, success belongs to the side with decision advantage: the ability to acquire data, make sense of a complex and adaptive environment, and act smarter and faster than the competition. Understanding global environments requires more than just more data: it requires live two- and three-dimensional maps, new support tools, improved processes, seamless connectivity, and better collaboration that can scale to the needs of the environment. This blog post explores how to address big data challenges and accelerate time to data insights by running machine learning on AWS Snowball Edge devices deployed at the edge.
Increasingly, AWS customers are operating workloads both in AWS GovCloud (US) and standard AWS Regions. Dependencies between workloads, changing data controls, or enrichment of data across multiple data levels are examples of business needs that may require moving data in and out of AWS GovCloud (US). In this blog post, I explain how to move data between Amazon Simple Storage Service (Amazon S3) buckets in the AWS GovCloud (US) and standard partitions.
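Because the two partitions do not share IAM principals, one common pattern is to read with credentials from one partition and write with credentials from the other. The sketch below is an illustration of that pattern under stated assumptions, not the post's exact method: it takes two pre-configured S3 clients and streams objects between buckets, and the profile and bucket names in the comment are placeholders.

```python
def copy_prefix(src_client, dst_client, src_bucket, dst_bucket, prefix=""):
    """Stream every object under `prefix` from the source bucket to the
    destination bucket, paginating through the listing. Accepts any clients
    that expose the boto3 S3 surface used below, so the two clients can be
    built from separate GovCloud and standard-partition credentials."""
    copied = 0
    for page in src_client.get_paginator("list_objects_v2").paginate(
        Bucket=src_bucket, Prefix=prefix
    ):
        for obj in page.get("Contents", []):
            body = src_client.get_object(Bucket=src_bucket, Key=obj["Key"])["Body"]
            dst_client.put_object(
                Bucket=dst_bucket, Key=obj["Key"], Body=body.read()
            )
            copied += 1
    return copied

# With real credentials (hypothetical profile and bucket names):
#   import boto3
#   gov = boto3.session.Session(profile_name="govcloud").client("s3")
#   std = boto3.session.Session(profile_name="standard").client("s3")
#   copy_prefix(gov, std, "my-govcloud-bucket", "my-standard-bucket", "exports/")
```

Note that reading each object into memory keeps the sketch simple; very large objects would call for multipart streaming instead.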
As public sector customers increasingly move data between the AWS GovCloud (US) partition and the standard partition, they need tools that lower the operational burden of doing so. In this blog post, I walk through how to use AWS DataSync to move data on network file system (NFS) shares between the two partitions.
In a recent disaster response field testing exercise (FTX), the AWS Global Social Impact Solutions (GSI) team developed a prototype cloud architecture and tested it in a search and rescue (SAR) scenario simulating a missing responder crisis. This blog post walks through the SAR simulation and its results, and provides an overview of the AWS services and technical architecture components the GSI team used to provide a hybrid edge/cloud common operating picture (COP) solution that helped locate the missing team member in the simulated scenario within 20 minutes.
In this blog post, we explain how government agencies can accelerate their development workflows while maintaining strict application and operational security using the principles of continuous integration and continuous delivery (CI/CD) and DevSecOps. We provide a solution to walk you through how you can quickly set up your own DevSecOps pipeline that incorporates AWS and third-party security tools to give you a fast, flexible, and secure software delivery process.
Geospatial datasets are increasingly large, reaching terabyte and even petabyte scale, which can cause many challenges for geospatial analysts and educators, but Amazon AppStream 2.0 can provide some solutions. In this blog post, we walk through how to deploy QGIS, a no-cost, open-source geographic information system (GIS) application used by geospatial analysts, in Amazon AppStream 2.0. We also load an example dataset to demonstrate how desktop GIS application users can access large, cloud-hosted geospatial datasets with high performance by keeping the data and compute together in the cloud and streaming the desktop application instead of downloading the data itself.
Galaxy is a scientific workflow, data integration, and digital preservation platform that aims to make computational biology accessible to research scientists who do not have computer programming or systems administration experience. Although it was initially developed for genomics research, it is largely domain agnostic and is now used as a general bioinformatics workflow management system, running on everything from academic mainframes to personal computers. But researchers and organizations with limited or restrictive budgets may worry about capacity and access to compute power. In this blog post, we explain how to implement Galaxy on the cloud at a predictable cost within your research or grant budget with Amazon Lightsail.
Many nonprofits and other tax-exempt organizations need to make sure their tax status is correct across their Amazon Web Services (AWS) accounts. A new tax analyzer solution automatically detects the tax status of all AWS accounts across an organization. In this blog post, discover how this simple solution identifies which AWS accounts are paying sales tax, and learn how it can quickly remediate tax status by automatically opening an AWS Support case.