
Guidance for Analyzing and Visualizing American Community Survey (ACS) Datasets

Overview

This Guidance shows how to build an Amazon QuickSight dashboard to visualize data from the 5-year American Community Survey (ACS). As the US Census Bureau’s largest household survey, the ACS samples 3.5 million addresses per year, and its data influences the distribution of $675 billion in federal funding annually. By building a complete analytics pipeline on AWS, your nonprofit can better analyze ACS data and visualize it through maps and infographics. With a more thorough understanding of the economic and social characteristics of your community, you can make data-driven decisions and create better programs and services to meet community needs.
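As a concrete starting point, the sketch below shows one way to request ACS 5-year data from the Census Bureau's public API before it enters the pipeline. The function names and the choice of variable (B19013_001E, median household income) are illustrative assumptions, not part of this Guidance:

```python
import urllib.parse

# Hypothetical helper: build a request URL for the Census Bureau's ACS
# 5-year Detailed Tables API. B19013_001E (median household income) is a
# real ACS variable code; swap in the variables your analysis needs.
ACS_BASE_URL = "https://api.census.gov/data/{year}/acs/acs5"

def build_acs_url(year, variables, state_fips, api_key=None):
    """Compose an ACS 5-year API URL for all counties in one state."""
    params = {
        "get": ",".join(["NAME"] + list(variables)),
        "for": "county:*",
        "in": f"state:{state_fips}",
    }
    if api_key:
        params["key"] = api_key
    return ACS_BASE_URL.format(year=year) + "?" + urllib.parse.urlencode(params)

def rows_to_records(rows):
    """The API returns a JSON array whose first row is the header."""
    header, *data = rows
    return [dict(zip(header, row)) for row in data]
```

A Lambda function in the pipeline could fetch `build_acs_url(2022, ["B19013_001E"], "36")` and pass the parsed rows through `rows_to_records` before writing them to Amazon S3.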

How it works

The architecture diagram in this section illustrates how the solution works, showing the key components, how they interact, and the flow of data through the pipeline step by step.

Well-Architected Pillars

The architecture diagram above is an example of a Solution created with Well-Architected best practices in mind. To be fully Well-Architected, follow as many of these best practices as possible.

This Guidance supports operational excellence through automation: orchestrating systems, automating changes, and reducing operational overhead. As managed services, Lambda and AWS Glue let you run code without provisioning servers, providing an automated, scalable process for capturing and transforming ACS data. AWS CloudFormation automates the deployment of the AWS resources in this Guidance, AWS Glue crawlers and AWS Glue Studio automate the extract, transform, load (ETL) process, and Step Functions orchestrates the flow of the Lambda functions in that process.
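To make the orchestration concrete, the following sketch builds a minimal Amazon States Language (ASL) definition that chains three hypothetical Lambda functions (fetch, transform, load). The state names, retry settings, and ARNs are placeholder assumptions, not the Guidance's actual workflow:

```python
import json

# Minimal ASL sketch of an ETL workflow that chains three Lambda functions.
# All names and ARNs below are hypothetical placeholders.
def build_etl_state_machine(fetch_arn, transform_arn, load_arn):
    """Chain three Lambda functions: fetch ACS data, transform it, load to S3."""
    return {
        "Comment": "Orchestrate the ACS ETL Lambda functions",
        "StartAt": "FetchAcsData",
        "States": {
            "FetchAcsData": {
                "Type": "Task",
                "Resource": fetch_arn,
                "Next": "TransformData",
                # Retry transient failures before failing the execution.
                "Retry": [{
                    "ErrorEquals": ["States.TaskFailed"],
                    "IntervalSeconds": 5,
                    "MaxAttempts": 2,
                }],
            },
            "TransformData": {
                "Type": "Task",
                "Resource": transform_arn,
                "Next": "LoadToS3",
            },
            "LoadToS3": {
                "Type": "Task",
                "Resource": load_arn,
                "End": True,
            },
        },
    }
```

The resulting dictionary serializes with `json.dumps` into the definition string that a Step Functions state machine (for example, one declared in a CloudFormation template) accepts.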

Read the Operational Excellence whitepaper

Amazon S3 enables you to encrypt data and control access. Other services, such as Lambda and Athena, integrate seamlessly with AWS Identity and Access Management (IAM) to help secure the processing and querying by regulating access and permissions.
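As an illustration of that access control, here is a hedged sketch of a least-privilege IAM policy granting read-only access to a hypothetical bucket of processed ACS data; the actual policies in this Guidance would be scoped to the roles CloudFormation deploys:

```python
# Hypothetical least-privilege IAM policy: read-only access to the bucket
# holding processed ACS data. The bucket name is a placeholder.
def read_only_s3_policy(bucket_name):
    """Return an IAM policy document allowing listing and object reads only."""
    return {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Sid": "ListAcsBucket",
                "Effect": "Allow",
                "Action": ["s3:ListBucket"],
                # ListBucket applies to the bucket ARN itself.
                "Resource": f"arn:aws:s3:::{bucket_name}",
            },
            {
                "Sid": "ReadAcsObjects",
                "Effect": "Allow",
                "Action": ["s3:GetObject"],
                # GetObject applies to the objects within the bucket.
                "Resource": f"arn:aws:s3:::{bucket_name}/*",
            },
        ],
    }
```

Attaching a policy like this to the execution roles for Lambda and Athena keeps each component limited to the data it needs.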

Read the Security whitepaper

Amazon S3 is designed for 99.999999999 percent (11 nines) data durability, minimizing the risk of data loss so that you have dependable access to data insights. Lambda and AWS Glue are managed services with built-in fault tolerance and scalability. Additionally, the services used in this Guidance are highly available across multiple Availability Zones (AZs), making the solution reliable and providing automatic recovery from failures.

Read the Reliability whitepaper

This Guidance uses services that enable fast processing, analysis, and visualization, and its serverless services remove the need for you to run and maintain physical servers. CloudFormation automates and streamlines the provisioning of AWS services and resources, making it easier for you to take advantage of the cloud even if you don’t have a technical background. Lambda provides on-demand compute resources for data processing, and Step Functions helps you efficiently sequence Lambda functions into the ETL workflow. Finally, Athena provides serverless querying, further enhancing efficiency.
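For example, an Athena query against the table that an AWS Glue crawler cataloged might be composed like this. The database, table, and column names are assumptions for illustration (B19013_001E is the ACS median household income variable):

```python
# Hypothetical query builder for the Glue-cataloged ACS table in Athena.
# Database, table, and column names are placeholders; yours will match
# whatever the Glue crawler registered in the Data Catalog.
def median_income_query(database, table, state_fips, limit=10):
    """Rank counties in one state by median household income (B19013_001E)."""
    return f"""
        SELECT "NAME",
               CAST("B19013_001E" AS integer) AS median_income
        FROM "{database}"."{table}"
        WHERE "state" = '{state_fips}'
        ORDER BY median_income DESC
        LIMIT {limit}
    """.strip()
```

The resulting SQL string can be submitted through the Athena console or an API call, and the result set then feeds a QuickSight dataset.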

Read the Performance Efficiency whitepaper

As a nonprofit, your organization needs to be especially cost conscious to make sure it can properly fund its mission-critical work. This Guidance helps you avoid unnecessary costs and use resources efficiently. For example, Amazon S3 provides pay-as-you-go pricing, meaning you pay only for the storage and requests you consume, aligning your costs with your needs, and QuickSight lets you pay per session. Additionally, Lambda, AWS Glue, and Athena are serverless, scaling up or down as needed to help you avoid idle resource costs.

Read the Cost Optimization whitepaper

This Guidance dynamically allocates resources as needed to reduce energy waste from idle servers. For example, serverless services like Lambda automatically scale compute resources based on your workload demand, helping you avoid overprovisioning resources to reduce your energy consumption. Additionally, Amazon S3 optimizes resource usage through high storage density, enabling you to store more data with a smaller physical footprint, further minimizing your environmental impact.

Read the Sustainability whitepaper

Disclaimer

The sample code, software libraries, command line tools, proofs of concept, templates, or other related technology (including any of the foregoing that are provided by our personnel) is provided to you as AWS Content under the AWS Customer Agreement, or the relevant written agreement between you and AWS (whichever applies). You should not use this AWS Content in your production accounts, or on production or other critical data. You are responsible for testing, securing, and optimizing the AWS Content, such as sample code, as appropriate for production grade use based on your specific quality control practices and standards. Deploying AWS Content may incur AWS charges for creating or using AWS chargeable resources, such as running Amazon EC2 instances or using Amazon S3 storage.