This Guidance demonstrates how nonprofits can build a single system of record for donor and member data drawn from multiple disparate sources, and build trend dashboards, using an AWS data lake and AI/ML services.
Architecture Diagram
Step 1
Donor and member data is collected from multiple data sources across the enterprise and from software as a service (SaaS) applications. Data from fundraising and campaign management tools, registries, and subscriptions – primarily CRM and third-party data – can also be collected.
Step 2
Based on the type of data source, AWS Database Migration Service (AWS DMS), AWS DataSync, and Amazon AppFlow are used to ingest the donor/member data into a data lake in AWS.
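As a sketch of the Amazon AppFlow ingestion path, the snippet below builds a request payload for the `appflow.create_flow` boto3 API that copies CRM contact records into the data lake's S3 bucket. The connection name, bucket, prefix, and field list are illustrative assumptions, not part of this Guidance.

```python
# Illustrative Amazon AppFlow flow definition: ingest donor contact records
# from an assumed Salesforce connection into the data lake's S3 bucket.
# All names (connection, bucket, prefix, fields) are placeholders.

def build_donor_ingestion_flow(flow_name: str, bucket: str, prefix: str) -> dict:
    """Return a request payload suitable for appflow.create_flow (boto3)."""
    return {
        "flowName": flow_name,
        # OnDemand keeps the sketch simple; a scheduled trigger is also possible.
        "triggerConfig": {"triggerType": "OnDemand"},
        "sourceFlowConfig": {
            "connectorType": "Salesforce",
            "connectorProfileName": "donor-crm",  # assumed connection name
            "sourceConnectorProperties": {"Salesforce": {"object": "Contact"}},
        },
        "destinationFlowConfigList": [
            {
                "connectorType": "S3",
                "destinationConnectorProperties": {
                    "S3": {"bucketName": bucket, "bucketPrefix": prefix}
                },
            }
        ],
        "tasks": [
            {
                "sourceFields": ["Id", "Email", "Name"],
                "taskType": "Map_all",
                "connectorOperator": {"Salesforce": "NO_OP"},
            }
        ],
    }

flow = build_donor_ingestion_flow("donor-contacts-daily", "nonprofit-datalake", "raw/crm/")
```

In practice you would pass this payload to `boto3.client("appflow").create_flow(**flow)`; the same pattern applies to other source connectors.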
Step 3
AWS Data Exchange can be optionally used for integrating third-party data such as taxpayer status, high net worth status, and propensity to donate data into the data lake.
Step 4
AWS Lake Formation is used to build the scalable data lake for nonprofits, and Amazon Simple Storage Service (Amazon S3) is used for data lake storage.
Step 5
AWS Lake Formation is also used to enable unified governance to centrally manage security, access control (table, row, and column-level security), and audit trails based on nonprofits’ needs. It also enables automatic schema discovery and conversion to the required format.
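To illustrate column-level access control, the sketch below builds a payload for the Lake Formation `grant_permissions` boto3 API, limiting a principal to selected columns of a donor table. The role ARN, database, table, and column names are assumptions for the example.

```python
# Sketch of a Lake Formation column-level grant. The principal, database,
# table, and column names are illustrative placeholders.

def build_column_grant(principal_arn: str, db: str, table: str, columns: list) -> dict:
    """Return a payload for lakeformation.grant_permissions that restricts
    the principal to SELECT on specific columns of a table."""
    return {
        "Principal": {"DataLakePrincipalIdentifier": principal_arn},
        "Resource": {
            "TableWithColumns": {
                "DatabaseName": db,
                "Name": table,
                "ColumnNames": columns,
            }
        },
        "Permissions": ["SELECT"],
    }

grant = build_column_grant(
    "arn:aws:iam::111122223333:role/analyst",  # assumed analyst role
    "donors_db", "donors", ["email", "giving_tier"],
)
```

You would pass this payload to `boto3.client("lakeformation").grant_permissions(**grant)`; row-level security is configured analogously with data filters.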
Step 6
AWS Glue is used to extract, transform, catalog, and ingest data across multiple data stores. AWS Glue DataBrew can be used for visual data preparation such as donor insights. AWS Lambda can be used for enrichment and validation.
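The Lambda enrichment and validation step might look like the pure-Python sketch below, which rejects records with malformed email addresses and derives a giving tier for later dashboards. Field names and the tier threshold are assumptions, not part of the Guidance.

```python
import re

def validate_and_enrich(record: dict) -> dict:
    """Validate a raw donor record and enrich it with derived fields.
    Field names ('email', 'lifetime_giving') and the $10,000 major-donor
    threshold are illustrative assumptions."""
    email = (record.get("email") or "").strip().lower()
    # Reject records whose email does not look like addr@domain.tld.
    if not re.fullmatch(r"[^@\s]+@[^@\s]+\.[^@\s]+", email):
        return {**record, "valid": False}
    enriched = {**record, "email": email, "valid": True}
    # Derive a coarse giving tier used later in dashboards.
    total = float(record.get("lifetime_giving", 0))
    enriched["tier"] = "major" if total >= 10_000 else "standard"
    return enriched
```

A Lambda function invoked from the Glue pipeline would apply this to each record before it lands in the curated zone of the data lake.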
Step 7
Amazon QuickSight provides machine learning (ML)-powered business intelligence, such as donor/member dashboards. Amazon SageMaker and AWS AI services can be used to build, train, and deploy ML models, and add intelligence to applications. Amazon Athena enables interactive querying, analyzing, and processing capabilities.
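As an example of the interactive querying step, the sketch below assembles a payload for the Athena `start_query_execution` boto3 API that aggregates donations per month. The database, table, column names, and results bucket are assumptions for illustration.

```python
# Sketch of an Athena query payload aggregating donations by month.
# Database, table, columns, and the results bucket are placeholders.

def donations_by_month_query(database: str, table: str) -> dict:
    """Return a payload for athena.start_query_execution (boto3)."""
    sql = (
        f"SELECT date_trunc('month', donation_date) AS month, "
        f"SUM(amount) AS total "
        f"FROM {table} GROUP BY 1 ORDER BY 1"
    )
    return {
        "QueryString": sql,
        "QueryExecutionContext": {"Database": database},
        "ResultConfiguration": {
            "OutputLocation": "s3://nonprofit-datalake/athena-results/"
        },
    }

req = donations_by_month_query("nonprofit_db", "donations")
```

The result set could back a QuickSight dataset or feed feature preparation for a SageMaker model such as donation-propensity scoring.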
Step 8
Unified donor/member profile information is stored in Amazon OpenSearch Service.
Step 9
A single donor/member profile view is built with the help of identity resolution data from Amazon Aurora.
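A deliberately naive identity-resolution sketch is shown below: records that share a normalized email address are merged into one profile. A production implementation backed by Aurora match tables would use richer matching rules; the record fields here are assumptions.

```python
from collections import defaultdict

def resolve_identities(records: list) -> list:
    """Naive identity resolution: merge records sharing a normalized
    email into a single profile. Field names ('email', 'source',
    'amount') are illustrative assumptions."""
    groups = defaultdict(list)
    for r in records:
        groups[r["email"].strip().lower()].append(r)
    profiles = []
    for email, recs in groups.items():
        profiles.append({
            "email": email,
            "sources": sorted({r["source"] for r in recs}),
            "total_giving": sum(r.get("amount", 0) for r in recs),
        })
    return profiles
```

Matching on name plus postal address, or on match keys precomputed in Aurora, would follow the same merge-by-group shape.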
Step 10
Amazon API Gateway exposes the unified profile data as APIs that can be consumed as microservices.
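Behind API Gateway, a Lambda proxy integration could serve profile lookups as in the sketch below. The route shape (`GET /profiles/{id}`), the in-memory profile store standing in for an OpenSearch query, and the field names are all assumptions.

```python
import json

# Stand-in for an OpenSearch Service lookup; data and route are illustrative.
PROFILES = {"p-1": {"name": "Ada", "total_giving": 250.0}}

def handler(event: dict, context=None) -> dict:
    """Lambda proxy handler for an assumed GET /profiles/{id} route
    behind API Gateway."""
    pid = (event.get("pathParameters") or {}).get("id")
    profile = PROFILES.get(pid)
    if profile is None:
        return {"statusCode": 404, "body": json.dumps({"error": "not found"})}
    return {"statusCode": 200, "body": json.dumps(profile)}
```

API Gateway would front this handler with an authorizer and throttling, per the Security pillar notes below.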
Step 11
The unified donor/member data is sent to downstream destinations for activation.
Step 12
The enriched data is exported back to the data sources. Based on the destination type, AWS DMS, Amazon AppFlow, or AWS DataSync can be used for the export.
Well-Architected Pillars
The AWS Well-Architected Framework helps you understand the pros and cons of the decisions you make when building systems in the cloud. The six pillars of the Framework allow you to learn architectural best practices for designing and operating reliable, secure, efficient, cost-effective, and sustainable systems. Using the AWS Well-Architected Tool, available at no charge in the AWS Management Console, you can review your workloads against these best practices by answering a set of questions for each pillar.
The architecture diagram above is an example of a Solution created with Well-Architected best practices in mind. To be fully Well-Architected, you should follow as many Well-Architected best practices as possible.
Operational Excellence
The Donor and Member Management Data Analytics Platform (DMMDAP) reference architecture is fully serverless. Your solution can be deployed with infrastructure as code and automation for fast iteration and consistent deployments. Use Amazon CloudWatch for application and Infrastructure monitoring.
Security
Use AWS Lake Formation for unified governance to centrally manage security, access control (at the table, row, and column security levels), and audit trails. Lake Formation also enables automatic schema discovery and conversion to required formats. API Gateway enforces policies that control security aspects such as authentication, authorization, and traffic management.
Reliability
Serverless architecture enables the solution to scale automatically, remain highly available, and deploy across multiple Availability Zones.
Performance Efficiency
By using serverless technologies, you only provision the exact resources you need. To maximize the performance of the DMMDAP solution, test with multiple configurations. Use API Gateway edge-optimized endpoints for geographically dispersed customers, and Regional endpoints for customers concentrated in one Region (and when calling other AWS services within the same Region).
Cost Optimization
By using serverless technologies and automatically scaling, you only pay for the resources you use. Serverless services don’t cost anything while they’re idle.
Sustainability
To minimize environmental impact, the data lake automatically moves infrequently accessed data to cold storage using Amazon S3 lifecycle configurations. By extensively using managed services and dynamic scaling, this architecture minimizes the environmental impact of the backend services.
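The lifecycle tiering mentioned above can be expressed as a payload for the S3 `put_bucket_lifecycle_configuration` boto3 API, as sketched below. The prefix, rule ID, and 90-day threshold are assumptions for the example.

```python
# Sketch of an S3 lifecycle rule moving infrequently accessed data-lake
# objects to cold storage. Prefix, rule ID, and day count are placeholders.

def archive_lifecycle(prefix: str = "raw/", days_to_glacier: int = 90) -> dict:
    """Return a LifecycleConfiguration payload for
    s3.put_bucket_lifecycle_configuration (boto3)."""
    return {
        "Rules": [
            {
                "ID": "archive-raw-data",
                "Status": "Enabled",
                "Filter": {"Prefix": prefix},
                "Transitions": [
                    {"Days": days_to_glacier, "StorageClass": "GLACIER"}
                ],
            }
        ]
    }

cfg = archive_lifecycle()
```

You would apply it with `boto3.client("s3").put_bucket_lifecycle_configuration(Bucket=..., LifecycleConfiguration=cfg)`; intermediate tiers such as STANDARD_IA or INTELLIGENT_TIERING can be added as extra transitions.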
Implementation Resources
A detailed guide is provided to experiment and use within your AWS account. It walks through each stage of the Guidance, including deployment, usage, and cleanup, to prepare you to deploy it.
The sample code is a starting point. It is industry validated, prescriptive but not definitive, and a peek under the hood to help you begin.
Related Content
[Title]
Disclaimer
The sample code; software libraries; command line tools; proofs of concept; templates; or other related technology (including any of the foregoing that are provided by our personnel) is provided to you as AWS Content under the AWS Customer Agreement, or the relevant written agreement between you and AWS (whichever applies). You should not use this AWS Content in your production accounts, or on production or other critical data. You are responsible for testing, securing, and optimizing the AWS Content, such as sample code, as appropriate for production grade use based on your specific quality control practices and standards. Deploying AWS Content may incur AWS charges for creating or using AWS chargeable resources, such as running Amazon EC2 instances or using Amazon S3 storage.
References to third-party services or organizations in this Guidance do not imply an endorsement, sponsorship, or affiliation between Amazon or AWS and the third party. Guidance from AWS is a technical starting point, and you can customize your integration with third-party services when you deploy the architecture.