This Guidance demonstrates how nonprofits can personalize experiences across the fundraising lifecycle, determining the right amount to solicit, the right frequency of engagement, and the right outreach channel for each donor, using an AWS data lake and other AWS services.
Data is collected from multiple sources: online (web activity from websites, social media platforms, email, and online campaigns) and offline (purchase history and subscriptions, primarily from CRM systems and third-party data).
Depending on the data source, AWS Database Migration Service (AWS DMS), AWS DataSync, Amazon Kinesis, or Amazon AppFlow is used to ingest the data into a data lake on AWS.
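For streaming web activity, events are typically packaged as Kinesis records keyed per donor. The sketch below is a minimal, hypothetical example (the `donor_id` field, event shape, and stream name are assumptions, not part of this Guidance); it builds the record locally, and a comment shows where a boto3 call would send it.

```python
import hashlib
import json


def build_kinesis_record(event: dict) -> dict:
    """Package a clickstream event as a Kinesis record.

    The hypothetical donor_id field is hashed into the partition key so
    that all of one donor's events land on the same shard, preserving
    per-donor ordering.
    """
    partition_key = hashlib.sha256(str(event["donor_id"]).encode()).hexdigest()
    return {
        "Data": json.dumps(event).encode("utf-8"),
        "PartitionKey": partition_key,
        # With boto3, this dict would be sent as:
        #   kinesis.put_record(StreamName="donor-web-activity", **record)
    }


record = build_kinesis_record(
    {"donor_id": 42, "page": "/donate", "ts": 1700000000}
)
```

Hashing the donor ID (rather than using it directly) keeps partition keys uniformly distributed across shards even if donor IDs are sequential.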
AWS Lake Formation is used to build the scalable data lake, with Amazon Simple Storage Service (Amazon S3) as its storage layer. Lake Formation also provides unified governance to centrally manage security, access control (table-, row-, and column-level security), and audit trails, and it enables automatic schema discovery and conversion to the required formats.
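A column-level grant in Lake Formation can be expressed as a permissions request. The sketch below builds such a request as a plain dict, following the shape of the Lake Formation `GrantPermissions` API; the database, table, column, and role names are hypothetical placeholders.

```python
def column_grant(principal_arn: str, database: str, table: str,
                 columns: list) -> dict:
    """Build a Lake Formation column-level SELECT grant.

    With boto3, this dict would be passed as:
        lakeformation.grant_permissions(**grant)
    """
    return {
        "Principal": {
            "DataLakePrincipal": {"DataLakePrincipalIdentifier": principal_arn}
        },
        "Resource": {
            "TableWithColumns": {
                "DatabaseName": database,
                "Name": table,
                "ColumnNames": columns,  # only these columns are visible
            }
        },
        "Permissions": ["SELECT"],
    }


# Hypothetical analyst role limited to non-sensitive donation columns.
grant = column_grant(
    "arn:aws:iam::111122223333:role/AnalystRole",
    "donor_lake",
    "donations",
    ["donor_id", "amount", "channel"],
)
```

Restricting `ColumnNames` in the grant is how column-level security is enforced: queries from the analyst role simply cannot see columns outside the list.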
AWS Glue is used to extract, transform, catalog, and load data across multiple data stores. AWS Lambda is used for data enrichment and validation.
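The enrichment and validation step can be sketched as a Lambda handler. The record fields, segment thresholds, and rejection behavior below are illustrative assumptions, not part of this Guidance.

```python
import json

REQUIRED = {"donor_id", "event_type", "timestamp"}


def handler(event, context):
    """Hypothetical Lambda handler: validate and enrich donation events."""
    record = event if isinstance(event, dict) else json.loads(event)

    # Validation: reject records missing required fields.
    missing = REQUIRED - record.keys()
    if missing:
        return {"status": "rejected", "missing": sorted(missing)}

    # Enrichment: derive a coarse engagement segment from the amount
    # (the $1,000 "major donor" threshold is an assumption).
    amount = float(record.get("amount", 0))
    if amount >= 1000:
        record["segment"] = "major"
    elif record.get("is_recurring"):
        record["segment"] = "recurring"
    else:
        record["segment"] = "standard"
    return {"status": "accepted", "record": record}


out = handler(
    {"donor_id": 7, "event_type": "donation",
     "timestamp": 1700000000, "amount": 50},
    None,
)
```

Returning a status rather than raising lets a downstream step route rejected records to a dead-letter queue for review.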
AWS machine learning (ML) services such as Amazon Personalize and Amazon Forecast process the enriched datasets in Amazon S3. Amazon Personalize can be used to build an outreach-channel personalization model. The output contains recommendations for the right donation amount, frequency, cause, and outreach channel.
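An Amazon Personalize interactions dataset requires at minimum USER_ID, ITEM_ID, and TIMESTAMP columns. The sketch below formats donor engagement events into that layout, using the outreach channel as the item (an assumption for this channel-recommendation model; the event fields are hypothetical).

```python
import csv
import io


def to_interactions_csv(events: list) -> str:
    """Format engagement events as a Personalize interactions CSV.

    ITEM_ID here is the outreach channel the donor responded to,
    so the trained model recommends channels per donor.
    """
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(["USER_ID", "ITEM_ID", "TIMESTAMP"])
    for e in events:
        writer.writerow([e["donor_id"], e["channel"], e["timestamp"]])
    return buf.getvalue()


csv_text = to_interactions_csv([
    {"donor_id": "d-001", "channel": "email", "timestamp": 1700000000},
    {"donor_id": "d-002", "channel": "sms", "timestamp": 1700000100},
])
```

The resulting file would be uploaded to Amazon S3 and imported into the Personalize dataset via a dataset import job.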
AWS Lambda is used to extract the personalized datasets and forecasted donations from Amazon Personalize and Amazon Forecast.
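Inside that Lambda function, the Personalize `GetRecommendations` response can be reduced to the channels to act on. The response shape below (`itemList` of `itemId`/`score` entries) follows the Personalize API; the sample scores and channel names are made up for illustration.

```python
def top_channels(response: dict, k: int = 3) -> list:
    """Extract the top-k recommended items from a Personalize
    GetRecommendations-style response."""
    items = sorted(
        response.get("itemList", []),
        key=lambda i: i.get("score", 0.0),
        reverse=True,
    )
    return [i["itemId"] for i in items[:k]]


# Hypothetical response from personalize_runtime.get_recommendations(...)
sample = {"itemList": [
    {"itemId": "email", "score": 0.61},
    {"itemId": "sms", "score": 0.27},
    {"itemId": "phone", "score": 0.12},
]}
channels = top_channels(sample, k=2)
```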
Amazon Connect is used to make personalized outbound calls to donors and members based on the generated data. Amazon Pinpoint is used to create campaigns and perform personalized outreach via SMS and email to donors and members based on the data generated by Amazon Personalize and Amazon Forecast.
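For a single transactional SMS, the Pinpoint `SendMessages` request can be sketched as below. The request shape follows the Pinpoint API; the phone number and message body are placeholders, and in practice the channel and content would come from the Personalize output.

```python
def pinpoint_sms_request(phone: str, body: str) -> dict:
    """Build a Pinpoint SendMessages-style request for one SMS.

    With boto3, this dict would be passed as:
        pinpoint.send_messages(ApplicationId=app_id,
                               MessageRequest=req)
    """
    return {
        "Addresses": {phone: {"ChannelType": "SMS"}},
        "MessageConfiguration": {
            "SMSMessage": {
                "Body": body,
                "MessageType": "TRANSACTIONAL",
            }
        },
    }


req = pinpoint_sms_request(
    "+15555550100",
    "Your $50 monthly gift funds 10 meals. Renew today?",
)
```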
Amazon QuickSight provides ML-powered business intelligence visualizations and dashboards that show key metrics such as donation impact, progress toward fundraising goals, and more. Amazon Athena enables interactive querying and analysis of the data in Amazon S3.
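An Athena query over the data lake can be expressed as parameters for `StartQueryExecution`. The parameter names below follow the Athena API; the SQL, database name, and results bucket are hypothetical examples for this architecture.

```python
def athena_query(sql: str, database: str, output_s3: str) -> dict:
    """Build parameters for an Athena StartQueryExecution call.

    With boto3, this dict would be passed as:
        athena.start_query_execution(**params)
    """
    return {
        "QueryString": sql,
        "QueryExecutionContext": {"Database": database},
        "ResultConfiguration": {"OutputLocation": output_s3},
    }


# Hypothetical per-channel donation rollup feeding a QuickSight dashboard.
params = athena_query(
    "SELECT channel, SUM(amount) AS total "
    "FROM donations GROUP BY channel",
    "donor_lake",
    "s3://example-athena-results/",
)
```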
The AWS Well-Architected Framework helps you understand the pros and cons of the decisions you make when building systems in the cloud. The six pillars of the Framework allow you to learn architectural best practices for designing and operating reliable, secure, efficient, cost-effective, and sustainable systems. Using the AWS Well-Architected Tool, available at no charge in the AWS Management Console, you can review your workloads against these best practices by answering a set of questions for each pillar.
The architecture diagram above is an example of a Solution created with Well-Architected best practices in mind. To be fully Well-Architected, you should follow as many Well-Architected best practices as possible.
The Personalized Engagement with Donors/Members Platform (PEDMP) reference architecture uses fully managed AWS services. The solution can be deployed with infrastructure as code and automation for fast iteration and consistent deployments. The platform can be monitored using Amazon CloudWatch and audited using AWS CloudTrail logs.
Use Lake Formation for unified governance to centrally manage security, access control (at the table, row, and column level), and audit trails for the data. Lake Formation also enables automatic schema discovery and conversion to the required formats.
By leveraging fully managed AWS services, the solution scales automatically, remains highly available, and deploys resources across multiple Availability Zones.
By using fully managed AWS services, you provision and use only the resources you need. To maximize the performance of the PEDMP solution, run periodic stress and performance tests with varying volumes of data.
By using fully managed AWS services that scale automatically based on demand, you pay only for the resources you use. With QuickSight's pay-per-use pricing and serverless architecture, this platform delivers value to the business faster and at lower cost than comparable visualization tools and services.
To minimize environmental impact, the data lake automatically moves infrequently accessed data to cold storage using Amazon S3 lifecycle configurations. By using fully managed AWS services and dynamic scaling, this architecture minimizes the environmental impact of the backend services.
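Such a lifecycle configuration can be sketched as a dict following the S3 `PutBucketLifecycleConfiguration` shape. The `raw/` prefix, transition days, and storage classes below are illustrative assumptions, not prescribed by this Guidance.

```python
# With boto3, this would be applied via:
#   s3.put_bucket_lifecycle_configuration(
#       Bucket=bucket, LifecycleConfiguration=lifecycle)
lifecycle = {
    "Rules": [
        {
            "ID": "tier-raw-data",
            "Filter": {"Prefix": "raw/"},  # hypothetical data-lake prefix
            "Status": "Enabled",
            "Transitions": [
                # Infrequently accessed after ~3 months, archival after a year.
                {"Days": 90, "StorageClass": "STANDARD_IA"},
                {"Days": 365, "StorageClass": "GLACIER"},
            ],
        }
    ]
}
```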
Start building with this sample code. Learn how to automate business processes that currently rely on manual input and intervention across various file types and formats.
The sample code; software libraries; command line tools; proofs of concept; templates; or other related technology (including any of the foregoing that are provided by our personnel) is provided to you as AWS Content under the AWS Customer Agreement, or the relevant written agreement between you and AWS (whichever applies). You should not use this AWS Content in your production accounts, or on production or other critical data. You are responsible for testing, securing, and optimizing the AWS Content, such as sample code, as appropriate for production grade use based on your specific quality control practices and standards. Deploying AWS Content may incur AWS charges for creating or using AWS chargeable resources, such as running Amazon EC2 instances or using Amazon S3 storage.