Personalize nonprofit direct mail campaigns to increase response rates and engage new donors
This Guidance helps nonprofit organizations use personalization in direct mailing efforts to improve the member experience. Personalization helps ensure any calls to action are timely and tailored to the right audience, which can increase overall return on investment for a nonprofit organization. The services in this Guidance orchestrate an automated workflow, through which the program administrator (or person responsible for direct mailings) can pull the necessary member data, leverage a personalization model powered by machine learning (ML), and generate customized mailings for members.
Please note: See the Disclaimer section at the end of this Guidance.
Architecture Diagram
[Architecture diagram description]
Step 1
The program administrator collects member data into a data table from multiple data sources across the enterprise, including customer relationship management (CRM) and payment systems. Member interaction data from prior programs and online content supplies the history needed to train an ML model.
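As a concrete illustration of this step, the following is a minimal sketch that merges hypothetical CRM and payment exports into a single interactions table shaped for Amazon Personalize. The file names and column names are assumptions, not part of this Guidance.

```python
# Sketch only: combine hypothetical CRM and payment exports into one member
# interactions table. File names and columns are illustrative assumptions.
import pandas as pd

crm = pd.read_csv("crm_members.csv")            # e.g., member_id, name, email, join_date
payments = pd.read_csv("payment_history.csv")   # e.g., member_id, campaign_id, amount, timestamp

# One row per member interaction, which is the shape an Amazon Personalize
# interactions dataset expects (USER_ID, ITEM_ID, TIMESTAMP).
interactions = payments.merge(crm[["member_id"]], on="member_id", how="inner")
interactions = interactions.rename(columns={
    "member_id": "USER_ID",
    "campaign_id": "ITEM_ID",
    "timestamp": "TIMESTAMP",
})[["USER_ID", "ITEM_ID", "TIMESTAMP"]]

interactions.to_csv("member_interactions.csv", index=False)
```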
Step 2
The program administrator uploads the table of collected data to an upload utility hosted in AWS Amplify.
Step 3
The upload utility authenticates the program administrator’s credentials using Amazon Cognito, which returns an AWS Identity and Access Management (IAM) role to access AWS resources.
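The following is a minimal sketch of that credential exchange, assuming an Amazon Cognito user pool federated through an identity pool. All IDs and the token value are placeholders.

```python
# Sketch only: exchange a Cognito user pool ID token for temporary AWS
# credentials through a Cognito identity pool. IDs and token are placeholders.
import boto3

identity = boto3.client("cognito-identity", region_name="us-east-1")
login_provider = "cognito-idp.us-east-1.amazonaws.com/us-east-1_EXAMPLE"  # user pool
id_token = "<ID_TOKEN_FROM_SIGN_IN>"

identity_id = identity.get_id(
    IdentityPoolId="us-east-1:00000000-0000-0000-0000-000000000000",
    Logins={login_provider: id_token},
)["IdentityId"]

creds = identity.get_credentials_for_identity(
    IdentityId=identity_id,
    Logins={login_provider: id_token},
)["Credentials"]
# creds now holds AccessKeyId, SecretKey, and SessionToken scoped by the IAM
# role associated with authenticated users in the identity pool.
```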
Step 4
The upload utility saves the data table to an Amazon Simple Storage Service (Amazon S3) bucket.
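A minimal sketch of the upload, assuming an illustrative bucket name and prefix; in the deployed utility, the temporary Cognito credentials from the previous step would back the S3 client.

```python
# Sketch only: save the collected data table to Amazon S3.
# Bucket and key names are assumptions.
import boto3

# In the upload utility, this client would be created with the temporary
# credentials returned by Amazon Cognito in Step 3.
s3 = boto3.client("s3")

s3.upload_file(
    Filename="member_interactions.csv",
    Bucket="example-direct-mail-input-bucket",
    Key="input/member_interactions.csv",
)
```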
Step 5
Adding the data table to the S3 bucket generates an event that automatically starts an AWS Step Functions workflow. The workflow extracts data from the table, uses it to build an ML model with Amazon Personalize, and writes the generated predictions back into the table.
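A minimal sketch of the event-driven trigger, assuming the S3 bucket sends object-created notifications directly to an AWS Lambda function that starts the workflow; the state machine ARN is a placeholder.

```python
# Sketch only: a Lambda handler that reacts to the S3 object-created event and
# starts the Step Functions workflow. The state machine ARN is an assumption.
import json
import boto3

sfn = boto3.client("stepfunctions")
STATE_MACHINE_ARN = "arn:aws:states:us-east-1:111122223333:stateMachine:DirectMailPersonalization"

def handler(event, context):
    record = event["Records"][0]
    bucket = record["s3"]["bucket"]["name"]
    key = record["s3"]["object"]["key"]

    # Pass the uploaded table's location to the workflow, which preprocesses
    # the data, runs an Amazon Personalize batch job, and writes the
    # predictions back to Amazon S3.
    sfn.start_execution(
        stateMachineArn=STATE_MACHINE_ARN,
        input=json.dumps({"bucket": bucket, "key": key}),
    )
```

Within the workflow, one of the states would typically call the Amazon Personalize CreateBatchInferenceJob API so that recommendations for the whole member list are generated in a single batch.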
Step 6
The updated data table is saved to Amazon S3. Amazon Simple Notification Service (Amazon SNS) sends an email to the program administrator to alert them that the new file is available. The email includes a presigned URL, which lets the program administrator easily download the new table. The new table can be used in a mail merge or sent to a third party to generate customized direct mail pieces.
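A minimal sketch of the notification step, assuming an SNS topic subscribed to the program administrator's email address; the bucket, key, and topic ARN are placeholders.

```python
# Sketch only: create a time-limited presigned URL for the updated table and
# email it through an SNS topic. Bucket, key, and topic ARN are placeholders.
import boto3

s3 = boto3.client("s3")
sns = boto3.client("sns")

url = s3.generate_presigned_url(
    "get_object",
    Params={
        "Bucket": "example-direct-mail-output-bucket",
        "Key": "output/member_interactions_with_predictions.csv",
    },
    ExpiresIn=86400,  # link stays valid for 24 hours
)

sns.publish(
    TopicArn="arn:aws:sns:us-east-1:111122223333:DirectMailNotifications",
    Subject="Your personalized direct mail table is ready",
    Message=f"Download the updated table here: {url}",
)
```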
Well-Architected Pillars
The AWS Well-Architected Framework helps you understand the pros and cons of the decisions you make when building systems in the cloud. The six pillars of the Framework allow you to learn architectural best practices for designing and operating reliable, secure, efficient, cost-effective, and sustainable systems. Using the AWS Well-Architected Tool, available at no charge in the AWS Management Console, you can review your workloads against these best practices by answering a set of questions for each pillar.
The architecture diagram above is an example of a Solution created with Well-Architected best practices in mind. To be fully Well-Architected, you should follow as many Well-Architected best practices as possible.
Operational Excellence
AWS CloudFormation allows you to deploy this Guidance as infrastructure as code. You can automate this Guidance for fast iteration and consistent deployments, and you can modify the CloudFormation template to best suit your needs and workload requirements. Additionally, Step Functions provides a reliable way to coordinate between the preprocessing and postprocessing steps so you can examine each step and automate the manual tasks of cleaning and normalizing data.
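For illustration, a minimal sketch of deploying the stack programmatically, assuming a local copy of the template; the stack name and file name are placeholders.

```python
# Sketch only: deploy the Guidance's CloudFormation template as infrastructure
# as code. The stack name and template location are assumptions.
import boto3

cfn = boto3.client("cloudformation")

with open("direct-mail-personalization.template.yaml") as f:
    template_body = f.read()

cfn.create_stack(
    StackName="direct-mail-personalization",
    TemplateBody=template_body,
    Capabilities=["CAPABILITY_NAMED_IAM"],  # the stack creates IAM roles
)

# Wait for the deployment to finish before using the resources.
cfn.get_waiter("stack_create_complete").wait(StackName="direct-mail-personalization")
```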
Security
This Guidance uses Amazon Cognito to provide secure, federated user authentication so you can be sure that any users with access to member data are authenticated. You can use Amazon Cognito with your existing user directories or third-party identity providers. This Guidance lets you set up IAM roles and policies to limit access by the least privilege needed to implement the Guidance workflow. Additionally, Amazon S3 encrypts your data at rest using keys that you can manage.
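For illustration, a minimal sketch of scoping the upload role to a single S3 prefix with an inline least-privilege policy; the role, policy, and bucket names are placeholders.

```python
# Sketch only: attach a least-privilege inline policy that limits the upload
# role to writing a single prefix in the input bucket. Names are illustrative.
import json
import boto3

iam = boto3.client("iam")

policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Action": ["s3:PutObject"],
        "Resource": "arn:aws:s3:::example-direct-mail-input-bucket/input/*",
    }],
}

iam.put_role_policy(
    RoleName="DirectMailUploadRole",
    PolicyName="UploadInputPrefixOnly",
    PolicyDocument=json.dumps(policy),
)
```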
Reliability
This Guidance uses a serverless architecture that scales automatically and is highly available, which enhances reliability. All services used in this architecture span multiple Availability Zones within an AWS Region. Additionally, AWS Amplify Hosting and Amazon Personalize are managed services, so you don’t have to create virtual private clouds or compute resources. Managed services also handle updates and scale to meet demand.
Performance Efficiency
The resources used in this Guidance are active only while data is being processed. Using Amazon Personalize in batch mode helps you save on costs and running time. Additionally, AWS Lambda and Amplify consume resources only when they are actively processing a workflow.
Cost Optimization
This Guidance uses serverless technologies that can automatically scale, so you only pay for the resources you use and don’t need to pay for idle infrastructure. Additionally, Amazon Personalize is purpose-built for generating recommendations, so you can make the predictions without needing to pay for a custom ML model.
Sustainability
This Guidance uses managed services and dynamic scaling to help ensure that resources are used only when needed, minimizing environmental impact. For example, managed services like Amazon Personalize and Amplify Hosting don’t require you to provision underlying infrastructure, and they scale down automatically when not in use.
Implementation Resources
A detailed guide is provided for experimenting with and using this Guidance in your AWS account. It walks through each stage of working with the Guidance, including deployment, usage, and cleanup.
The sample code is a starting point. It is industry validated, prescriptive but not definitive, and a peek under the hood to help you begin.
Related Content
[Title]
Disclaimer
The sample code; software libraries; command line tools; proofs of concept; templates; or other related technology (including any of the foregoing that are provided by our personnel) is provided to you as AWS Content under the AWS Customer Agreement, or the relevant written agreement between you and AWS (whichever applies). You should not use this AWS Content in your production accounts, or on production or other critical data. You are responsible for testing, securing, and optimizing the AWS Content, such as sample code, as appropriate for production grade use based on your specific quality control practices and standards. Deploying AWS Content may incur AWS charges for creating or using AWS chargeable resources, such as running Amazon EC2 instances or using Amazon S3 storage.
References to third-party services or organizations in this Guidance do not imply an endorsement, sponsorship, or affiliation between Amazon or AWS and the third party. Guidance from AWS is a technical starting point, and you can customize your integration with third-party services when you deploy the architecture.