AWS for Games Blog

How Games24x7 delivers millions of push notifications using AWS

This post is co-authored by Sachin Sangle, Senior Software Engineer; Deepak Mishra, Engineering Manager; Dhiraj Prajapati, Engineering Manager; and Anil Kumar, Senior Engineering Manager, at Games24x7.

Games24x7 is India’s most valuable multi-game platform, fueled by a dedication to blending science with cherished nostalgic games to craft captivating online experiences for over 100 million players. Games24x7 has established itself as a leading platform for skill-based games, offering its application users an immersive experience in games like Rummy, Fantasy Sports, and Poker. Real-time, intuitive player communication has been pivotal to this success and remains a prime focus for sustaining rapid growth, and during major events like the IPL (Indian Premier League) and the Cricket World Cup, push notifications serve as a crucial communication channel. Successful customer communication entails orchestrating campaigns that target different user cohorts with push notifications at large scale within seconds.

Push notifications are the most cost-effective channel for communicating with active app users, with the option of using an image to convey a succinct message. Games24x7 uses push notifications to reach over 80% of its active user base, and the channel accounts for more than 60% of daily revenue in its fantasy cricket platform operations. These notifications serve as reminders, generate intrigue, reinforce the brand, and enhance user convenience.

Customer challenges

Games24x7 was facing a pressing issue: delays in delivering important notifications during critical use cases. For example, a Cricket Toss push notification alerts users and leads them directly to the page or section for optimizing their fantasy teams based on the toss, so they can quickly adjust their teams around the playing team members before the match begins. Because the time window between the toss and the start of the match is very small, Games24x7 recognized the importance of resolving this problem swiftly to keep users engaged and solidify its lead in the fast-paced gaming sector.

Games24x7 encountered challenges with its legacy system, which experienced queueing delays, filtration delays, and complex processing sequences. To address these obstacles efficiently, they built an internal solution: an architecture crafted around a generic model comprising data collection, campaign creation, a data filtration workflow, and an execution workflow. By orchestrating AWS services and streamlining workflows, Games24x7 transformed push notification delivery into an agile powerhouse, engaging audiences with higher precision and speed by sending millions of push notifications within seconds.

Solution overview: Transforming push notification delivery

In response to the challenges Games24x7’s legacy system faced in delivering push notifications effectively, a comprehensive solution was devised. It comprises a well-structured architecture and strategic workflows aimed at streamlining the delivery process while maintaining precision and speed. The solution’s components include:

  1. Data Collection: Gathering information from MongoDB and MySQL, both self-managed on Amazon EC2, ensuring comprehensive data availability.
  2. Campaign Creation: Designing targeted campaigns hyper-personalized to different user cohorts for customized push notifications.
  3. Data Filtration Workflow: Employing AWS services and distributed processing to filter raw data efficiently, enhancing the quality and relevance of notifications.
  4. Execution Workflow: Orchestrating the actual push notification delivery through AWS services like Amazon S3, AWS Lambda and AWS Step Functions.
  5. Scalability and Efficiency: Leveraging serverless architecture for agility, scalability and cost-effectiveness, ensuring seamless delivery of notifications even in high-demand scenarios.
  6. Results and Analysis: Generating consolidated reports from each Lambda invocation to show how many users were targeted as part of this campaign.
  7. Flexibility: A generic model that can accommodate push notifications, SMS messages and emails, adapting to the evolving needs of Games24x7 and addressing potential future challenges.

Architecture that enables Games24x7 to send push notifications to millions of devices

Solution

Games24x7 had an existing push notification service, but it lacked efficiency. The entire process of filtration and notification required 30-35 minutes to deliver notifications to users. Despite attempting optimizations, Games24x7 couldn’t achieve the desired speed for sending push notifications.

Games24x7 re-architected its approach by breaking single steps into multiple ones and aligning AWS services with its specific needs. AWS Step Functions is integrated within the architecture to coordinate extensive parallel workloads for diverse tasks. These workloads streamline the concurrent handling of around 30 GB of data per campaign stored in Amazon S3. Games24x7 processes over 50 campaigns daily, though the number varies from day to day.

To parallelize these workflows, Games24x7 segments the entire user base into chunks and uses a Step Functions Map state in distributed mode. When configured this way, the Map state runs as a distributed map state, which orchestrates the concurrent processing of items in a dataset and is optimized for high-concurrency workloads.
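As an illustration of this pattern (not Games24x7’s actual definition), a minimal sketch of such a workflow created with the AWS SDK for Java v2 might look like the following; the state machine name, IAM role, Lambda ARN, bucket, prefix, and concurrency value are hypothetical placeholders:

```java
import software.amazon.awssdk.services.sfn.SfnClient;
import software.amazon.awssdk.services.sfn.model.CreateStateMachineRequest;
import software.amazon.awssdk.services.sfn.model.CreateStateMachineResponse;

public class CreateFiltrationWorkflow {
    public static void main(String[] args) {
        // Amazon States Language definition with a Map state in DISTRIBUTED mode.
        // The ItemReader lists objects under an S3 prefix; each object is handed
        // to a child workflow execution that invokes the filtering Lambda function.
        String definition = """
            {
              "StartAt": "FilterChunks",
              "States": {
                "FilterChunks": {
                  "Type": "Map",
                  "ItemReader": {
                    "Resource": "arn:aws:states:::s3:listObjectsV2",
                    "Parameters": {
                      "Bucket": "example-raw-data-bucket",
                      "Prefix": "campaigns/campaign-001/raw/"
                    }
                  },
                  "ItemProcessor": {
                    "ProcessorConfig": { "Mode": "DISTRIBUTED", "ExecutionType": "STANDARD" },
                    "StartAt": "FilterChunk",
                    "States": {
                      "FilterChunk": {
                        "Type": "Task",
                        "Resource": "arn:aws:lambda:ap-south-1:111122223333:function:filter-chunk",
                        "End": true
                      }
                    }
                  },
                  "MaxConcurrency": 500,
                  "End": true
                }
              }
            }
            """;

        try (SfnClient sfn = SfnClient.create()) {
            CreateStateMachineResponse response = sfn.createStateMachine(
                CreateStateMachineRequest.builder()
                    .name("data-filtration-workflow")
                    .roleArn("arn:aws:iam::111122223333:role/example-sfn-role")
                    .definition(definition)
                    .build());
            System.out.println("Created state machine: " + response.stateMachineArn());
        }
    }
}
```

The ItemReader lists the raw data objects under the given prefix, and the distributed map fans each object out to its own child workflow execution, which is what allows thousands of chunks to be processed concurrently.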

By leveraging AWS Step Functions Distributed Map, Games24x7 reduced the overall time to deliver notifications to end users, as shown in the following comparison.

Architecture | User base | Data filtration | Execution
Legacy Architecture | ~30 million | 15 to 20 min | ~15 min
New Architecture using AWS Step Functions and AWS Lambda | ~30 million+ | ~2 min | ~2.1 min

The detailed steps are as follows:

  1. Data collection

Data collection workflow from MongoDB and BI System

  • The data collection phase gathers information from diverse sources within the organization.
  • Queries are constructed to extract user-specific attributes from the source database using a data exporter tool (a JDBC-based sketch of such an exporter follows this list).
  • The extracted information is then exported into a CSV file for further use or analysis.
  • The data exported from the database has the following format: 1234, messaging id, user attr1, user attr2...n
  • Since this data is in its raw form, it is divided into multiple chunks before streaming it into the Amazon S3 bucket, for example: aws s3 cp "s3://$bucketname/$filename" - | split -d -l "$splitsize" --filter "aws s3 cp - \"s3://$bucketname/$filename.\$FILE\""
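As an illustration only (the actual data exporter tool is internal), a minimal JDBC-based sketch of exporting user attributes from MySQL to a CSV file might look like this; the connection details, table, and column names are hypothetical:

```java
import java.io.BufferedWriter;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;

public class UserAttributeExporter {
    public static void main(String[] args) throws SQLException, IOException {
        // Hypothetical connection string and query; the real exporter tool and schema differ.
        String url = "jdbc:mysql://example-host:3306/users_db";
        String query = "SELECT user_id, messaging_id, attr1, attr2 FROM user_attributes";

        try (Connection conn = DriverManager.getConnection(url, "reader", "secret");
             Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery(query);
             BufferedWriter out = Files.newBufferedWriter(Path.of("user_attributes.csv"))) {

            // Write one CSV row per user: user id, messaging id, then the user attributes.
            while (rs.next()) {
                out.write(String.join(",",
                        rs.getString("user_id"),
                        rs.getString("messaging_id"),
                        rs.getString("attr1"),
                        rs.getString("attr2")));
                out.newLine();
            }
        }
    }
}
```

The resulting CSV matches the format described above and can then be split and streamed into Amazon S3 with the command shown earlier.
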
  2. Data filtration workflow

Data Filtration workflow using Step Function with Distributed Map

  • The data preparation workflow can be activated either by sending a request to the Amazon API Gateway endpoint or by scheduling automatic execution via the Amazon EventBridge scheduler (a sketch of starting the workflow programmatically follows this list).
  • As part of the data filtering process, Games24x7 established an AWS Step Functions workflow that iterates through an Amazon S3 bucket housing raw data files. This workflow invokes multiple AWS Lambda functions, each handling a data chunk for filtration. Here, an AWS Step Functions distributed map is used to parallelize the data processing efficiently.
  • Games24x7 leverages Amazon S3 Select to extract specific data from the raw dataset through filtering, with each AWS Lambda invocation querying its own distinct chunk of data.
  • Streaming the result to the local file system expedites the process; upon completion, the filtered file is uploaded to the Amazon S3 bucket.
  • The invocation phase is also vital: Games24x7 ensures balanced distribution by breaking the resulting file into multiple chunks if it exceeds ‘X’ records, which keeps the workload even and the processing time reasonable for each data chunk.
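For illustration, a minimal sketch of starting this workflow programmatically with the AWS SDK for Java v2 might look like the following; the state machine ARN and the shape of the input payload (the bucket and prefix of the raw data) are assumptions for this sketch rather than the actual contract:

```java
import software.amazon.awssdk.services.sfn.SfnClient;
import software.amazon.awssdk.services.sfn.model.StartExecutionRequest;
import software.amazon.awssdk.services.sfn.model.StartExecutionResponse;

public class StartFiltrationRun {
    public static void main(String[] args) {
        // Hypothetical input telling the workflow where the raw campaign data lives.
        String input = """
            {
              "bucket": "example-raw-data-bucket",
              "prefix": "campaigns/campaign-001/raw/"
            }
            """;

        try (SfnClient sfn = SfnClient.create()) {
            StartExecutionResponse response = sfn.startExecution(
                StartExecutionRequest.builder()
                    .stateMachineArn("arn:aws:states:ap-south-1:111122223333:stateMachine:data-filtration-workflow")
                    .name("campaign-001-" + System.currentTimeMillis())
                    .input(input)
                    .build());
            System.out.println("Started execution: " + response.executionArn());
        }
    }
}
```

In practice this call would sit behind the Amazon API Gateway endpoint or be made on a schedule by Amazon EventBridge rather than being run by hand.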

Query CSV file from S3 bucket

The linked Java code demonstrates the process of querying a CSV file and extracting the user cohort.
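Since that code is only linked rather than reproduced here, the following is a minimal sketch of the same idea using Amazon S3 Select with the AWS SDK for Java v1; the bucket, object key, SQL expression, and cohort condition are hypothetical:

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.nio.charset.StandardCharsets;

import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;
import com.amazonaws.services.s3.model.CSVInput;
import com.amazonaws.services.s3.model.CSVOutput;
import com.amazonaws.services.s3.model.ExpressionType;
import com.amazonaws.services.s3.model.FileHeaderInfo;
import com.amazonaws.services.s3.model.InputSerialization;
import com.amazonaws.services.s3.model.OutputSerialization;
import com.amazonaws.services.s3.model.SelectObjectContentRequest;
import com.amazonaws.services.s3.model.SelectObjectContentResult;

public class UserCohortQuery {
    public static void main(String[] args) throws IOException {
        AmazonS3 s3 = AmazonS3ClientBuilder.defaultClient();

        // S3 Select request that filters the raw CSV chunk server-side, returning only
        // the rows for the targeted cohort. Columns are positional because the raw
        // export has no header row: _1 = user id, _2 = messaging id, _3 = first attribute.
        SelectObjectContentRequest request = new SelectObjectContentRequest()
                .withBucketName("example-raw-data-bucket")
                .withKey("campaigns/campaign-001/raw/user_attributes.csv.00")
                .withExpression("SELECT s._1, s._2 FROM S3Object s WHERE s._3 = 'cricket_fan'")
                .withExpressionType(ExpressionType.SQL)
                .withInputSerialization(new InputSerialization()
                        .withCsv(new CSVInput().withFileHeaderInfo(FileHeaderInfo.NONE)))
                .withOutputSerialization(new OutputSerialization().withCsv(new CSVOutput()));

        // Stream the filtered rows; in the real workflow they would be written to a
        // local file and then uploaded back to Amazon S3 for the execution workflow.
        try (SelectObjectContentResult result = s3.selectObjectContent(request);
             BufferedReader reader = new BufferedReader(new InputStreamReader(
                     result.getPayload().getRecordsInputStream(), StandardCharsets.UTF_8))) {
            String line;
            while ((line = reader.readLine()) != null) {
                System.out.println(line);
            }
        }
    }
}
```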

  3. Data execution workflow

Data Execution workflow for push notification, SMS messages and emails

  • This workflow involves the actual sending of push notifications, SMS messages, and emails.
  • This execution workflow can be initiated either through an Amazon API Gateway endpoint or through a scheduling mechanism using the Amazon EventBridge scheduler.
  • This workflow requires specific inputs, such as the location of the filtered data and the notification payload in a particular format. The filtered data can be the split files prepared in the previous workflow, or a file generated by an external system.
  • The workflow initiates a Step Functions state machine, which iterates through an S3 bucket to select relevant data files. Each selected data file is then distributed to an individual AWS Lambda function using an AWS Step Functions distributed map.
  • Every Lambda function operates as a worker, with customizable capacity to ensure it adheres to the concurrency limit (a sketch of such a worker follows this list).
  • After completing its assigned work chunk, each worker (Lambda) is responsible for generating a report and pushing it to an S3 bucket.
  • Once all the workers have completed their assigned work chunks and generated their reports, the reporting data is consolidated into a single file using an AWS Lambda function.
  • Finally, the consolidated report file is uploaded to an S3 bucket and a pre-signed URL is generated for downloading the report for business analysis.
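As an illustration only, a minimal sketch of such a worker Lambda function in Java might look like the following. The event shape, the NotificationSender interface, and the bucket names are hypothetical placeholders; the actual push delivery integration is not shown:

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.nio.charset.StandardCharsets;
import java.util.Map;

import com.amazonaws.services.lambda.runtime.Context;
import com.amazonaws.services.lambda.runtime.RequestHandler;
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;

public class PushWorkerHandler implements RequestHandler<Map<String, String>, String> {

    // Hypothetical abstraction over the actual push notification provider.
    public interface NotificationSender {
        void send(String messagingId, String payload);
    }

    private final AmazonS3 s3 = AmazonS3ClientBuilder.defaultClient();

    private final NotificationSender sender = (messagingId, payload) -> {
        // Placeholder: integrate with the real push provider here.
    };

    @Override
    public String handleRequest(Map<String, String> event, Context context) {
        // Assumes the distributed map passes the bucket and key of one filtered chunk,
        // plus the campaign payload, to each worker invocation.
        String bucket = event.get("bucket");
        String key = event.get("key");
        String payload = event.get("payload");

        int sent = 0;
        try (BufferedReader reader = new BufferedReader(new InputStreamReader(
                s3.getObject(bucket, key).getObjectContent(), StandardCharsets.UTF_8))) {
            String line;
            while ((line = reader.readLine()) != null) {
                // Each row: user id, messaging id, attributes...
                String messagingId = line.split(",")[1].trim();
                sender.send(messagingId, payload);
                sent++;
            }
        } catch (Exception e) {
            throw new RuntimeException("Failed to process chunk " + key, e);
        }

        // Per-worker report pushed to S3 for later consolidation.
        String reportKey = "reports/" + key + ".report";
        s3.putObject("example-report-bucket", reportKey, "sent=" + sent);
        return reportKey;
    }
}
```

The consolidation Lambda function described above would then read these per-worker report objects, merge them into a single file, and generate a pre-signed URL for business analysis using the standard S3 API.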

Conclusion

With this architecture model, Games24x7 can now send over 30 million push notifications within 4-5 minutes and run large campaigns multiple times a day. In addition, the entire solution is serverless, which means there is no maintenance overhead or additional resource cost when campaigns are not running.

Additionally, each step of this solution is designed in a generic manner, which allows Games24x7 to use the output of a previous state, or any external file, as an input for the ingestion pipeline. Each phase of the solution can also be used separately to address specific requirements. For example, the data preparation phase can be used to filter and preprocess large data sets, whereas the notification phase can be used to send push notifications or other types of alerts to users.
