Overview
This Guidance demonstrates how to implement an automated machine learning (ML) pipeline that delivers real-time, AI-powered player insights, helping studios better understand player behavior and improve the overall game experience. Game studios can use this low-code solution to quickly build, train, and deploy high-quality models that predict player behavior from their own gameplay data. Operators simply upload player data to Amazon Simple Storage Service (Amazon S3), which invokes an end-to-end workflow that extracts insights, selects algorithms, tunes hyperparameters, evaluates candidate models, and deploys the best-performing model behind a prediction API. The process requires no manual machine learning work, and the resulting real-time predictions give studios insight into individual player retention, engagement, and monetization, informing data-driven decisions that improve gameplay.
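The two operator-facing touchpoints described above are uploading gameplay data to Amazon S3 and querying the deployed prediction API. The sketch below illustrates both with boto3; the bucket name, object key, endpoint name, and CSV feature layout are illustrative assumptions, not values defined by this Guidance.

```python
"""Illustrative sketch of the operator workflow: upload gameplay data to S3
(which triggers the automated ML pipeline) and later query the deployed
prediction endpoint. Bucket, key, endpoint name, and feature columns are
placeholder assumptions, not values from this Guidance."""
import boto3

s3 = boto3.client("s3")
runtime = boto3.client("sagemaker-runtime")

# 1. Upload a batch of player gameplay data; the S3 event starts the ML workflow.
s3.upload_file(
    Filename="player_events.csv",            # local export of gameplay telemetry
    Bucket="example-player-insights-data",   # placeholder bucket name
    Key="training/player_events.csv",        # placeholder prefix watched by the pipeline
)

# 2. Once the best model is deployed, request a real-time prediction for one player.
#    The CSV row must match the feature columns the model was trained on.
response = runtime.invoke_endpoint(
    EndpointName="player-insights-endpoint",  # placeholder endpoint name
    ContentType="text/csv",
    Body="42,3,17.5,0.8",  # example feature row: sessions, purchases, hours, ...
)
print(response["Body"].read().decode("utf-8"))  # e.g. a predicted churn probability
```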
How it works
The architecture diagram illustrates how to use this solution effectively. It shows the key components and their interactions, walking through the architecture's structure and functionality step by step.
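At the heart of the workflow is the automated model-building step that selects algorithms, tunes hyperparameters, and evaluates candidates. One plausible way to implement that step is with a SageMaker Autopilot job, sketched below; the job name, S3 paths, target column, and IAM role are placeholder assumptions, and the Guidance's actual orchestration may differ.

```python
"""Hedged sketch of an automated model-building step using SageMaker Autopilot.
Job name, S3 locations, the target column, and the role ARN are illustrative
assumptions; the deployed Guidance may orchestrate this step differently."""
import time
import boto3

sm = boto3.client("sagemaker")

# Launch an AutoML job over the uploaded gameplay data.
sm.create_auto_ml_job(
    AutoMLJobName="player-insights-automl",  # placeholder job name
    InputDataConfig=[{
        "DataSource": {"S3DataSource": {
            "S3DataType": "S3Prefix",
            "S3Uri": "s3://example-player-insights-data/training/",  # placeholder path
        }},
        "TargetAttributeName": "churned",  # assumed label column in the gameplay data
    }],
    OutputDataConfig={"S3OutputPath": "s3://example-player-insights-data/output/"},
    RoleArn="arn:aws:iam::123456789012:role/ExampleSageMakerRole",  # placeholder role
)

# Poll until the job finishes, then report the best candidate model.
while True:
    job = sm.describe_auto_ml_job(AutoMLJobName="player-insights-automl")
    status = job["AutoMLJobStatus"]
    if status in ("Completed", "Failed", "Stopped"):
        break
    time.sleep(60)

if status == "Completed":
    best = job["BestCandidate"]
    print("Best candidate:", best["CandidateName"])
    print("Objective metric:", best["FinalAutoMLJobObjectiveMetric"])
```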
Well-Architected Pillars
The architecture diagram above is an example of a solution designed with Well-Architected best practices in mind. To build a fully Well-Architected workload, apply the best practices from each pillar of the AWS Well-Architected Framework wherever possible.
Implementation resources
The sample code is a starting point: industry validated and prescriptive but not definitive, it offers a look under the hood to help you get started.
Open sample code on GitHub
Disclaimer
The sample code; software libraries; command line tools; proofs of concept; templates; or other related technology (including any of the foregoing that are provided by our personnel) is provided to you as AWS Content under the AWS Customer Agreement, or the relevant written agreement between you and AWS (whichever applies). You should not use this AWS Content in your production accounts, or on production or other critical data. You are responsible for testing, securing, and optimizing the AWS Content, such as sample code, as appropriate for production grade use based on your specific quality control practices and standards. Deploying AWS Content may incur AWS charges for creating or using AWS chargeable resources, such as running Amazon EC2 instances or using Amazon S3 storage.