AWS for Games Blog
Generate custom game events from Unreal Engine with the Game Analytics Pipeline
Today’s game developers treat analytics as a critical workload for delivering the best possible gameplay experiences to their players. As developers look for more control over where their data is stored and for ways to reduce the complexity of ingesting data from a variety of sources, many turn to AWS to create their own custom analytics pipelines and data lakes. While this data can be generated by any producer, from game servers to marketing services to publishing platforms, the most critical data is the player data generated by active users in the game client and builds.
This blog post is a step-by-step tutorial that details how to ingest data from games developed in the Unreal game engine using the AWS C++ SDKs with the one-click deployable Game Analytics Pipeline solution as a first step to setting up your own custom game analytics pipeline.
With this solution, as documented in our blog post “Ingest and Visualize Streaming Data for Games,” there are two approaches to ingesting telemetry data:
- Proxy integration with the solution API events endpoint: Choose this option if you require a custom REST proxy integration for ingestion of game events and prefer not to include information about your KDS stream on the client. Applications send events to the events endpoint, which synchronously proxies the request to KDS and returns a response to the client. This option provides more control, since you can leverage security services such as AWS WAF and AWS Shield for better edge protection.
- Direct integration with Amazon Kinesis Data Streams: Choose this option if you want to publish events from your games and services directly to Amazon Kinesis Data Streams (KDS) without the solution’s API Gateway. This is useful if you are a game developer new to AWS real-time streaming analytics but familiar with C++ libraries, or if you prefer not to manage API Gateway as part of your deployment. This also removes the added cost associated with using API Gateway.
Both of these methods can be used for cross-platform game releases, including mobile, PC, and console games. This blog post focuses specifically on the second type of ingestion: direct integration with KDS using the AWS C++ SDK.
This blog post contains many sections on concepts and architecture that are similar or identical to those in a past post about Unity. However, the code integrations and engine settings in this blog post focus specifically on Unreal Engine and the AWS C++ SDK, as opposed to the Unity Engine with the AWS .NET SDK.
Disclaimer: The code in this blog is meant for tutorial purposes only and is not production-ready code.
Requirements
Ensure you have access to the following before starting this tutorial:
- An AWS account
- The Game Analytics Pipeline solution deployed in your AWS account – follow the Deployment Guide for more information
- Unreal Engine 4 or higher (Unreal Engine 5 is supported) – this solution has not been validated using lower versions of Unreal
- The AWS C++ SDK integrated into your Unreal project, by completing the AWS C++ SDK with Unreal Engine blog post
- Intermediate level knowledge of C++ and Unreal Engine
Initial Setup
Continuing from setting up the AWS C++ SDK with Unreal Engine, we will first be adding the “Json” and “JsonUtilities” modules to the project build file.
- Navigate to your project’s Build.cs file (typically located at [ProjectName]\Source\[ProjectName]\[ProjectName].Build.cs) and add the “Json” and “JsonUtilities” modules as dependencies (you will see below that I added “Json” and “JsonUtilities” to the list).
ExampleProject.Build.cs
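As a rough sketch, the updated build file might look like the following (the project name and the other modules listed are placeholders for Unreal’s defaults; keep whatever your project already lists and simply append “Json” and “JsonUtilities”):

```csharp
using UnrealBuildTool;

public class ExampleProject : ModuleRules
{
    public ExampleProject(ReadOnlyTargetRules Target) : base(Target)
    {
        PCHUsage = PCHUsageMode.UseExplicitOrSharedPCHs;

        // "Json" and "JsonUtilities" added alongside the project's existing modules
        PublicDependencyModuleNames.AddRange(new string[] {
            "Core", "CoreUObject", "Engine", "InputCore",
            "Json", "JsonUtilities"
        });
    }
}
```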
Setup an Amazon Cognito Managed Identity Pool
Next, create a Managed Identity Pool using Amazon Cognito. This allows users to assume a role-based identity using an Identity & Access Management policy and put records into your Kinesis stream.
- Get started in Amazon Cognito by selecting Manage Identity Pools, then select Create New Identity Pool.
- Enter an identity pool name and make sure to check Enable access to unauthenticated identities. Then select Create Pool. Unauthenticated identities are anonymous users whom you will allow to put records into your stream.
A note about unique identifiers
If you wish to collect user data such as unique identifiers or session state, or add user customized features, you’ll need to add authentication into the client using both the Amazon Cognito SDKs and Amazon Cognito User Pools. For the purposes of this blog, the use case focuses on gathering data from anonymous users, which requires the use of Amazon Cognito Unauthenticated Identities.
- You will be prompted to create a new IAM role that users in this unauthenticated identity pool can assume. Click the Show Policy document drop down.
- If you are new to AWS, note that the default role allows cognito-sync and PutEvents actions but has no resources that it can act on. As these are not the services we are using, edit the role so that your records can be put into the stream, and remove access to the services you are not using.
Instead of the default, your role should match the following snippet to work with this tutorial. Ideally, permissions for this Role should be minimal since it is intended for guest access:
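A minimal policy granting only the needed Kinesis actions might look like the following sketch (the Resource ARN is a placeholder; substitute the ARN of the Kinesis stream created by the solution):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "kinesis:PutRecord",
        "kinesis:PutRecords"
      ],
      "Resource": "arn:aws:kinesis:us-east-1:123456789012:stream/YOUR-GameEventsStream-NAME"
    }
  ]
}
```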
This role will allow you to call both PutRecord and PutRecords on the Kinesis Stream resource. Make sure to copy the ARN for the Kinesis Stream that was created by the one-click deployable solution. Once you’ve adjusted the role, click Allow.
- You will find sample code under Get AWS Credentials. Copy the ID next to “// Identity Pool ID” in prep for the next step.
Note: The full, completed script at the bottom of this blog can help you while following the steps below.
Modify your script to add Cognito Identity Pools
First, we will make code changes to the MyActor header and .cpp files to connect to an Amazon Cognito identity pool, retrieve an AWS access key and secret key, and set them as ephemeral credentials for the AWS SDK Kinesis client that we create in later steps.
- Append your MyActor.h file’s header portion with the following:
- Add the following variable and function declarations to the MyActor.h file:
- Modify your MyActor.cpp file with the following code:
- Replace the variables’ values with the corresponding information:
- AWS_ACCOUNT_ID: The AWS Account ID that the Game Analytics Pipeline and Cognito Identity Pool are in. Example: 912345678901
- AWS_REGION: The AWS Region that the Game Analytics Pipeline and Cognito Identity Pool are in. The format must follow the AWS C++ SDK documentation. Example: US_EAST_1.
- COGNITO_IDENTITY_POOL_ID: The Cognito Identity Pool ID you retrieved from step 6 under the “Setup an Amazon Cognito Managed Identity Pool“ section. Example: us-east-1:234a5b67-8901-2a3b-4567-c89012d34e56
- Replace the BeginPlay function in MyActor.cpp:
- Note the comments and the authentication process, then try running the script with the actor in your scene to verify functionality (optionally, uncomment the print statements to see the credential information as well).
Modify your script to handle your records, batching, and ingestion
Next, we append functionality to the MyActor header and .cpp files that, when triggered, will create event records, combine them into a batch, and, once the batch reaches a certain size, send the events to the Kinesis Data Stream using the AWS C++ SDK.
- Append your MyActor.h file’s header portion with the following:
- Add the following variable and function declarations to the MyActor.h file:
- Modify your MyActor.cpp file with the following code:
- Replace the variables’ values with the corresponding information:
- GAP_APPLICATION_ID: The Application ID sent to the Game Analytics Pipeline, which tracks a UUID for the application in the case of multiple games/applications. To find your application ID, in your CloudFormation’s stack for the one-click deployable solution, click Outputs and search for TestApplicationId. Example: 234a5b67-8901-2a3b-4567-c89012d34e56
- KINESIS_STREAM_NAME: The name of the Kinesis stream that the events will be sent to. To find your stream name, in your CloudFormation’s stack for the one-click deployable solution, click Outputs and search for GameEventsStream. Example: GAP-stack-GameEventsStream-1ABCd2EfghIj.
- Now we will add the primary portion of the script. The Game Analytics Pipeline requires a specific schema in order for records to be added to the stream correctly, which is broken down in the following general format:
- Our code will do the following:
- The CreateGameOverEvent function creates the innermost nested JSON, the “event_data” portion, based on the parameters passed in when we trigger this function. The “event_data” can be fully customized to meet your ideal event parameters. It then calls the function below.
- The CreateRecord function takes the “event_data” and both encapsulates and enriches it with other parameters, such as “event_id”, “event_name”, and “event_timestamp”, so that the event can be properly partitioned and queried. It then calls the function below.
- The GenerateBatch function wraps the above data in a top-level “event” parameter, along with the “application_id” set to the Game Analytics Pipeline Application ID, and appends it to a global array holding the batch of events. Once the batch reaches a certain size, it is packaged and sent to Kinesis, which calls the function below.
- The PutRecords function takes the current filled batch of events and iterates through it, serializing each event to a string, then into byte data, and then into a Kinesis PutRecords request entry. The entries are batched back into a PutRecords request, which is sent in a PutRecordsAsync call that uses a memory stream to Base64-encode the data and sends it asynchronously to the Kinesis Data Stream. A callback is attached to the async call to trigger when we get a response, which calls the function below.
- The OnPutRecordsAsyncOutcomeReceived function is optional, but it executes actions upon a response from Kinesis; in this case, it prints either a success or an error message based on the response.
- We also have a UUID helper function to convert Unreal Engine’s GUID object to a universal UUID format. Unreal Engine formats GUIDs like the following: “234A5B6789012A3B4567C89012D34E56”. The helper function converts the string to lower case and adds dashes so it looks like this: “234a5b67-8901-2a3b-4567-c89012d34e56”.
A note about batching
Batching your records before sending them to your Amazon Kinesis stream lets you make fewer PutRecords requests, which is both efficient and a way to optimize the cost of your communication. In the sample above, the batch size is set to 4 so you can get it working quickly, but your game in production should use a higher value. Each PutRecords request can support up to 500 records, and each record can be as large as 1 MB, up to a limit of 5 MB for the entire request. For more information about Kinesis Data Streams quotas and limits, visit the documentation.
Other considerations
The previous script does not handle retries or situations where players have backgrounded or closed the app, but the batch has not been sent. Before pushing to production, developers should write additional logic to handle these cases.
- Append the following functions to the MyActor.cpp file:
Add key inputs to trigger game events
Now that we have created the game event and the event generation, batching, and ingestion process, we need to find a way to trigger the process and watch it in action. Somewhere else in your game, either tied to a button or specific action, you would call “Game Over” and pass wins and losses to the event.
In the below example, it is assumed you are using an example keybind to send your events and call “Game Over” in a “character”. To achieve this without creating a new character script and calling MyActor, you can temporarily change MyActor to a character subclass to allow keybind actions by:
- Changing #include “GameFramework/Actor.h” in the MyActor.h header file to #include “GameFramework/Character.h”
- Changing “MyActor : public AActor” in the class declaration to “MyActor : public ACharacter”.
- Adding #include “Components/InputComponent.h” and #include “GameFramework/InputSettings.h” to the MyActor.cpp file
Refer to the full code snippets at the very bottom of the blog for guidance on code integration.
- Append the following class and function to the “MyActor.h” header file:
- Append the following function to the “MyActor.cpp” file:
- Note the keybind names in the BindAction function calls are “WinGame” and “LoseGame”. To set the keybinds, in the Unreal Editor go to Edit → Project Settings → Input → Bindings → Action Mappings and add “WinGame” and “LoseGame” (case sensitive) with their respective keys. When starting the level, based on the batch size of 4, a key will need to be pressed four times before the batch is put into the Kinesis stream and finally lands in S3.
Verify data ingestion by checking S3 & Athena
- Go to the AWS Management Console and search for Amazon S3.
- If your records were successfully put into the stream, you should see records in the S3 bucket generated by the one-click deployable solution, titled “[Cloudformation-name]-analyticsbucket-[random string of numbers/letters]”.
- Under raw_events, you should see partitions titled “year=2022” (or the current year), followed by month and day folders. Drilling into these, you should see a file that looks like this:
- Now head over to Amazon Athena, which is an interactive query service that makes it easy to write and run ad-hoc queries on data stored in S3.
- Go to Saved queries, and run a sample query on the gameeventsdatabase that was generated by the one-click deployable solution (check the outputs tab in CloudFormation). The following example depicts this running on partition day 3 looking at the raw_events table.
- If you run a query similar to the one above against the raw_events table and see your event results for the day you sent these events (for example, day = ‘03’), you’ve successfully put your records into the stream.
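As a rough illustration, a query against that partition might look like the following (the table and column names follow the solution’s defaults as described earlier in this post; adjust them to match your deployment):

```sql
-- Run against the gameeventsdatabase created by the solution
SELECT event_name, event_timestamp, event_data
FROM raw_events
WHERE year = '2022'
  AND month = '01'
  AND day = '03'
LIMIT 10;
```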
Troubleshooting
Running into 400 Bad Request errors? Schema mis-matches? Anything else? Check the following:
- Unable to put records – Make sure your IAM role in AWS Identity and Access Management has both PutRecord and PutRecords permissions, as indicated in the Setup an Amazon Cognito Managed Identity Pool section.
- Namespace errors – Make sure you have all the required .dlls. When in doubt, refer back to the AWS C++ SDK with Unreal Engine blog post to see which AWS SDK is required for methods the script might be calling that are missing.
- 400 Bad Request – This usually indicates the request was not accepted, meaning something went wrong with PutRecordsAsync or your IAM role has incorrect permissions. Double-check your PutRecords code and your IAM role for Amazon Cognito in AWS.
Next steps
Fantastic! Now that you’ve successfully ingested custom data into your game analytics pipeline and into your S3 data lake, you have a world of endless possibilities for your game analytics events. From here we recommend building out additional template events using the event_data parameters that meet your game’s specific tracking needs, investigating the full capabilities of our game analytics pipeline solution, or setting up your own QuickSight dashboard.
Full Code Reference
MyActor.h
MyActor.cpp