AWS for Games Blog

Generate Custom Game Events from Unity Integrated With the Game Analytics Pipeline

Today’s game developers use analytics as a critical workload to deliver the best gameplay experiences possible for their players. As developers look for more control over where their data is stored and for ways to reduce the complexity of ingesting data from a variety of sources, many turn to AWS to help them create their own custom analytics pipelines and data lakes. While this data can be generated by any producer, from game servers to marketing services to publishing platforms, the most critical is the player data generated by active users of a game on the game client and its builds.

This blog post is a step-by-step tutorial detailing how to ingest data from games developed in the Unity game engine using the AWS .NET SDKs together with the one-click deployable Game Analytics Pipeline solution, as a first step toward setting up your own custom game analytics pipeline.

With this solution, as documented in our blog post “Ingest and Visualize Streaming Data for Games,” there are two approaches to ingesting telemetry data:

  1. Direct integration with Amazon Kinesis Data Streams: Choose this option if you want to publish events from your games and services directly to Amazon Kinesis Data Streams (KDS) without the solution’s API Gateway. This is useful if you are a game developer new to AWS real-time streaming analytics but familiar with C# and .NET libraries, or if you prefer not to manage API Gateway as part of your deployment. This also removes the added cost associated with using API Gateway.
  2. Proxy integration with the solution API events endpoint: Choose this option if you require custom REST proxy integration for ingestion of game events and prefer not to include information about your KDS stream on the client. Applications send events to the events endpoint, which synchronously proxies the request to KDS and returns a response to the client (see the sketch after this list).
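
For illustration, a proxy-based call from Unity might look roughly like the following sketch. This is hypothetical: the endpoint URL shape is an assumption, and any authorization headers the solution’s API requires are omitted, so take the real values from the solution’s documentation and your CloudFormation outputs.

using System.Collections;
using System.Text;
using UnityEngine;
using UnityEngine.Networking;

// Hypothetical sketch of proxy ingestion via the solution's events endpoint.
public class EventsProxyExample : MonoBehaviour
{
    // Assumed URL shape; replace with the real endpoint from your deployment.
    private const string EventsEndpoint =
        "https://YOUR-API-ID.execute-api.us-east-1.amazonaws.com/prod/applications/YOUR-APPLICATION-ID/events";

    // Sends a JSON event payload to the events endpoint.
    public IEnumerator PostEvent(string jsonPayload)
    {
        using (var request = new UnityWebRequest(EventsEndpoint, UnityWebRequest.kHttpVerbPOST))
        {
            request.uploadHandler = new UploadHandlerRaw(Encoding.UTF8.GetBytes(jsonPayload));
            request.downloadHandler = new DownloadHandlerBuffer();
            request.SetRequestHeader("Content-Type", "application/json");

            yield return request.SendWebRequest();

            if (request.isNetworkError || request.isHttpError)
                Debug.LogError("Event POST failed: " + request.error);
            else
                Debug.Log("Event POST response: " + request.downloadHandler.text);
        }
    }
}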

Both of these methods can be used for cross-platform game releases including mobile games, PC games, and console games. This blog post specifically focuses on the first type of ingestion: direct integration with KDS using the AWS .NET SDKs.

Disclaimer: The code in this blog is meant for tutorial purposes only and is not production-ready code.

Requirements

Ensure you have access to the following before starting this tutorial:

  • An AWS account
  • The Game Analytics Pipeline solution deployed in your AWS account – follow the Deployment Guide for more information
  • Unity 2020.1.0f1 or higher – this solution has not been validated using lower versions of Unity
  • Intermediate level knowledge of C# and Unity

Download AWS .NET SDKs & Newtonsoft.Json

There are two ways to get the AWS SDKs required to start developing on AWS in Unity: grab the NuGet packages in Visual Studio, or download them from the AWS-owned GitHub repository of SDKs. The easiest method is to use Visual Studio Community, which is provided with the latest versions of Unity Pro.

  1. Open Visual Studio, select Project, and choose Manage NuGet Packages. Alternatively you can grab the SDKs from our GitHub repository.
  2. Search for the AWSSDK.Core, AWSSDK.Kinesis, and AWSSDK.CognitoIdentity packages and add them to your project.
  3. This tutorial also uses the Newtonsoft.Json package to serialize dictionaries to JSON when sending data to Kinesis Data Streams, so add it to your project as well. When you download these packages using NuGet, they are placed in the Packages folder of your Unity project.

Image of the NuGet package manager for .NET in Visual Studio showing how to download necessary AWS .NET SDKs for the project, such as AWSSDK.Core.

4. In Unity’s Player Settings, target API Compatibility Level .NET 4.x as pictured below. To do this, select Edit > Project Settings > Player > API Compatibility Level and choose .NET 4.x.

Image of the Unity Project Settings window instructing the developer to change the API Compatibility level to .NET 4.x

5. Next, copy the netstandard2.0 .dlls from the downloaded packages so your project supports cross-platform compatibility with most current game platforms. This enables your game to support IL2CPP platforms like iOS, Android, PlayStation 4, Xbox One, and Switch.

Image of the downloaded SDK folder instructing the developer to grab the netstandard2.0 .dll files.

For more information about Unity’s scripting runtime and using .NET 4.x versus .NET Standard 2.0, visit the documentation for Unity’s .NET profile support and Scripting restrictions.

6. Once your .dlls are copied, place them in the Plugins folder of your Unity project. If this is a new project and you do not see a Plugins folder, create one inside the Assets folder.

Image of the Plugins folder within the Unity project with copied .dll files.

7. You should also create a link.xml file to minimize dependency issues between NuGet and Unity projects when building and deploying to device (more about link.xml files here). To do this, create a link.xml file in your Assets folder with the following code:

<linker>
  <assembly fullname="System.Core">
    <type fullname="System.Linq.Expressions.Interpreter.LightLambda" preserve="all" />
  </assembly>
  <assembly fullname="AWSSDK.Core" preserve="all"/>
  <assembly fullname="AWSSDK.CognitoIdentity" preserve="all"/>
  <assembly fullname="AWSSDK.Kinesis" preserve="all"/>
  <assembly fullname="AWSSDK.SecurityToken" preserve="all"/>
</linker>

This prevents Unity from stripping required data from these libraries when building. Some platforms, like iOS, require a link.xml file for the build to work on device. The System.Linq entry helps IL2CPP platforms, like iOS, avoid incompatibilities with Newtonsoft.Json, the library we are using to serialize dictionaries to JSON. Learn more about link.xml files, .NET 4.x, and .NET Standard 2.0 requirements in the Visual Studio Tools for Unity documentation.

Set up an Amazon Cognito Managed Identity Pool

Next, create a managed identity pool using Amazon Cognito. This allows users to assume a role-based identity, governed by an Identity & Access Management (IAM) policy, that can put records into your Kinesis stream.

  1. Get started in Amazon Cognito by selecting Manage Identity Pools, then select Create New Identity Pool.

Image of the Amazon Cognito landing page with the buttons Manage User Pools and Manage Identity Pools.

2. Enter an identity pool name and make sure to check Enable access to unauthenticated identities, then select Create Pool. Unauthenticated identities are anonymous users who you will allow to put records into your stream.

A note about unique identifiers
If you wish to collect user data, such as UUIDs or other unique identifiers, or add sign-up features, you need to add authentication functionality to the client using the Amazon Cognito SDKs. This is not a requirement, and in some pipelines where data is privacy-centric or anonymized, it is not allowed. When ingesting data through the API Gateway proxy instead of directly with the AWS SDK, you use Amazon Cognito to make authenticated and authorized POST calls using UnityWebRequest.

Image of the “Create new identity pool” Amazon Cognito settings page

3. You will be prompted to create a new IAM role that users in this unauthenticated identity pool can assume. Click the Show Policy Document drop-down.

4. If you are new to AWS, note that the default role allows cognito-sync actions and PutEvents, but has no resources it can act on. Since neither is what we are using, the role must be edited so that your records can be put into the stream and to remove access to services you are not using.

Instead of the default, your role should match the following snippet to work with this tutorial:

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "kinesis:PutRecord",
                "kinesis:PutRecords"
            ],
            "Resource": [
                "COPY-ARN-FOR-THE-SOLUTION-STREAM-HERE"
            ]
        }
    ]
}

This role allows you to call both PutRecord and PutRecords on the Kinesis stream resource. Make sure to replace the placeholder with the ARN of the Kinesis stream created by the one-click deployable solution. Once you’ve adjusted the role, click Allow.

5. You will find sample code under Get AWS Credentials. Copy the ID next to “// Identity pool ID” in preparation for the next step.

Set up a script to handle your records, batching, and ingestion

  1. Create an Events.cs script containing the following code.
using System;
using UnityEngine;
using System.IO;
using System.Collections.Generic;
using System.Threading.Tasks;
using Amazon;
using Amazon.CognitoIdentity;
using Amazon.Kinesis;
using Amazon.Kinesis.Model;
using Newtonsoft.Json;

public class Events
{

    // The number of records collected before a batch is sent to Amazon
    // Kinesis Data Streams. In production this should be much higher, but
    // for this demo script it is set to 4
    static int batchSize = 4;

    // A list that holds our records to batch them
    static List<object> raw_records = new List<object>();
    
    // Initialize the Amazon Cognito credentials provider
    private static CognitoAWSCredentials credentials = new CognitoAWSCredentials(
        "YOUR-IDENTITY-POOL-ID", // Identity pool ID
        RegionEndpoint.USEast1 // Cognito Identity Pool Region
    );

    // Kinesis Stream Details
    private static RegionEndpoint KinesisRegion = RegionEndpoint.USEast1;
    private static string streamName = "YOUR-STREAM-NAME"; // Kinesis Stream Name
    private static AmazonKinesisClient kinesisClient = 
        new AmazonKinesisClient(credentials, KinesisRegion);
    
}

2. In the Events.cs script where you’d like to put records into your Kinesis stream, set up your credentials as shown in the code above. Replace YOUR-IDENTITY-POOL-ID with the identity pool ID you generated, and YOUR-STREAM-NAME with the name of the stream created by the solution. Make sure the RegionEndpoint values match the regions of your stream and identity pool.

Add batching, schema formatting, and PutRecordsAsync

This is the primary portion of the script. The Game Analytics Pipeline requires a specific schema in order for records to be added to the stream correctly.

  1. Copy the following code into your Events.cs file and replace YOUR-APPLICATION-ID with the application ID provided by the one-click deployable solution. To find your application ID, open the CloudFormation stack for the solution, click Outputs, and search for TestApplicationId.
    // Creates a UUID
    static string GetUUID()
    {
        return System.Guid.NewGuid().ToString();
    }
    
    // Create Record enriches event data with additional parameters and 
    // converts to JSON
    private static void Create_Record(Dictionary<string, object> event_data, 
        string event_name)
    {
        string event_id = GetUUID();
        string partitionKey = event_id;

        Int64 current_time = 
            (Int64)new DateTimeOffset(DateTime.UtcNow).ToUnixTimeSeconds();

        Dictionary<string, object> record = new Dictionary<string, object>()
        {
            { "event_id", event_id },
            { "event_type", "event_type" },
            { "event_name", event_name },
            { "event_timestamp", current_time },
            { "event_version", "1.0.0" },
            { "app_version", "1.0.0" },
            { "event_data", event_data }
         };

        //Add to the Batch of Records
        Generate_Batch(record, partitionKey);

    }

    // Generate Batch
    private static void Generate_Batch(Dictionary<string, object> record, 
        string partitionKey)
    {
        // Application ID from the Solution
        string application_id = "YOUR-APPLICATION-ID"; 

        // Append Raw Records with new Record
        raw_records.Add(new Dictionary<string, object>()
        {
            { "event", record },
            { "application_id", application_id }
        });

        Debug.Log("Added record to list: " + raw_records.Count);

        if (raw_records.Count >= batchSize)
        {
            //Call Put Record
            Put_Records(raw_records, partitionKey);
        }

    }

    // Put Records
    private static async void Put_Records(List<object> raw_records, 
        string partitionKey)
    {
        Debug.Log("Put Records Called");

        List<PutRecordsRequestEntry> formatted_records = 
            new List<PutRecordsRequestEntry>();

        foreach (object rec in raw_records)
        {
            using (var memoryStream = new MemoryStream())
            using (var streamWriter = new StreamWriter(memoryStream))
            {
                //Convert to Json using Newtonsoft
                string jsonData = JsonConvert.SerializeObject(rec, 
                    Formatting.Indented);

                Debug.Log("Record To be Sent:" + jsonData);

                streamWriter.Write(jsonData);
                streamWriter.Flush();

                formatted_records.Add(new PutRecordsRequestEntry
                {
                    Data = memoryStream,
                    PartitionKey = partitionKey
                });

            }
        }

        Debug.Log("Record Formatted");

        Task<PutRecordsResponse> responseTask = 
            kinesisClient.PutRecordsAsync(new PutRecordsRequest
        {
            Records = formatted_records,
            StreamName = streamName
        });

        PutRecordsResponse responseObject = await responseTask;

        Debug.Log("Event Sent" + "\n" + "Successful Records Sent:" + 
            responseObject.Records.Count + "\n"
            + "Failed Records:" + responseObject.FailedRecordCount);

        //Clears raw records after they are sent for demo.
        //In production, change to only clear on successful response.
        raw_records.Clear();
    }
 
       

PutRecordsAsync uses a memory stream for each record’s data, which is Base64 encoded before being sent into the Kinesis stream.
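
For reference, after serialization each entry placed on the stream is a JSON document shaped like the following (values are illustrative; the envelope matches the dictionaries built in Create_Record and Generate_Batch):

{
  "event": {
    "event_id": "6f1c2a3b-4d5e-4f60-8a7b-1c2d3e4f5a6b",
    "event_type": "event_type",
    "event_name": "example_event",
    "event_timestamp": 1598918400,
    "event_version": "1.0.0",
    "app_version": "1.0.0",
    "event_data": { "example_parameter": 1 }
  },
  "application_id": "YOUR-APPLICATION-ID"
}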

A note about batching
Batching your records before sending them to your Amazon Kinesis stream lets you make fewer PutRecords requests, which is both efficient and a way to cost-optimize your communication. In the sample above, the batch size is set to 4 so you can get it working quickly, but in production it should be set higher. Each PutRecords request can contain up to 500 records, and each record can be as large as 1 MB, up to a limit of 5 MB for the entire request. For more information about Kinesis Data Streams quotas and limits, visit the documentation here. A minimal guard for these limits is sketched below.
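
If you raise the batch size toward these limits, you may also want to check for oversized requests before calling Put_Records. The following is a sketch, not part of the solution; the ShouldFlush helper and its byte accounting are assumptions you would wire into Generate_Batch yourself:

    // Kinesis Data Streams limits: up to 500 records per PutRecords request,
    // up to 1 MB per record, and up to 5 MB per request.
    private const int MaxRecordsPerRequest = 500;
    private const long MaxBytesPerRequest = 5 * 1024 * 1024;

    // Hypothetical helper: returns true when adding the next serialized record
    // would exceed either limit, so the current batch should be flushed first.
    private static bool ShouldFlush(int recordCount, long currentBatchBytes,
        long nextRecordBytes)
    {
        return recordCount + 1 > MaxRecordsPerRequest
            || currentBatchBytes + nextRecordBytes > MaxBytesPerRequest;
    }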

Other considerations
The previous script does not handle retries, or situations where players have backgrounded or closed the app before the batch has been sent. Before pushing to production, developers should write additional logic to handle these cases; one possible approach is sketched below.
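
One way to handle backgrounding (a sketch, not the solution’s code) is to hook Unity’s application lifecycle callbacks and flush whatever is queued. The Events.Flush method assumed here does not exist in the tutorial code; it would simply call Put_Records on the queued raw_records:

using UnityEngine;

// Hypothetical lifecycle hooks: flush pending analytics records when the
// player backgrounds or quits the game.
public class AnalyticsLifecycleHooks : MonoBehaviour
{
    void OnApplicationPause(bool paused)
    {
        // Assumed helper; would send any queued records immediately.
        if (paused) Events.Flush();
    }

    void OnApplicationQuit()
    {
        Events.Flush();
    }
}

For retries, note that each PutRecordsResultEntry in responseObject.Records carries an ErrorCode when that individual record fails, so you can re-queue only the failed entries and resend them with a backoff.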

Create an Event with Custom Parameters

Now you’re ready to create your first event! In the same Events.cs class, add a “Game Over” event that we will call from somewhere else in the game. It is recommended that you store the custom parameter schemas and templates for all events in one file so they are easy to update, and call them from elsewhere in the game.

    // Creates a Game Over Event
    public static void Create_Game_Over_Event(int wins, int losses)
    {
        string event_name = "gameover";

        // Generate Event Custom Parameters
        Dictionary<string, object> event_data = new Dictionary<string, object>()
        {
             { "wins", wins },
             { "losses", losses },
             { "platform", "UnityEditor" }
        };

        Create_Record(event_data, event_name);
    }

This event accepts wins and losses as ints as an example, but event_data can be fully customized to meet your ideal event parameters. It calls the Create_Record method defined previously in Events.cs, which enriches and batches the wins and losses data before it is sent to Kinesis.

Call the “Game Over” event

Somewhere else in your game, either tied to a button or a specific action, call “Game Over” and pass the wins and losses ints.

Events.Create_Game_Over_Event(wins, losses);

With the batch size set to 4, this event will need to be called four times before the batch is put into the Kinesis stream and finally lands in S3. When calling this event, make sure to pass the int wins and int losses parameters.
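
For example, tied to a UI button (a minimal, hypothetical sketch; the wins and losses values would come from your real game state):

using UnityEngine;
using UnityEngine.UI;

// Hypothetical example: fires the Game Over event when a button is clicked.
public class GameOverButton : MonoBehaviour
{
    public Button button;   // assigned in the Inspector
    private int wins = 3;   // illustrative values only
    private int losses = 1;

    void Start()
    {
        button.onClick.AddListener(
            () => Events.Create_Game_Over_Event(wins, losses));
    }
}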

Verify data ingestion by checking S3 & Athena

In Unity, if you use the code above, you should also see a Debug log confirming that records were successfully sent.

Image of the debug logs printed in the Unity game engine.

  1. Go to the AWS Management Console and search for Amazon S3.
  2. If your records are successfully put into the stream, you should see records in the S3 bucket generated by the one-click deployable solution, titled “[Cloudformation-name]-analyticsbucket-[random string of numbers/letters]”.

Image of an Amazon S3 bucket and prefixes

3. Under raw_events you should see partitions titled “year=2020” followed by month and day folders. Drilling into these, you should see a file that looks like this:

Image of a file stored in Amazon S3

4. Now head over to Amazon Athena, an interactive query service that makes it easy to write and run ad hoc queries on data stored in S3.

5. Go to Saved queries and run a sample query on the gameeventsdatabase that was generated by the one-click deployable solution (check the Outputs tab in CloudFormation). The following example depicts this running against the raw_events table on partition day 3.

Image of an Amazon Athena query
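
The saved query is similar in spirit to the following sketch; the database and table names come from your stack’s CloudFormation outputs, and the partition values depend on when you sent your events:

SELECT *
FROM gameeventsdatabase.raw_events
WHERE year = '2020'
  AND month = '09'
  AND day = '03'
LIMIT 10;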

If you run a similar query against the raw_events table and see your event results for the day you sent the events (for example, day = ‘03’), you’ve successfully put your records into the stream.

Troubleshooting

Running into 400 Bad Request errors? Schema mismatches? Anything else? Check the following:

  • Unable to put records – Make sure your IAM role, under Identity & Access Management, has both the PutRecord and PutRecords actions as indicated in step 4 of the Set up an Amazon Cognito Managed Identity Pool section.
  • Namespace errors – Make sure you have all the required .dlls. When in doubt, search the AWS SDK for .NET Version 3 API reference documentation to see which AWS SDK provides any methods the script calls that are missing.
  • 400 Bad Request – This usually indicates the request was not accepted, meaning either something went wrong with PutRecordsAsync or your IAM role has incorrect permissions. Double-check your Put_Records code and your IAM role for Amazon Cognito in AWS.

Next steps

Fantastic! Now that you’ve successfully ingested custom data into your game analytics pipeline and into your S3 data lake, you have a world of endless possibilities for your game analytics events. From here we recommend building out additional template events using the event_data parameters that meet your game’s specific tracking needs, investigating the full capabilities of our game analytics pipeline solution, or setting up your own QuickSight dashboard.