Desktop and Application Streaming

Capture usage insights about your streaming environment with AppStream 2.0 session scripts

AWS is announcing that Amazon AppStream 2.0 now supports running custom scripts when a streaming session starts and when it stops.

Customers have told us that they want to configure their applications and streaming environments before a session starts, or perform cleanup tasks before the session stops. Previously, you would use Windows Group Policy scripts or scheduled tasks with complex logic to ensure that such scripts ran at the appropriate time.

The AppStream 2.0 session scripts feature runs custom scripts at session startup, before the user’s application starts. You can also configure scripts to run at the end of the streaming session. This enables you to configure applications before a session starts, and to collect session information and perform application cleanup when the session stops. For more information about the session scripts feature and how to use it, see the session scripts technical documentation.

This post shows you how to use:

  1. The user and instance metadata capability to store session information and application logs at session termination in Amazon S3.
  2. Amazon Athena to query and retrieve that data.

Walkthrough

This post provides the following high-level flow for storing session information and application logs when the session stops:

  1. Create your custom script, and upload it to an image builder.
  2. Update the session scripts config JSON file with the details of your script.
  3. Create the AppStream 2.0 image, and apply it to a fleet.
  4. Configure Athena to parse the collected session information stored in Amazon S3.
  5. Run a SQL query to retrieve the data.

Create a custom script to capture the user and instance metadata

The following example script shows three actions:

  1. Collecting information about the user session from the user and instance metadata Windows environment variables.
  2. Saving this information in a .csv file.
  3. Uploading the information in the .csv file to an Amazon S3 bucket.

The script also collects Windows System event log entries, which helps with troubleshooting application issues.

NOTE: The session termination PowerShell script has an IAM user credential saved in it. Because of this, it is important that this user has permissions only to upload files to your Amazon S3 bucket.
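A minimal IAM policy for that user might look like the following sketch. The bucket name is a placeholder; scope the resource down further (for example, to the key prefixes the script writes to) if it fits your setup:

```json
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": "s3:PutObject",
            "Resource": "arn:aws:s3:::S3_BUCKET_NAME/*"
        }
    ]
}
```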

################
$AccessKeyID = "IAM_USER_ACCESS_ID"
$SecretAccessKey = "IAM_USER_ACCESS_KEY"
$BucketName = "S3_BUCKET_NAME"
$BucketRegion = "S3_BUCKET_REGION"
################

$InstanceAWSRegion = (Get-Item Env:AWS_Region).Value
$StackName = (Get-Item Env:AppStream_Stack_Name).Value
$FleetName = (Get-Item Env:AppStream_Resource_Name).Value
$AccessMode = (Get-Item Env:AppStream_User_Access_Mode).Value
$ImageArn = (Get-Item Env:AppStream_Image_ARN).Value
$SessionID = (Get-Item Env:AppStream_Session_ID).Value
$SessionReservationDateTime = Get-Date ((Get-Item env:AppStream_Session_Reservation_DateTime).Value)
$SessionEndDateTime = Get-Date
$InstanceType = (Get-Item Env:AppStream_Instance_Type).Value
$AS2UserName = (Get-Item Env:AppStream_UserName).Value
$Duration = [math]::Round((New-TimeSpan -Start (Get-Date -Date $SessionReservationDateTime) -End ($SessionEndDateTime)).TotalSeconds)

$FileName = "UserSessionData.$($AS2UserName).$($SessionID).csv"
$FileName2 = "$($AS2UserName).$($SessionID).system.eventlog.log"
$FolderPath = "C:\output"
$Row = "$($InstanceAWSRegion),$($StackName),$($FleetName),$($AccessMode),$($AS2UserName),$($ImageArn),$($InstanceType),$($SessionID),$($SessionReservationDateTime.ToString("yyyy-MM-dd HH:mm:ss")),$($SessionEndDateTime.ToString("yyyy-MM-dd HH:mm:ss")),$($Duration)"

$S3ObjectPath = "UserSessionData/$($InstanceAWSRegion)/$($StackName)/$($AS2UserName)/"
$S3ObjectPath2 = "EventLogs/$($InstanceAWSRegion)/$($StackName)/$($AS2UserName)/"

if (!(Test-Path $FolderPath)) {
    New-Item -Path $FolderPath -ItemType Directory -Force
}

Add-Content -Path "$($FolderPath)\$($FileName)" -Value $Row

# Collect System event log entries created during the session to help with troubleshooting.
Get-WinEvent -LogName "System" | Where-Object {$_.TimeCreated -ge $SessionReservationDateTime -and $_.TimeCreated -le (Get-Date)} | Out-File -FilePath "$($FolderPath)\$($FileName2)"

# Write-S3Object requires the AWS Tools for PowerShell to be installed on the image.
Write-S3Object -BucketName $BucketName -File "$($FolderPath)\$($FileName)" -Key "$($S3ObjectPath)$($FileName)" -AccessKey $AccessKeyID -SecretKey $SecretAccessKey -Region $BucketRegion
Write-S3Object -BucketName $BucketName -File "$($FolderPath)\$($FileName2)" -Key "$($S3ObjectPath2)$($FileName2)" -AccessKey $AccessKeyID -SecretKey $SecretAccessKey -Region $BucketRegion

Save this script to a convenient location on a running image builder. For this example, we use C:\Scripts\session_termination.ps1.

Update the session scripts config JSON with the details of your script

You tell AppStream 2.0 what scripts to run and when, by updating the session scripts configuration JSON on an image builder. The configuration JSON defines:

  • Which session event to run on – start or termination
  • The Windows context to run within – User or System
  • The file path and any arguments
  • Whether to upload script output logs to Amazon S3
  • Waiting time for script execution – up to 60 seconds

Here is the configured session scripts JSON for our example:

{
    "SessionStart": {
        "executables": [{
                "context": "system",
                "filename": "",
                "arguments": "",
                "s3LogEnabled": true
            },
            {
                "context": "user",
                "filename": "",
                "arguments": "",
                "s3LogEnabled": true
            }
        ],
        "waitingTime": 30
    },
    "SessionTermination": {
        "executables": [{
                "context": "system",
                "filename": "",
                "arguments": "",
                "s3LogEnabled": true
            },
            {
                "context": "user",
                "filename": "C:\\Windows\\System32\\WindowsPowerShell\\v1.0\\powershell.exe",
                "arguments": "-NonInteractive -File C:\\Scripts\\session_termination.ps1",
                "s3LogEnabled": true
            }
        ],
        "waitingTime": 30
    }
}

This session scripts configuration file runs a PowerShell script in the user context while the session is stopping.

NOTE: The execution policy in PowerShell does not run unsigned scripts by default. You may need to modify the execution policy for your script to run.
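One way to handle this without changing the system-wide execution policy (an option, not the only approach) is to pass an execution policy flag to powershell.exe in the arguments field of the session scripts JSON:

```json
"arguments": "-ExecutionPolicy Bypass -NonInteractive -File C:\\Scripts\\session_termination.ps1",
```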

Create the AppStream 2.0 image, and apply it to a fleet

The PowerShell script is now saved on the image builder, and the session scripts configuration JSON is updated. Now, start the Image Assistant application from the image builder desktop as a Windows administrator. Specify any additional apps that you want this image to have, and continue to create the image as you normally would. After the image is created successfully, you can apply it to an existing fleet, or create a new one.

NOTE: If there are any syntax errors in the JSON file or the specified scripts are not present at the locations defined, Image Assistant displays an error message. You cannot create an image until you resolve whatever caused the error.

While the image is being created, you can move on to configuring the Athena database.

Parse the collected session information stored in Amazon S3

Now we must set up the Athena database to contain our session scripts output data.

  1. Open the Athena console.
  2. In the Query Editor, run the following query to create a new database.

CREATE DATABASE IF NOT EXISTS <new_db_name>;

In this example, our database is called as2_userdata_db.

With the database created, we can now create a table that uses an Amazon S3 bucket as the data source and .csv files as the data format. The table needs the following column headers:

  • InstanceAWSRegion
  • StackName
  • FleetName
  • AccessMode
  • UserName
  • ImageARN
  • InstanceType
  • SessionID
  • StartDateTime
  • EndDateTime
  • SessionDurationInSeconds

The Amazon S3 bucket location that you specified in the session termination PowerShell script is the table’s data source. The imported data consists of the .csv files that the PowerShell script creates at the end of every session and uploads to the Amazon S3 bucket.
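For reference, each line in those .csv files follows the $Row format from the termination script. A hypothetical row (all values are placeholders) looks like this:

```csv
us-east-1,ExampleStack,ExampleFleet,userpool,testuser,arn:aws:appstream:us-east-1::image/example-image,stream.standard.medium,example-session-id,2020-01-01 09:00:00,2020-01-01 09:30:00,1800
```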

The following SQL query creates the table with the column names, and specifies the Amazon S3 bucket source for the data.

CREATE EXTERNAL TABLE IF NOT EXISTS <db_name>.<new_table_name> (
  `InstanceAWSRegion` string,
  `StackName` string,
  `FleetName` string,
  `AccessMode` string,
  `UserName` string,
  `ImageARN` string,
  `InstanceType` string,
  `SessionID` string,
  `StartDateTime` string,
  `EndDateTime` string,
  `SessionDurationInSeconds` int
)
ROW FORMAT SERDE 'org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe'
WITH SERDEPROPERTIES (
  'serialization.format' = ',',
  'field.delim' = ','
) LOCATION 's3://<S3_Location>'
TBLPROPERTIES ('has_encrypted_data'='false');

In this example, our new table name is useast1, in the database as2_userdata_db, and our Amazon S3 location is s3://as2-example-demo-ss/us-east-1/.

The Athena database and table have now been created.

Run a SQL query to retrieve the data

After the fleet with the session scripts image reaches a running state, start a few streaming sessions to generate some .csv files in the Amazon S3 bucket, and with them data for our Athena database. Then return to the Athena console and use the following SQL query to see the collected session data:

SELECT * FROM "<db_name>"."<table_name>" limit 10;

Let’s see how long each user used AppStream 2.0:

SELECT username, stackname, SUM(SessionDurationInSeconds) AS totalstreamtime FROM "<db_name>"."<table_name>" GROUP BY username, stackname;
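As another example, you could aggregate by fleet and instance type to see session counts and average session length (a sketch; the column names follow the table defined earlier):

```sql
SELECT fleetname,
       instancetype,
       COUNT(*) AS sessions,
       AVG(SessionDurationInSeconds) AS avgsessionseconds
FROM "<db_name>"."<table_name>"
GROUP BY fleetname, instancetype;
```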

Conclusion

And that’s it! We now have:

  • A fleet deployed with an AppStream 2.0 session scripts image that provides output of the session details at session termination.
  • An Athena database that ingests the sessions details.
  • SQL queries that can be used to aggregate and return that data!