Desktop and Application Streaming
Storing AppStream 2.0 Windows Event logs in S3 with IAM roles and Windows Task Scheduler
Recently, AWS announced support for using IAM roles with image builders and fleets. Now, Amazon AppStream 2.0 streaming instances can take advantage of the flexibility and security of role-based access to AWS API actions.
One useful application of this feature is easily uploading and storing Windows Event logs for troubleshooting. Because of the ephemeral and managed nature of the AppStream 2.0 streaming instances, Windows Event logs are not automatically captured and stored in a meaningful way.
Using the new IAM roles feature together with a PowerShell script that runs periodically, you can upload Windows Event logs to Amazon S3, store them under the user name provided to AppStream 2.0, and make it much easier to locate the event logs needed for troubleshooting end-user issues.
Overview
This post walks you through setting up the required resources and scheduling the task that makes the upload recur. In this walkthrough, you complete the following tasks:
- Create an IAM policy and role to allow the AppStream 2.0 streaming instance access to the S3 bucket.
- Configure a PowerShell script and scheduled task on the AppStream 2.0 image builder.
- Create a custom image from the AppStream 2.0 image builder.
- Create a stack and fleet from the custom image.
Prerequisites
For this walkthrough, you should have the following prerequisites:
- An AWS account
- An existing S3 bucket
- A VPC with internet access or a service endpoint configured for S3 (see the example command after this list)
- An AppStream 2.0 image builder
For more information, see Using an IAM Role to Grant Permissions to Applications and Scripts Running on AppStream 2.0 Streaming Instances.
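If your VPC does not have internet access, a gateway endpoint for S3 allows the streaming instances to reach the bucket privately. The following AWS CLI command is a minimal sketch; the VPC ID, Region, and route table ID are placeholders for your own values.
aws ec2 create-vpc-endpoint --vpc-id <vpc-id> --service-name com.amazonaws.<region>.s3 --route-table-ids <route-table-id>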
Create the IAM policy and IAM role
In this step, you create the IAM policy that provides permission to the S3 bucket, then attach it to an IAM role that your AppStream 2.0 streaming instances can assume.
- Navigate to the IAM console.
- In the navigation pane, choose Policies, Create policy.
- Choose the JSON tab. In the policy editor, copy and paste the following JSON policy for your log destination.
- Replace <s3-bucket-name> with the name of your existing bucket.
- Choose Review policy.
- Enter a name of your choosing and choose Create policy.
IAM policy document example for the S3 bucket
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": "s3:PutObject",
            "Resource": "arn:aws:s3:::<s3-bucket-name>/*"
        }
    ]
}
Now that the IAM policy has been created, you can create the IAM role for AppStream 2.0 to assume with the policy that you just created.
- Open the IAM console.
- In the navigation pane, choose Roles, Create Role.
- For Select type of trusted entity, keep AWS service selected.
- Choose AppStream 2.0, Next: Permissions.
- In the Filter policies search box, type the name of the policy created in the previous step, and select the check box next to the policy name.
- Choose Next: Tags. Although you can specify tags for the role, tags are not required.
- Choose Next: Review.
- Enter a name for your role to help you identify it, and choose Create role.
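When you select AppStream 2.0 as the trusted service, the console generates a trust policy for the role similar to the following, which allows the AppStream 2.0 service to assume the role on behalf of your streaming instances.
Trust policy document example
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": {
                "Service": "appstream.amazonaws.com"
            },
            "Action": "sts:AssumeRole"
        }
    ]
}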
Configure the PowerShell script and schedule the task
- Connect to your AppStream 2.0 image builder as the administrator.
- For ease of organization, create a folder somewhere on the root drive of the image builder to store the PowerShell script.
- Open your text editor of choice, and copy the following script, replacing <s3-bucket-name> and <S3 Region> with the appropriate information. Then save the script to the previously created folder.
- The PowerShell script runs in a loop, waiting for an active user session on the instance. When a user connects, the script reads the environment variable AppStream_UserName from the user context to determine the S3 key name. Windows Event logs are read starting from either the system startup time or the last run time, whichever is more recent. After all events have been uploaded, the script sleeps for the number of seconds defined by the variable $PollingInterval, then loops back to check for Windows Event logs created after the $LastRun timestamp.
PowerShell script
#Define which logs and event types are shipped, and how often
$S3Bucket = "<s3-bucket-name>" #Replace with the S3 bucket created for storing logs.
$bucketRegion = "<S3 Region>" #Replace with the S3 bucket Region.
$eventLogs = @("application","system") #Windows Event logs to ship, as a comma-separated list
$EventTypes = @("error","warning") #Event types to ship, as a comma-separated list
$PollingInterval = 60 #Time to wait between checks for new Windows Event log events, in seconds
#Add a PS drive for HKEY_USERS to access the user-context environment variables
New-PSDrive -PSProvider Registry -Name HKU -Root HKEY_USERS
#Loop for as long as the instance is up
while ($true) {
    #Check if a user is connected; otherwise, sleep
    if ((Get-WmiObject win32_computersystem).username) {
        #Get the logged-in user name (strip the domain prefix)
        $ActiveUser = (Get-WmiObject win32_computersystem).username.split("\\")[1]
        #Build the WMI filter string for looking up the logged-in user's SID
        $filterstring = "name = '" + $ActiveUser + "'"
        #Get the logged-in user's SID
        $ActiveUserSID = (Get-WmiObject win32_useraccount -Filter $filterstring).SID
        #Check if the AppStream_UserName environment variable is set yet
        if ((Get-ItemProperty -Path HKU:\$ActiveUserSID\Environment -Name AppStream_UserName -ErrorAction SilentlyContinue)) {
            #Read the registry value used in the S3 key name
            $AppStream_UserName = (Get-ItemProperty -Path HKU:\$ActiveUserSID\Environment -Name AppStream_UserName -ErrorAction SilentlyContinue).AppStream_UserName
            #Determine where log collection should start: system startup or the last run, whichever is more recent
            $windowsStart = (Get-WinEvent -FilterHashtable @{Logname='System';ID=12;ProviderName='Microsoft-Windows-Kernel-General'} | Select-Object -First 1).TimeCreated
            if ($LastRun) {
                if ($LastRun -gt $windowsStart) { $LogDate = $LastRun } else { $LogDate = $windowsStart }
            } else {
                $LogDate = $windowsStart
            }
            #Write the current time as the last-run value for the next run
            $LastRun = Get-Date
            #Process the logs
            foreach ($logName in $eventLogs) {
                $events = Get-EventLog -LogName $logName -EntryType $EventTypes -After ($LogDate)
                foreach ($event in $events) {
                    #Build the key as <user>/<log>/<type>/<timestamp>-EventID-<id>.JSON; note "mm" (minutes), not "MM" (months), in the time portion
                    $KeyName = $AppStream_UserName + "/" + $logName + "/" + $event.EntryType + "/" + $event.TimeGenerated.ToString("dd-MM-yyyy HH:mm:ss") + "-EventID-" + $event.EventID + ".JSON"
                    $LogJSON = $event | ConvertTo-Json
                    Write-S3Object -BucketName $S3Bucket -Key $KeyName -Content $LogJSON -Region $bucketRegion -ProfileName appstream_machine_role
                }
            }
        }
    }
    #Wait between upload cycles, in seconds
    Start-Sleep -Seconds $PollingInterval
}
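Before scheduling the script, you can optionally verify from a PowerShell session that the machine role can write to the bucket, assuming you launched the image builder with the IAM role attached. The following one-liner is a quick sanity check; the key name is arbitrary.
Write-S3Object -BucketName "<s3-bucket-name>" -Key "connectivity-test.txt" -Content "test" -Region "<S3 Region>" -ProfileName appstream_machine_role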
- To add the task scheduler job for running the PowerShell script at machine start, open an admin command prompt, and run the following command. Replace <TaskName> with a meaningful identifier, and <Full-path-to-ps1-file> with the full path to the PowerShell script created in the previous step.
schtasks /create /tn "<TaskName>" /tr "powershell.exe -file \"<Full-path-to-ps1-file>\"" /sc ONSTART /ru system
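To confirm that the task is registered correctly, you can query it with the same <TaskName>:
schtasks /query /tn "<TaskName>" /v /fo LIST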
Create the AppStream 2.0 custom image, fleet, and stack
- After configuring the task scheduler job for the PowerShell script, create a custom image through Image Assistant on the image builder. For more information, see Tutorial: Create a Custom AppStream 2.0 Image by Using the AppStream 2.0 Console.
- When the custom image is available, create a stack and fleet. While creating the fleet, select the custom image created in the last step. Ensure that the fleet has internet access or access to the appropriate VPC endpoint, and assign the IAM role created in step one to the fleet. For more information, see Tutorial: Create an AppStream 2.0 Fleet and Stack.
- Finally, test by launching your applications from the stack you created. When a user connects to a fleet instance, the script starts uploading logs under the user name to the configured destination.
Log example
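Based on the key format in the script, uploaded objects are organized by user name, log, and event type. The following listing is a hypothetical example, not output from a real session:
jdoe/system/Error/01-06-2020 14:35:12-EventID-7031.JSON
jdoe/application/Warning/01-06-2020 14:36:05-EventID-1008.JSON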
Cleaning up
To avoid incurring future charges, stop and delete unused AppStream 2.0 resources and unneeded events from the S3 bucket.
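For example, assuming the AWS CLI and placeholder resource names, cleanup might look like the following. Disassociate the fleet from the stack before deleting either resource.
aws appstream stop-fleet --name <fleet-name>
aws appstream disassociate-fleet --fleet-name <fleet-name> --stack-name <stack-name>
aws appstream delete-fleet --name <fleet-name>
aws appstream delete-stack --name <stack-name>
aws s3 rm s3://<s3-bucket-name>/ --recursive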
Conclusion
In this post, you walked through the steps to push Windows Event logs from AppStream 2.0 streaming instances to S3 for storage, troubleshooting, and analytics. As a next step, you can analyze the stored logs in S3 using Amazon Athena.