AWS Cloud Operations & Migrations Blog

Streamline Automation with Outbound Webhooks for AWS Systems Manager Runbooks

Automation runbooks let you define a set of actions that automate various operations in your AWS environment. Runbooks let you configure automation workflows that run in response to events or on a scheduled cadence. These workflows commonly require integration with third-party systems, such as Slack, Jira, and ServiceNow. As of January 2022, AWS Systems Manager Automation lets you streamline automation with support for outbound webhooks. This functionality enables you to extend your runbooks to communicate with the many operational tools and applications within your organization.

Solution overview

This post shows you how a call center operator used outbound webhooks to integrate with Slack, and how to implement a similar solution in your own environment. By combining outbound webhooks with Systems Manager Automation, this customer automated a daily leaderboard that congratulates the top three call center associates at the end of each day, based on their average customer satisfaction (CSAT) scores. Before this solution, the process was manual; the automated workflow creates a fun, gamified experience for the associates while saving managers time and effort.

The workflow of the architecture that you will create is as follows:

  1. Systems Manager Automation executes your runbook daily.
  2. The Automation runbook executes a Python script that queries Amazon Athena for the top three performing call center associates for that day, based on average CSAT score.
  3. The runbook invokes a webhook for your Slack channel workflow and passes text for the workflow to be published in the associate Slack channel.
  4. The top three associates are congratulated on their performance in the Slack channel.

The first step queries Athena and gets the top three performing call center associates for that day based on customer satisfaction scores. The second step passes this information as text to a Slack webhook integration. The webhook integration triggers a Slack workflow that publishes the received text and a congratulatory message to a Slack channel for the call center associates.

The walkthrough provides instructions for implementing the call center operator’s architecture that we described at the beginning of the post.

In this walkthrough, you will:

  1. Create a workflow in Slack to receive your outbound webhook from AWS that publishes a message to a Slack channel.
  2. Create a generic webhook integration in Systems Manager Automation based on your Slack workflow.
  3. Create an Amazon Simple Storage Service (Amazon S3) bucket and Athena table and database to query associate data.
  4. Use an Automation runbook to define a Python script that executes your Athena query to get the top three performing associates and then invokes your webhook to publish the results to a Slack channel.

Prerequisites

For this walkthrough, you should have the following prerequisites:

  • An AWS account and permissions to create and execute Systems Manager Automation runbooks, create S3 buckets, and create Athena databases and tables.
  • A service role for Automation with the AmazonSSMAutomationRole AWS managed policy attached, plus permissions to read from and write to Amazon S3 and to execute Athena queries. The following example policy grants the Athena, AWS Glue, and S3 permissions used in this walkthrough; replace region, accountid, and the bucket names with your own values (a sketch of the role's trust policy follows the example). Always follow the principle of least privilege when creating AWS Identity and Access Management (IAM) roles and policies.
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "glue:BatchCreatePartition",
                "glue:GetPartitions",
                "glue:UpdateTable",
                "glue:CreatePartition",
                "glue:UpdatePartition",
                "glue:UpdateDatabase",
                "glue:CreateTable",
                "glue:GetTables",
                "glue:BatchGetPartition",
                "glue:GetDatabases",
                "glue:GetTable",
                "glue:GetDatabase",
                "glue:GetPartition"
            ],
            "Resource": [
                "arn:aws:glue:region:accountid:database/associate_data_db",
                "arn:aws:glue:region:accountid:catalog",
                "arn:aws:glue:region:accountid:table/associate_data_db/associate_data_table"
            ]
        },
        {
            "Effect": "Allow",
            "Action": [
                "athena:UpdateDataCatalog",
                "athena:GetTableMetadata",
                "athena:StartQueryExecution",
                "athena:GetQueryResults",
                "athena:GetDatabase",
                "athena:GetDataCatalog",
                "athena:DeletePreparedStatement",
                "athena:DeleteNamedQuery",
                "athena:GetNamedQuery",
                "athena:ListQueryExecutions",
                "athena:GetWorkGroup",
                "athena:UpdateNamedQuery",
                "athena:StopQueryExecution",
                "athena:GetPreparedStatement",
                "athena:ListTagsForResource",
                "athena:CreateNamedQuery",
                "athena:GetQueryExecution",
                "athena:BatchGetNamedQuery",
                "athena:ListTableMetadata",
                "athena:BatchGetQueryExecution"
            ],
            "Resource": [
                "arn:aws:athena:region:accountid:datacatalog/AwsDataCatalog",
                "arn:aws:athena:region:accountid::workgroup/primary"
            ]
        },
        {
            "Effect": "Allow",
            "Action": "athena:ListEngineVersions",
            "Resource": "*"
        },
        {
            "Effect": "Allow",
            "Action": [
                "s3:GetObject",
                "s3:ListBucket",
                "s3:PutObject",
                "s3:GetBucketLocation"
            ],
            "Resource": [
                "arn:aws:s3:::athena-query-results-identifier/*",
                "arn:aws:s3:::athena-query-results-identifier ",
                "arn:aws:s3:::associate-data-identifier/*",
                "arn:aws:s3:::associate-data-identifier"
            ]
        }
    ]
}
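
The service role must also trust the Systems Manager service so that Automation can assume it. If you create the role yourself rather than through the console, a minimal sketch of the trust policy looks like the following:

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": {
                "Service": "ssm.amazonaws.com"
            },
            "Action": "sts:AssumeRole"
        }
    ]
}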

Create a workflow in Slack

You will need a Slack channel for this section; you can use an existing channel or create a new one for this walkthrough.

To create the workflow:

  1. Log in to Slack and navigate to your Slack channel. You will integrate your webhook with this channel.
  2. Open the Workflow Builder, and select Create.
  3. Name your workflow. For this walkthrough, the name used for the workflow is top3associates.
  4. Select Webhook as the way to start the workflow, and select Add Variable.
  5. For the Key, type in text, ensure that the Data type is Text, and then select Next.
  6. Now that you have created your workflow, select Add Step and select Add next to Send a message.
  7. From the drop-down under Send this message to, select the channel you navigated to in Step 1.
  8. Select Insert a variable below the Message text box, and select text.
  9. Above the text variable, copy and paste the following text:
:alphabet-white-c::alphabet-white-o::alphabet-white-n::alphabet-white-g::alphabet-white-r::alphabet-white-a::alphabet-white-t::alphabet-white-u::alphabet-white-l::alphabet-white-a::alphabet-white-t::alphabet-white-i::alphabet-white-o::alphabet-white-n::alphabet-white-s:
To our associates with the top 3 customer satisfaction scores!
  10. Before selecting Save, your step should be similar to the following:

The Send a message step should show your selected channel, with the congratulatory text above the inserted text variable in the Message text box.

  11. Select Save, and then publish your workflow.
  12. Copy the provided webhook URL, which receives HTTP POST requests, to your clipboard; you will use it in the next section.
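
Before wiring the webhook into Systems Manager, you can optionally verify it from your workstation. The following minimal Python sketch, which uses only the standard library, posts a test message to the workflow; the URL is a placeholder for the one you just copied:

import json
import urllib.request

# Placeholder: replace with the webhook URL copied from Workflow Builder.
WEBHOOK_URL = 'https://hooks.slack.com/workflows/your-webhook-path'

# The JSON body must use the variable key defined in the workflow ('text').
body = json.dumps({'text': 'Test message from the walkthrough'}).encode('utf-8')

request = urllib.request.Request(
    WEBHOOK_URL,
    data=body,
    headers={'Content-Type': 'application/json'},
)

# Slack returns HTTP 200 if the workflow accepted the payload.
with urllib.request.urlopen(request) as response:
    print(response.status)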

Create an integration within Systems Manager Automation

To create an integration within Systems Manager Automation, first log in to your AWS account.

  1. Navigate to Systems Manager by searching for the service in the AWS Management Console.
  2. Under Change Management, select Automation.
  3. Select Integrations > Add integration > Generic Webhook.
  4. Give your integration a Name, paste your webhook URL into the associated field, and select Add integration. For this walkthrough, the integration name is associate-slack-webhook.

The two fields, Integration Name and Webhook URL, should be filled out appropriately; the remaining fields can be left blank or kept at their defaults.

Create the S3 bucket and upload call center associate data

Use an S3 bucket to store your call center associate data.

  1. Create an S3 bucket using the Amazon S3 console. For this walkthrough, our bucket name is associate-data-1234.
  2. Copy and paste the following fictitious data into a text editor, and save the file locally as associate_data.csv.
168875,2022-03-30 10:29:24,4.5,Inbound
120091,2022-03-29 09:33:45,4.72,Inbound
120091,2022-03-30 16:11:55,3.3,Outbound
168875,2022-03-28 12:12:02,2.1,Inbound
168875,2022-03-30 12:52:14,4.9,Inbound
171111,2022-03-28 12:32:15,5,Outbound
564332,2022-03-30 11:15:14,5,Outbound
564332,2022-03-30 13:02:04,2.7,Inbound
721101,2022-03-29 17:01:05,3,Inbound
721101,2022-03-30 11:33:06,2,Outbound
721101,2022-03-30 06:55:07,2,Outbound
564332,2022-03-29 07:56:22,5,Outbound
564332,2022-03-30 13:34:11,4.1,Inbound
  3. Upload the associate_data.csv file to your S3 bucket.
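
If you prefer to script the upload, a minimal boto3 sketch, assuming the bucket name associate-data-1234 from step 1, looks like this:

import boto3

s3 = boto3.client('s3')

# Upload the local CSV file to the bucket created in step 1.
s3.upload_file('associate_data.csv', 'associate-data-1234', 'associate_data.csv')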

Add database and table in Athena

Configure Athena so that you can query the data that you uploaded to Amazon S3.

  1. Navigate to the Athena console within your AWS account.
  2. Open the Query editor.
  3. Select the Create drop-down next to Tables and views, and select Create a table from data source > S3 bucket data.
  4. Enter a Table name, and under Database configuration, select Create a database. For this walkthrough, the Table name is associate_data_table, and the database name is associate_data_db.
  5. Under Dataset, select the location of the S3 bucket that you created in the previous section.
  6. Select CSV for Data format.
  7. Under Column details, select Bulk add columns, copy and paste in the following, and select Add.

associateid, datetime, csat, calltype

  8. For the columns associateid, datetime, csat, and calltype, set the Column types to string, timestamp, float, and string, respectively.
  9. Select Create Table.
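
The console wizard generates a CREATE TABLE statement from these settings. For reference, a roughly equivalent DDL statement, assuming the bucket name associate-data-1234, is sketched below:

CREATE EXTERNAL TABLE associate_data_table (
  associateid string,
  datetime timestamp,
  csat float,
  calltype string
)
ROW FORMAT DELIMITED
FIELDS TERMINATED BY ','
LOCATION 's3://associate-data-1234/';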

Create Automation document

Now you will bring everything together with an Automation document so that you can execute your workflow seamlessly.

  1. Navigate to the Systems Manager console and select Automation > Execute Automation > Create Document.
  2. Give your document a name. For this walkthrough, the document name is top3-associates-automation.
  3. Copy and paste the ARN of the service role for Automation that you created in the Prerequisites section of this post.
  4. Within the document builder, name your first step Run_Script, and for Action type, select Run a script. (Make sure that your step name matches exactly, as you will reference it in a later step.)
  5. Select Python 3.8 as the Runtime, and name the Handler top3_associates.
  6. Copy and paste the following Python script into the Script section:
import boto3
from time import sleep

# Instantiate a boto3 client to interact with the Athena service.
client = boto3.client('athena')

def top3_associates(events, context):

    # The query below selects the top 3 associates by daily average CSAT score.
    query = '''SELECT associateid, round(avg(csat), 2) as avg_csat FROM associate_data_table
    WHERE cast(datetime as date) = cast('2022-03-30' as date)
    GROUP BY associateid
    ORDER BY avg_csat DESC LIMIT 3'''
    # Change the WHERE clause above to
    # WHERE cast(datetime as date) = current_date
    # when your use case is to run the script daily against the current day's data.

    # Begin query execution.
    response = begin_query(query)
    query_execution_id = response['QueryExecutionId']

    # Wait for the query execution to complete, then fetch the results.
    results = get_results(query_execution_id)

    # Format the result rows into a single string.
    formatted_results = format_results(results)

    return formatted_results

def begin_query(query_string):

    # OutputLocation is the Athena query results bucket; replace it with your own.
    response = client.start_query_execution(QueryString=query_string,
        QueryExecutionContext={'Database': 'associate_data_db'},
        ResultConfiguration={'OutputLocation': 's3://athena-query-results-351617/'})

    return response

def get_results(query_execution_id):

    # Until the query has completed executing or stopped for another reason,
    # print out the state of the query.
    print('QueryExecutionStatus')
    print('------------------------------')

    describe_response = client.get_query_execution(QueryExecutionId=query_execution_id)
    query_execution_state = describe_response['QueryExecution']['Status']['State']
    print(query_execution_state)

    while query_execution_state not in ('FAILED', 'CANCELLED', 'SUCCEEDED'):
        describe_response = client.get_query_execution(QueryExecutionId=query_execution_id)
        query_execution_state = describe_response['QueryExecution']['Status']['State']
        print(query_execution_state)
        # For the purposes of this example we use a fixed backoff.
        # In a production system, use exponential backoff where possible.
        sleep(1)

    # get_query_results fails for FAILED or CANCELLED queries, so surface
    # a clear error instead.
    if query_execution_state != 'SUCCEEDED':
        raise Exception('Athena query finished in state: ' + query_execution_state)

    # Get the results of the query.
    results = client.get_query_results(QueryExecutionId=query_execution_id)

    return results

def format_results(results):
    # Get all rows except the header.
    rows = results['ResultSet']['Rows'][1:]

    # Empty string for storing the results that will be fed to the webhook.
    associate_string = ""

    # Combine the results into a single string listing the associate ID and
    # average CSAT score for each of the top 3 associates.
    for row in rows:
        associate_string = associate_string + "Associate " + row['Data'][0]['VarCharValue'] + " (Average Customer Satisfaction: " + row['Data'][1]['VarCharValue'] + ")\n"

    return associate_string

  7. Under Outputs, add an output with Name set to payload, Selector set to $.Payload, and Type set to String.

  8. Select Add step.
  9. Name the second step Invoke_Webhook.
  10. For Action type, select Invoke a webhook integration.
  11. Under IntegrationName, select the name of the integration that you created in the Create an integration within Systems Manager Automation section of this walkthrough.
  12. Under Additional Inputs, select Body for the Input name, and paste the following into the Input value field:
{ "text" : "{{Run_Script.payload}}"}
  13. Select Create Automation.

Execute Automation document to invoke webhook

Now that you have created your architecture, you can execute your Automation document and see the results in your Slack channel.

  1. Navigate to the Systems Manager console and select Automation > Execute Automation > Owned by me.
  2. Select top3-associates-automation, and then select Next > Execute.
  3. If your execution succeeds, then the Execution summary will be similar to the following:

The summary lists the Overall status as Success, and both executed steps, Run_Script and Invoke_Webhook, also show a Success status.

  4. In addition, your Slack channel should receive a message similar to the following:

The message text should contain the text “Congratulations to our associates with the top 3 customer satisfaction scores!” and the following three associate IDs and their associated average customer satisfaction scores.

  5. As a final step, you can create a State Manager association that executes your Automation runbook daily; a boto3 sketch of this follows the list. If your use case requires running the Python script daily, see the comments in the script for modifying the query to use the current day's data.
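
For example, the following minimal boto3 sketch starts a one-off execution and then creates a daily State Manager association. The association name, schedule, and role ARN are placeholder assumptions; adjust them for your environment:

import boto3

ssm = boto3.client('ssm')

# One-off execution of the runbook, equivalent to Execute Automation in the console.
ssm.start_automation_execution(DocumentName='top3-associates-automation')

# State Manager association that runs the runbook daily at 23:00 UTC.
# The AutomationAssumeRole parameter applies only if your document declares it;
# the role ARN below is a placeholder.
ssm.create_association(
    AssociationName='top3-associates-daily',
    Name='top3-associates-automation',
    ScheduleExpression='cron(0 23 * * ? *)',
    Parameters={'AutomationAssumeRole': ['arn:aws:iam::123456789012:role/AutomationServiceRole']},
)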

Cleaning up

To avoid incurring future charges, delete the resources that you created in this walkthrough: the Automation document, the webhook integration, the Athena table and database, the two S3 buckets (associate data and Athena query results), the State Manager association if you created one, and the Slack workflow.
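
If you prefer to clean up the AWS-side resources programmatically, the following boto3 sketch does so. The bucket names are the placeholders used earlier in this post; the Slack workflow and webhook integration must be deleted from Workflow Builder and the Integrations tab, respectively:

import boto3

ssm = boto3.client('ssm')
glue = boto3.client('glue')
s3 = boto3.resource('s3')

# Delete the Automation document.
ssm.delete_document(Name='top3-associates-automation')

# Delete the Athena table and database (both live in the AWS Glue Data Catalog).
glue.delete_table(DatabaseName='associate_data_db', Name='associate_data_table')
glue.delete_database(Name='associate_data_db')

# Empty and delete the S3 buckets; both names are placeholders.
for bucket_name in ['associate-data-1234', 'athena-query-results-351617']:
    bucket = s3.Bucket(bucket_name)
    bucket.objects.all().delete()
    bucket.delete()

# If you created a State Manager association, delete it as well, for example:
# ssm.delete_association(AssociationId='your-association-id')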

Conclusion

This post highlights an exciting new feature of Systems Manager Automation: support for outbound webhooks. Using this feature, you can streamline automation within your organization and integrate with any external application or system that supports generic webhooks. The possibilities are vast, but in this post we focused on a customer who creatively used this feature to build an automated leaderboard workflow for their call center associates. As a result, they saved their managers time and effort while creating a fun, gamified experience for the associates.

I encourage you to extend outbound webhook integration further within your organization and explore integrating with common applications, such as Jira and ServiceNow. Webhook support within Systems Manager Automation means that you can continue minimizing the tooling required for third-party integration. I am excited to see the creative ways you'll use this functionality in the future.

Author:

Ryan Lempka

Ryan Lempka is a Senior Solutions Architect at AWS, where he helps his customers work backwards from business objectives to develop solutions on AWS. He has deep experience in business strategy, IT systems management, and data science. Ryan is dedicated to being a lifelong learner, and enjoys challenging himself every day to learn something new.