AWS Government, Education, & Nonprofits Blog

Integrating Amazon AppStream 2.0 with your Learning Management System

Amazon AppStream 2.0 is a fully managed application streaming service that lets students access the applications they need for class through a browser. It doesn’t matter whether they’re using Macs, Chromebooks, or PCs, or whether they’re in the classroom, the library, a cafe, or at home. For example, Cornell University uses AppStream 2.0 to deliver CAD/CAM applications to its engineering students without the need for computer labs.

Although AppStream 2.0 makes it simple for students to access desktop applications, we know many schools organize their courses in a learning management system (LMS), and there's a lot more to learning than desktop applications. In this blog post, we walk through how to integrate AppStream 2.0 with your LMS, so your students can access everything they need, including their desktop applications, through one portal.

We'll use AWS Chalice and Learning Tools Interoperability (LTI) to integrate Amazon AppStream 2.0 with your LMS. We'll build an AWS Lambda function that validates the LTI request and returns an AppStream 2.0 streaming URL for the lab.

Let’s begin with a quick overview of the flow:

  1. The student clicks a link in the LMS to load a lab.
  2. The LMS generates an LTI OAuth 1.0 signature.
  3. The user is redirected to AWS Lambda.
  4. AWS Lambda validates the signature and generates a streaming URL.
  5. The user is redirected to AppStream 2.0.

The remainder of this post focuses on the implementation of step four.
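To make the handoff concrete, here is an illustrative sketch of the form parameters an LTI 1.x launch POST carries in step 2. All values below are made up for the example:

```python
# Illustrative LTI 1.x basic launch parameters (values are placeholders).
# The oauth_* fields are signed by the LMS; the custom_* fields tell our
# Lambda function which AppStream 2.0 stack and fleet to use.
launch_params = {
    "lti_message_type": "basic-lti-launch-request",
    "lti_version": "LTI-1p0",
    "resource_link_id": "lab-1",
    "lis_person_name_full": "Jane Doe",     # used as the AppStream 2.0 UserId
    "oauth_consumer_key": "AppStream",      # the consumer key both sides share
    "oauth_signature_method": "HMAC-SHA1",
    "oauth_signature": "computed-by-the-LMS",
    "custom_stack": "LabStack",             # hypothetical stack name
    "custom_fleet": "LabFleet",             # hypothetical fleet name
}
```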

Creating the Lambda function

Let’s begin by creating a new Chalice project using the chalice command line.

Before you can deploy an application, be sure you have credentials configured. If you have previously configured your machine to run Boto3 (the AWS SDK for Python) or the AWS CLI, then you can skip this step. If this is your first time configuring credentials for AWS, you can follow these steps to quickly get started.
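If you have not configured credentials before, a minimal ~/.aws/config is enough to get started; the values below are placeholders, and us-east-1 is only an assumed example region:

```ini
[default]
aws_access_key_id=YOUR_ACCESS_KEY_ID
aws_secret_access_key=YOUR_SECRET_ACCESS_KEY
region=us-east-1
```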

Note: You might want to create a virtual environment to complete the remaining tasks in this post.

First, install chalice and create a new project.

$ pip install chalice boto3
$ chalice new-project appstream-lti && cd appstream-lti
$ cat app.py

As you can see, this creates a simple hello world application with a few sample functions.

Next, we are going to install pyLTI, a Python implementation of the LTI specification maintained by the MIT Office of Digital Learning. Note that I am installing pyLTI into the project's vendor directory using the "-t" option. This ensures the library is included in the package that Chalice uploads to AWS Lambda.

$ pip install pylti -t ./vendor/

Then, we can open app.py in our favorite editor and replace the hello world application with the following:

import boto3
from chalice import Chalice, Response
from pylti.chalice import lti
from urllib.parse import parse_qs

app = Chalice(app_name='appstream-lab')

@app.route('/', methods=['POST'], content_types=['application/x-www-form-urlencoded'])
@lti(request='initial', app=app)
def index(lti=lti):
    client = boto3.client('appstream')
    # The LTI launch arrives as a form-encoded POST; parse out its parameters.
    params = parse_qs(app.current_request.raw_body.decode())
    # Generate a temporary streaming URL for this user's stack and fleet.
    response = client.create_streaming_url(
        StackName=params['custom_stack'][0],
        FleetName=params['custom_fleet'][0],
        UserId=lti.name
    )
    url = response['StreamingURL']
    # Redirect the browser, with a fallback link in case the redirect is ignored.
    body = "<p><a href='%s'>If you are not automatically redirected, click here.</a></p>" % url
    return Response(body=body, status_code=301, headers={'Location': url})

Let’s walk through the implementation briefly.

  1. The @app.route decorator tells Chalice which requests to send to our function.
  2. The @lti decorator tells pyLTI to create a new session and validate the signature.
  3. The call to create_streaming_url generates a pre-signed URL for AppStream 2.0.
  4. We return an HTTP 301 to redirect the user to the AppStream 2.0 streaming URL.
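As a quick illustration of the parsing in our handler, Python 3's parse_qs turns the form-encoded body into a dict of lists (the body values below are samples only):

```python
from urllib.parse import parse_qs

# A made-up form-encoded launch body, like the one our handler decodes.
raw_body = "custom_stack=LabStack&custom_fleet=LabFleet&lis_person_name_full=Jane%20Doe"

params = parse_qs(raw_body)
stack = params["custom_stack"][0]          # parse_qs returns a list for every key
fleet = params["custom_fleet"][0]
name = params["lis_person_name_full"][0]   # '%20' is decoded to a space
```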

Before we deploy, we need to set up a shared secret for the LTI OAuth signature. We are going to store the secret in an environment variable. Chalice looks for environment variables prefixed with "CONSUMER_KEY_SECRET_": the text after the prefix is the consumer key, and the variable's value is the shared secret. The key and secret can be anything you want, as long as the LMS and the Lambda function use the same values.
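The naming convention can be sketched as follows. This is an illustration of the pattern, not pyLTI's actual implementation:

```python
import os

# Simulate the variable Chalice sets from .chalice/config.json.
os.environ["CONSUMER_KEY_SECRET_AppStream"] = "SuperSecretString"

PREFIX = "CONSUMER_KEY_SECRET_"
# Build a map of consumer key -> shared secret from the environment.
consumers = {
    name[len(PREFIX):]: {"secret": value}
    for name, value in os.environ.items()
    if name.startswith(PREFIX)
}
```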

Open .chalice/config.json in your favorite editor and add the key and secret. For example, the config below is using a key of AppStream and a secret of SuperSecretString.

{
  "stages": {
    "dev": {
      "api_gateway_stage": "api"
    }
  },
  "environment_variables": {
    "CONSUMER_KEY_SECRET_AppStream": "SuperSecretString"
  },
  "version": "2.0",
  "app_name": "appstream-lab"
}

Finally, we can deploy the function to AWS Lambda using the chalice deploy command. Take note of the REST API URL returned by this command; you will need it when you configure your LMS.

$ chalice deploy

Configuring your LMS

The final step is to configure your LMS. I'm going to use Moodle in this example, but you can use any LMS that supports LTI. The following screenshot shows the Moodle configuration screen.

  • Tool URL is the URL returned by the chalice deploy command earlier. It must include the trailing slash.
  • Consumer Key and Shared Secret must match the environment variable you created earlier.
  • Custom parameters must include the name of the AppStream 2.0 Stack and Fleet to launch.
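For example, Moodle accepts custom parameters one per line as name=value pairs. The stack and fleet names below are placeholders, and depending on your LMS you may need to enter the custom_ prefix explicitly:

```
stack=MyAppStreamStack
fleet=MyAppStreamFleet
```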

Conclusion

That’s it! Your students can now access the applications they need for class through their LMS, including their desktop applications. You can demo some sample applications at no cost, and visit our Getting Started guide to launch in 10 steps.