AWS for M&E Blog

2020 Resolution: Add user-generated content to your applications

Many AWS customers have started to allow user-generated content (UGC) within their applications to enhance users’ ability to connect with each other and express themselves individually. This video content usually takes the form of short “stories” (less than one minute) told in a personal way, showcasing things like products, clothing, or real estate. Allowing UGC increases the brand value of AWS customers as they focus on the end-user experience and a more dynamic platform for their users.

There are several technical challenges to giving users the ability to submit UGC using your application or service. In this post, I’ll walk through the steps to create a simple and secure workflow to accept user-generated videos into your Amazon Simple Storage Service (Amazon S3) bucket.

This tutorial assumes users will submit video from mobile applications using the PUT method, with files that are smaller than 5 GB. I built this workflow using MacOS, so certain commands will need to be modified for Windows users.

Step-by-Step Tutorial

In this post, I’ll walk through the following steps:

  1. Create an S3 bucket with Transfer Acceleration enabled. (I’ll use the US West (Oregon) region.)
  2. Create a policy in AWS Identity and Access Management (IAM) for a role to use.
  3. Create a role in IAM that AWS Lambda will use to pre-sign URLs to accept .mp4 files.
  4. Create an S3 bucket policy.
  5. Create a Lambda function to generate a pre-signed URL and attach the role.
  6. Test the function using cURL (or Putty on Windows) to upload a video through the command line.
  7. Create an Amazon Cognito User Pool & test user.
  8. Create an API Gateway Authorizer and GET method.
  9. Ensure the API is authenticating with the test user and is not open to the public.

If you want to skip ahead and not go through each step one by one, download this CloudFormation template to spin up this solution in your AWS account.

Overview of the UGC upload process you will build in this tutorial

Step 1: Create an S3 bucket with Transfer Acceleration

Ensure the correct region is selected (US West (Oregon) in this example). Create a new S3 bucket to accept the user-generated videos. The bucket permissions must remain at their defaults so that the bucket contents are not publicly accessible.

Enable Transfer Acceleration under the bucket’s Properties tab. There is a cost associated with enabling this feature, but if you are accepting content globally, this functionality will provide a noticeable improvement for your users.

Take note that the endpoint changes when this feature is enabled.
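As a point of reference, the accelerated endpoint replaces the regional S3 hostname with `s3-accelerate.amazonaws.com`. A quick sketch of the difference (the bucket name and region here are placeholders):

```python
# Sketch: standard vs. Transfer Acceleration endpoints for a bucket.
# "my-ugc-bucket" and "us-west-2" are placeholder values for illustration.
def s3_endpoints(bucket: str, region: str) -> dict:
    return {
        "standard": f"https://{bucket}.s3.{region}.amazonaws.com",
        "accelerated": f"https://{bucket}.s3-accelerate.amazonaws.com",
    }

print(s3_endpoints("my-ugc-bucket", "us-west-2"))
```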

Step 2: Create a policy in IAM using the following JSON example

In the following JSON code, replace BUCKET_NAME_HERE with your new bucket name and ACCOUNT_HERE with your AWS Account ID number. The Lambda policy needs the S3 PutObject permission because Lambda assumes a role with this policy attached when it signs the request. If Lambda lacks this permission, users receive a 403 error when they try to PUT an object to the signed URL.

    {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Action": [
                    "logs:CreateLogGroup"
                ],
                "Resource": [
                    "arn:aws:logs:us-west-2:ACCOUNT_HERE:*"
                ],
                "Effect": "Allow"
            },
            {
                "Action": [
                    "logs:CreateLogStream",
                    "logs:PutLogEvents"
                ],
                "Resource": [
                    "arn:aws:logs:us-west-2:ACCOUNT_HERE:log-group:/aws/lambda/*"
                ],
                "Effect": "Allow"
            },
            {
                "Action": [
                    "s3:PutObject"
                ],
                "Resource": [
                    "arn:aws:s3:::BUCKET_NAME_HERE/*"
                ],
                "Effect": "Allow"
            }
        ]
    }

Next, navigate to IAM. Click Policies on the left and choose Create policy. Then, choose the JSON tab at the top of the page. Paste in the preceding JSON code and choose Review policy. On the review page, enter the following information:

  • Name: presign_url_put_policy
  • Description: This policy has strict permissions to only put to a single S3 bucket.

Click the Create policy button.

Step 3: Create a role & attach the policy

When creating the role, select Lambda as the service that will use the role.

Under Filter policies, search for the policy you just created “presign_url_put_policy”. Selecting the checkbox next to the policy will attach it to the role. Click the Next: Tags button, then Next: Review. Name the role “media-ugc-role” and click Create role.
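For reference, selecting Lambda as the service that will use the role generates the role's trust policy for you. It is the standard Lambda trust relationship:

```json
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": { "Service": "lambda.amazonaws.com" },
            "Action": "sts:AssumeRole"
        }
    ]
}
```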

Step 4: Create a bucket policy

In S3, go to the Permissions tab of your S3 bucket and choose Bucket Policy. Using the following JSON code, you will ALLOW the role you created to PUT mp4 files, and DENY all users from uploading anything that is not an mp4 file. Replace ACCOUNT_HERE with your AWS Account ID and BUCKET_NAME_HERE with your bucket name in two locations.

    {
        "Version": "2012-10-17",
        "Id": "Policy1464968545158",
        "Statement": [
            {
                "Sid": "Stmt1464968483619",
                "Effect": "Allow",
                "Principal": {
                    "AWS": "arn:aws:iam::ACCOUNT_HERE:role/media-ugc-role"
                },
                "Action": "s3:PutObject",
                "Resource": [
                    "arn:aws:s3:::BUCKET_NAME_HERE/*.mp4"
                ]
            },
            {
                "Sid": "Stmt1464968483620",
                "Effect": "Deny",
                "Principal": "*",
                "Action": "s3:PutObject",
                "NotResource": [
                    "arn:aws:s3:::BUCKET_NAME_HERE/*.mp4"
                ]
            }
        ]
    }

Step 5: Create a Lambda function to generate a pre-signed URL

Confirm you are in the same region as your bucket, then create a Lambda function called PresignedUrlGenerator using the Author from scratch option.

Add your environment variables: FILE_EXTENSION, REGION, and BUCKET_NAME.
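For example (the values shown are placeholders; BUCKET_NAME must match the bucket from Step 1):

```
FILE_EXTENSION   mp4
REGION           us-west-2
BUCKET_NAME      BUCKET_NAME_HERE
```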

Under Permissions, make sure Lambda is using the correct role. Select Use an existing role and select the name of the role you created from the dropdown list.

With Python 3.7 selected, paste the following Python code into the function's code editor.

import boto3
import os
import json
import uuid

from botocore.client import Config

def lambda_handler(event, context):
    response = create_presigned_url(os.environ['BUCKET_NAME'], make_random_string() + "." + os.environ['FILE_EXTENSION'])
    return json.loads(response)

# ==================================================================================
# Function: create_presigned_url
# Purpose: Generate a presigned URL to upload an S3 object using PUT and Accelerated Transfer
# Parameters: 
#   bucket_name: string
#   s3_key: string
#   expiration: Time in seconds for the presigned URL to remain valid
# Returns:
#   Presigned URL as string. If error, returns None.
# ================================================================================== 
def create_presigned_url(bucket_name, s3_key, expiration=1200):
    s3 = boto3.client('s3', region_name=os.environ['REGION'],
                      config=Config(signature_version='s3v4',
                                    s3={'addressing_style': 'virtual',
                                        'use_accelerate_endpoint': True}))
    try:
        url = s3.generate_presigned_url('put_object',
                                        Params={'Bucket': bucket_name, 'Key': s3_key},
                                        ExpiresIn=expiration)
        json_output = '{"url":"' + url + '"}'
        return json_output
    except Exception as e:
        print("EXCEPTION: When getting presigned url > " + str(e))
        return None
# ==================================================================================
# Function: make_random_string
# Purpose: Generates a random GUID to use as the S3 object key.
# Parameters: 
#   none
# ==================================================================================
def make_random_string():
    return str(uuid.uuid4())

Next, create a test event in your Lambda function and run the test just to make sure the code is working. Select Test in the upper right of the page. In the pop-up window, enter an event name like “TestEvent” and click Create. Select Test again to run the function.

When it runs, the output should be a JSON snippet including a long URL. Copy the URL.
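The output is a small JSON object with a single url field. A client application would parse it like this (the response body below is a hypothetical, truncated example; a real presigned URL includes a much longer signed query string):

```python
import json

# Hypothetical Lambda test output; the signed query string is truncated
# for readability.
response_body = '{"url": "https://BUCKET_NAME_HERE.s3-accelerate.amazonaws.com/KEY.mp4?X-Amz-Algorithm=AWS4-HMAC-SHA256"}'

# Extract the presigned URL to hand to the uploading client.
upload_url = json.loads(response_body)["url"]
print(upload_url)
```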

Step 6: Test the function by uploading an MP4 file using cURL

Now you can upload a file to test the solution.

Replace the URL in the following command with the one you generated from Lambda. In Terminal, navigate to the directory of your test file and run the command.

curl --upload-file testfile.mp4 'PRESIGNED_URL_HERE'

You should now be able to see the MP4 file in the S3 bucket.

Step 7: Create a user pool in Amazon Cognito

In the Console, navigate to Amazon Cognito and choose Manage User Pools. Select Create a user pool.

For Pool name, use “Media-UGC-Pool”. Click the Review Defaults button. The default settings can be used.

Now that your pool has been created, select Users and groups under General settings. Create a test user with the name “testuser” and generate a temporary password. Later in this step, you will use the AWS CLI to replace that temporary password with a permanent one.

Under App clients, create a new App client using the name “Media-UGC-App-Client” and set the Refresh token expiration (days) to 30. Match the following settings, making sure that Generate client secret and ALLOW_ADMIN_USER_PASSWORD_AUTH are not checked. Ensure the ALLOW_CUSTOM_AUTH, ADMIN_NO_SRP_AUTH, and ALLOW_USER_PASSWORD_AUTH settings are checked.

There may be additional options available under Auth Flows Configuration not depicted in the following screenshot. Those options can remain unchecked.

You now have an App Client ID to emulate an application.

Back in Terminal, use the AWS Command Line Interface (AWS CLI) to update the temporary password of your Cognito user. The Pool ID is available under General Settings in Cognito.

IMPORTANT: Use an up-to-date version of the AWS CLI to run the following code after replacing POOL_ID_HERE. This step will not function with older versions of the CLI.

aws cognito-idp admin-set-user-password --user-pool-id POOL_ID_HERE --username testuser --password Test1234* --permanent

If this worked, the Account Status for your test user should read CONFIRMED.

If you need to update your CLI, reference the Installing the AWS CLI documentation topic, or run the following in Terminal:

sudo pip install awscli --force-reinstall --upgrade

Step 8: Create an API Gateway method to call the Lambda function

From the Amazon API Gateway service, create a new REST API that is edge optimized. If this is your first time creating an API, you may see a splash screen. If so, choose the Build button under the REST API section.

Under Choose the protocol, select REST. Under Create new API, select New API. Enter the following settings:

  • API Name: Media-UGC-API
  • Description: Presigned URLs
  • Endpoint Type: Edge optimized

Next, create an Authorizer by selecting Authorizers from the list under API: Media-UGC-API and then selecting the Create New Authorizer button. Choose Cognito as the type, name the authorizer “Cognito_Group”, select the Media-UGC-Pool user pool, and set the Token Source to “Authorization”.

Under your new API, select Resources. Click the Actions tab and choose Create Method to generate a GET method.

Set up the GET method to match the us-west-2 region and your Lambda Function. Ensure Use Lambda Proxy Integration remains unchecked. Check the Use Default Timeout option.

Add the Authorization header as required in Method Request by clicking on the word GET and selecting the Method Request section. Expand the HTTP Request Headers subsection and type in “Authorization.” Click the checkmark on the right to confirm the creation of the header. A Required checkbox will then appear. Toggle the Required checkbox on.

Under the GET settings, choose the Cognito_Group for Authorization.

Your GET method should match the following screenshot. In the screenshot, Authorization reads COGNITO_USER_POOLS and API Key reads “Not required.”

Next, select the Actions tab and choose Deploy API to create a new stage. In the popup, enter the Stage name as “prod” and Stage description as “Production.” For Deployment description, enter “Production V1.” Click the Deploy button.

Next, you can test the Invoke URL in any web browser on your local machine to confirm the data cannot be accessed because you have not authenticated yourself. The return message should read “Unauthorized.”

Step 9: Authenticate Your End Users

The final step is to test the workflow by signing in as the test user and receiving valid JSON back from your Lambda function. In this example, I am using cURL, but you can achieve the same thing with a GUI client like Rested.

Create a file on your desktop called aws-auth-data.json with the following code. Update ClientId with the Cognito User Pool App Client ID that you generated in Step 7.

   {
      "AuthParameters" : {
         "USERNAME" : "testuser",
         "PASSWORD" : "Test1234*"
      },
      "AuthFlow" : "USER_PASSWORD_AUTH",
      "ClientId" : "3n0f5nl5qea09l98th09sk7abe"
   }

Open up Terminal and navigate to your Desktop.

curl -X POST --data @aws-auth-data.json \
-H 'X-Amz-Target: AWSCognitoIdentityProviderService.InitiateAuth' \
-H 'Content-Type: application/x-amz-json-1.1' \
https://cognito-idp.us-west-2.amazonaws.com/

When this is successful, you will receive a large JSON response with your Tokens. Carefully select the token called IdToken (this example does not use AccessToken).
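If you want a quick local sanity check that you grabbed the right token, you can decode the JWT payload without verifying the signature. This is not part of the tutorial workflow; jwt_payload and fake_jwt below are illustrative helpers, and a real IdToken comes from the InitiateAuth response under AuthenticationResult -> IdToken:

```python
import base64
import json

def jwt_payload(token: str) -> dict:
    """Decode the (unverified) payload segment of a JWT."""
    payload_b64 = token.split(".")[1]
    payload_b64 += "=" * (-len(payload_b64) % 4)  # restore stripped base64 padding
    return json.loads(base64.urlsafe_b64decode(payload_b64))

def fake_jwt(payload: dict) -> str:
    # Build a stand-in token (header.payload.signature) purely for illustration.
    def seg(d):
        return base64.urlsafe_b64encode(json.dumps(d).encode()).decode().rstrip("=")
    return ".".join([seg({"alg": "RS256"}), seg(payload), "signature"])

token = fake_jwt({"token_use": "id", "exp": 1573669573})
print(jwt_payload(token)["token_use"])  # -> id
```

A real Cognito IdToken carries token_use set to "id", which is how you can tell it apart from the AccessToken ("access").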

Now you can use the valid IdToken for the next hour to generate responses from Lambda. Replace INVOKE_URL_HERE in the following command with your API's Invoke URL from Step 8.

curl -H 'Accept: application/json' -H "Authorization: eyJraWQiOiJmV3IwNXZvRmZoVk81ZStyWGdnWFdyRUdQUlpQRE52b2orOERGSDNTMTlvPSIsImFsZyI6IlJTMjU2In0.eyJzdWIiOiJkNDg1ZDRlMS1jZGEwLTRlYzQtODI0OS1mNjAzNGZhM2RiOTUiLCJlbWFpbF92ZXJpZmllZCI6dHJ1ZSwiaXNzIjoiaHR0cHM6XC9cL2NvZ25pdG8taWRwLnVzLXdlc3QtMi5hbWF6b25hd3MuY29tXC91cy13ZXN0LTJfdFU2YVRBbjN6IiwicGhvbmVfbnVtYmVyX3ZlcmlmaWVkIjp0cnVlLCJjb2duaXRvOnVzZXJuYW1lIjoiYnJpYW5iZWRhcmQiLCJhdWQiOiIzbjBmNW5sNXFlYTA5bDk4dGgwOXNrN2FiZSIsImV2ZW50X2lkIjoiY2NkMzg0MzktMTY3NS00Mjg5LTliZDktZDBjMzg5NGUyMGVmIiwidG9rZW5fdXNlIjoiaWQiLCJhdXRoX3RpbWUiOjE1NzM2NjU5NzMsInBob25lX251bWJlciI6IisxMzIzODM5MzIyNSIsImV4cCI6MTU3MzY2OTU3MywiaWF0IjoxNTczNjY1OTczLCJlbWFpbCI6ImJlZGFyZGJiQGFtYXpvbi5jb20ifQ.wbCnPYdM6v7OqACDhJV1yC_KYO7Jp3Y8IkjbKFz8v-agITBFxXbZiMmU0zBywqnkLcNn9ezQcbcJX1l2fzVl12CJlnWJLxQuAEgrne1EO3nZxVfMZrUz7gfpDgYyQjAmjOIszdzqWBHS1BkfO1BKMQgjpow4fodNLhegSBkz6SgCZODGQpffNMVWDUe_-HcoXD_KqBkGVXRDgXGOtwcC50jhY0ke9BtE9wCKlhtXQg-6_KLmsq-Z_-IIUofbiDZSXqMH_lPpGM956pYCkNpt1lcmM9JmlCagNifDlXG-zr3Jqii4n-14JBKO5zbEF2PdOe1J4L4rgIguf3fA6ycyWw" 'INVOKE_URL_HERE'

The following screenshot shows a similar call using Rested showing the Response Header of 200 OK and the Response Body that includes your JSON.

Clean up

To avoid ongoing costs, remove the API you created in Amazon API Gateway, the user pool in Amazon Cognito, the S3 bucket, and the Lambda function.

Bonus Exploration

During the Lambda invocation, store the generated GUID and user information in an Amazon DynamoDB table to ensure you know which user uploaded which file. Now you can integrate this workflow into the VOD Solution from AWS to prepare videos for delivery and playback by other users.
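As a minimal sketch of that idea: build_upload_item and the ugc-uploads table name below are hypothetical, and the commented-out put_item call shows where the boto3 write would go inside the Lambda function:

```python
import time
import uuid

def build_upload_item(username: str, s3_key: str) -> dict:
    """Build a DynamoDB item recording which user uploaded which object."""
    return {
        "guid": s3_key.rsplit(".", 1)[0],  # the UUID portion of the key
        "username": username,
        "s3_key": s3_key,
        "uploaded_at": int(time.time()),
    }

item = build_upload_item("testuser", str(uuid.uuid4()) + ".mp4")

# With boto3 (inside the Lambda function), the write would look like:
# boto3.resource("dynamodb").Table("ugc-uploads").put_item(Item=item)
print(item["username"])
```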