AWS for Industries

Approving AWS services for GxP workloads

In this blog post, we describe the first step of a process for qualifying AWS services for use in GxP workloads, sometimes referred to in the industry as “whitelisting” services. AWS customers with GxP compliance requirements might want to control which AWS services their developers can use. During the supplier assessment of AWS, customers approve AWS based on a combination of AWS’s certifications, quality system, and control information, and therefore may want to ensure that only the services in scope of those certifications, where applicable, are available to developers of GxP workloads. However, customers are also keen to balance such controls with a desire to promote innovation and developer freedom, so we will also touch on how this process fits into a typical SDLC without becoming a blocker.

Although AWS is not directly subject to GxP regulations, compliance, like security, is a shared responsibility between AWS and our customers. For many customers, gaining approval to use a service is part of architecture governance, but it’s also an important first step for customers subject to GxP regulations. From a regulatory perspective, it’s the part of a qualification process that checks that AWS has satisfied its part of the shared responsibility model at the service level. This is done by checking that the service is included in our various certifications and attestations, such as ISO, SOC, and HIPAA eligibility.

Customers often consider multiple regulations when approving a service for use by development teams. For example, they might allow all services in sandbox accounts, but restrict the services available in an account to only HIPAA-eligible services if the application is subject to HIPAA regulations. Customers often use a combination of AWS certifications to make their approval decisions. The approval is enforced through AWS Organizations and service control policies (SCPs). Refer to our previous blog post for details on how to implement this for HIPAA.
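As a sketch of what such enforcement can look like, the SCP below uses the common allow-list pattern of denying every action except those belonging to approved services. The three services listed are purely illustrative; a real policy would enumerate the customer's actual approved-service inventory.

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "DenyAllOutsideApprovedServices",
      "Effect": "Deny",
      "NotAction": [
        "s3:*",
        "ec2:*",
        "lambda:*"
      ],
      "Resource": "*"
    }
  ]
}
```

Attached to an organizational unit containing GxP accounts, this SCP prevents the use of any service not on the approved list, regardless of the IAM permissions granted inside each account.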

Once it is confirmed that a service is covered by the relevant certifications, that is, AWS has handled its side of the shared responsibility model, the next step is for the customer to take care of their side. This is often done by performing a risk assessment and defining controls around how a service can be used. For example, data stored in an Amazon S3 bucket should be encrypted. This control can be implemented using AWS Config or a third-party tool like Cloud Custodian. These tools can check the configuration of a service and trigger an action if the control objective is not met. In this example, if an Amazon S3 bucket is found unencrypted, an alert could be raised or automated remediation taken, such as triggering an AWS Lambda function to enable encryption on the bucket.

Process and organization

Before we look at the automation of controls, a quick word about implementing this process. Many customers see this approval step as a potential blocker because it is a step in a sequential process. Developers go through their design phase, start building in an unconstrained development environment, then try to move to test only to find a service is not yet approved for use. Work then stalls while another team goes through the approval process. Developers who have been hit by this before endeavor to start the process early, only to find the people responsible for approvals overloaded with requests.

Instead of a sequential process, regulatory activities like service approval should be proactive and handled outside the critical path, making them transparent to development teams. One way to do this is for a team, perhaps within a Cloud Center of Excellence (CCoE), to monitor the services being used within development accounts. The team might analyze deployed resources directly in the accounts, or query central AWS CloudTrail logs to see which services are in use. The resulting list of services in use is compared against an inventory of approved services, and the delta is added to a backlog for processing. Working collaboratively with the development teams, the backlog can be prioritized. This enables the team to analyze and approve a service while early design and build activities are occurring, and before the development team reaches the more constrained test, validated, or production accounts, thus removing this regulatory step as a blocker. This proactive approach also allows the team responsible for approvals to plan and prioritize their workload rather than reacting to a flood of requests.
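The delta computation described above can be sketched in a few lines of Python. This is a minimal illustration, assuming CloudTrail event records have already been retrieved and parsed into dictionaries; the approved-service inventory here is a hypothetical example.

```python
# Hypothetical approved-service inventory maintained by the CCoE team.
APPROVED_SERVICES = {
    "s3.amazonaws.com",
    "ec2.amazonaws.com",
    "lambda.amazonaws.com",
}


def services_in_use(cloudtrail_records):
    """Extract the set of AWS service endpoints seen in CloudTrail events."""
    return {r["eventSource"] for r in cloudtrail_records if "eventSource" in r}


def approval_backlog(cloudtrail_records, approved=APPROVED_SERVICES):
    """Return services in use that are not yet on the approved inventory."""
    return sorted(services_in_use(cloudtrail_records) - approved)


# Example: two events, one from a service not yet approved (Amazon Redshift).
records = [
    {"eventSource": "s3.amazonaws.com", "eventName": "CreateBucket"},
    {"eventSource": "redshift.amazonaws.com", "eventName": "CreateCluster"},
]
print(approval_backlog(records))  # ['redshift.amazonaws.com']
```

In practice the records would come from a central CloudTrail log store such as Amazon Athena queries over an S3 log bucket, and the backlog entries would feed the approval team's work queue.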

Automated compliance controls

As described earlier, part of the process is the identification of compliance controls that should be monitored, with action taken if deviations are detected. There are a few ways in which we can implement these compliance controls.

One way is to use AWS Config, which enables continuous monitoring of your AWS resources, making it simple to assess, audit, and record resource configurations and changes. AWS Config does this through rules that define the desired configuration state of your AWS resources. AWS Config provides a number of AWS Managed Rules that address a wide range of security concerns, such as checking whether you encrypted your Amazon Elastic Block Store (Amazon EBS) volumes, tagged your resources appropriately, and enabled multi-factor authentication (MFA) for the root account. You can also create custom rules to codify your compliance requirements using AWS Lambda functions.
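As a sketch of how a managed rule is enabled, the CloudFormation fragment below deploys the `ENCRYPTED_VOLUMES` managed rule mentioned above; the resource name is illustrative, and an AWS Config recorder must already be set up in the account.

```yaml
Resources:
  EncryptedVolumesRule:
    Type: AWS::Config::ConfigRule
    Properties:
      ConfigRuleName: encrypted-volumes
      Description: Checks whether attached Amazon EBS volumes are encrypted
      Source:
        Owner: AWS
        SourceIdentifier: ENCRYPTED_VOLUMES
```

Once deployed, AWS Config evaluates every EBS volume against the rule and flags unencrypted volumes as noncompliant in the AWS Config console.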

Continuing our example using Amazon S3, here’s a blog post describing how to monitor and respond to an Amazon S3 bucket change that allows public access. Another option is to use Cloud Custodian, which can be integrated with AWS Security Hub. Cloud Custodian policies are written in simple YAML configuration files that let users specify policies on a resource type (Amazon EC2, Amazon S3, Amazon Redshift, Amazon Elastic Block Store (Amazon EBS), Amazon RDS, etc.) and are constructed from a vocabulary of filters and actions. Refer to this previous blog post for additional details.

As part of approval, GxP controls can be identified to cover various compliance requirements such as audit trails, electronic records, electronic signatures, data integrity, data encryption/privacy, and backups. These GxP controls contribute towards the qualification of AWS services, and each of these compliance requirements can be converted into a Cloud Custodian policy (as illustrated in the example below).

Here are the steps to execute the Cloud Custodian policy for an Amazon S3 bucket:

Step 1: Define Compliance Requirements
Define the compliance requirements needed to qualify an Amazon S3 bucket; for example, “Personal information and sensitive customer information must be encrypted whenever it is stored on portable media and devices”.

Step 2: Create Cloud Custodian policy
To achieve this requirement, we can write a Cloud Custodian policy in YAML as shown below:

  policies:
    - name: s3-bucket-encryption-compliance
      description: |
        Detect unencrypted Amazon S3 buckets and apply AES256 encryption
        in response to CreateBucket/DeleteBucketEncryption events
      resource: s3
      mode:
        type: cloudtrail
        events:
          - source: s3.amazonaws.com
            event: CreateBucket
            ids: "requestParameters.bucketName"
          - source: s3.amazonaws.com
            event: DeleteBucketEncryption
            ids: "requestParameters.bucketName"
      filters:
        - type: bucket-encryption
          state: False
      actions:
        - type: set-bucket-encryption
          crypto: AES256
          enabled: true

Step 3: Set Required AWS IAM Permissions
The securityhub:BatchImportFindings action is required. If this action is not already allowed by the credentials that will be used, either add it to an existing customer managed policy or attach the AWSSecurityHubFullAccess AWS managed policy. Note: IAM users or roles with either arn:aws:iam::aws:policy/AdministratorAccess or arn:aws:iam::aws:policy/PowerUserAccess already have the required permissions.
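If you prefer to grant only the single action rather than attach the full managed policy, a minimal customer managed policy might look like the following (a least-privilege sketch, scoped to all resources because Security Hub findings are account-level):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "AllowSecurityHubFindingImport",
      "Effect": "Allow",
      "Action": "securityhub:BatchImportFindings",
      "Resource": "*"
    }
  ]
}
```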

Step 4: Multi-Account Execution
The above policy can be deployed to multiple accounts using c7n-org, a tool that runs Cloud Custodian against multiple AWS accounts in parallel. Refer to this link for multi-account Custodian execution.
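c7n-org reads the target accounts from a configuration file. The fragment below is an illustrative sketch; the account ID, account name, and role name are hypothetical placeholders, and the role must exist in each target account with permissions to run the policy.

```yaml
accounts:
  - account_id: '123456789012'
    name: gxp-dev-account
    role: arn:aws:iam::123456789012:role/CloudCustodianRole
    regions:
      - us-east-1
  - account_id: '210987654321'
    name: gxp-test-account
    role: arn:aws:iam::210987654321:role/CloudCustodianRole
    regions:
      - us-east-1
```

With this file in place, a command such as `c7n-org run -c accounts.yml -s output -u s3-encryption-policy.yml` executes the policy across every listed account and region.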

Step 5: View the Findings in the Security Hub Console
Once the policy has executed, the resulting findings can be reviewed in the AWS Security Hub console. Similar kinds of policies can be built for other services. Refer to GitHub for such technical control policies for various AWS services.


We have walked through a process that results in AWS services being reviewed and approved for use by application development teams. We have also explored how to lower the compliance burden on development teams through effective process design, so they can focus more on providing value and less on waiting for regulatory pre-approvals. Finally, we introduced automation to streamline the compliance checks for the cloud operations team. However, the approval of AWS services for general use is often just the first step in a formal qualification process that allows services to be used in GxP-regulated workloads. For example, customers could leverage the building block concept described in the ISPE GAMP Good Practice Guide: IT Infrastructure Control and Compliance, where building blocks are defined and qualified. These building blocks could be a single AWS service, a combination of services, or a complete stack. This is a topic for a later date.

Ian Sutcliffe

Ian Sutcliffe is a Global Solution Architect with 25+ years of experience in IT, primarily in the Life Sciences industry. A thought leader in the area of regulated cloud computing, he focuses on IT operating model and process optimization and automation, with the intent of helping customers become regulated cloud natives.

Susant Mallick

Susant Mallick is an industry specialist and digital evangelist in AWS’ Global Healthcare and Life Sciences practice. He has more than 20 years of experience in the Life Science industry, working with biopharmaceutical and medical device companies across the North America, APAC, and EMEA regions. He has built many digital health platform and patient engagement solutions using mobile apps, AI/ML, IoT, and other technologies for customers in various therapeutic areas. He holds a B.Tech degree in Electrical Engineering and an MBA in Finance. His thought leadership and industry expertise have earned many accolades in pharma industry forums.