AWS Security Blog

AWS Announces Amazon Macie

I’m pleased to announce that today we’ve launched a new security service, Amazon Macie.

This service uses machine learning to help customers prevent data loss by automatically discovering, classifying, and protecting sensitive data in AWS. Amazon Macie recognizes sensitive data such as personally identifiable information (PII) or intellectual property, and provides customers with dashboards and alerts that give visibility into how that data is being accessed or moved. This enables customers to apply machine learning to a wide array of security and compliance workloads, and we think it will be a significant enabler for our customers.

To learn more, see the full AWS Blog post.

– Steve

 

How to Establish Federated Access to Your AWS Resources by Using Active Directory User Attributes

To govern federated access to your AWS resources, it’s a common practice to use Microsoft Active Directory (AD) groups. With this approach, you typically need one AD group for each combination of AWS account and role, so the number of AD groups equals the number of your AWS accounts multiplied by the number of roles in each account. As you can imagine, this can result in a very large number of AD groups to manage for federated access.

However, some organizations have limits on how many AD groups they can create. For example, an organization might need to keep its AD group hierarchy reasonably flat and avoid deep nesting of groups. Such a situation calls for a solution that doesn’t require you to create an ever-growing number of AD groups while still allowing you to use access control and automated user integration.

In this blog post, I provide step-by-step instructions for integrating AWS Identity and Access Management (IAM) with Microsoft Active Directory Federation Services (AD FS) by using AD user attributes, allowing you to establish federated access without expanding your total number of AD groups. Your organization’s enterprise administrator probably has existing processes in place for managing AD group memberships and provisioning, and you can extend these processes to the management of AD user attributes and the reduction of your organization’s dependency on AD groups. (more…)
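The full walkthrough covers the AD FS claim rules in detail; as a rough sketch of just the IAM side of the setup, the following example uses boto3 to create a role that federated AD FS users can assume with sts:AssumeRoleWithSAML. The account ID, SAML provider name, role name, and managed policy ARN are placeholders, not values from the post.

```python
# Minimal sketch: create an IAM role that federated AD FS users can assume
# via SAML. Account ID, provider name, role name, and policy ARN are placeholders.
import json
import boto3

iam = boto3.client("iam")

saml_provider_arn = "arn:aws:iam::111122223333:saml-provider/ADFS"  # placeholder

trust_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": {"Federated": saml_provider_arn},
            "Action": "sts:AssumeRoleWithSAML",
            "Condition": {
                "StringEquals": {"SAML:aud": "https://signin.aws.amazon.com/saml"}
            },
        }
    ],
}

iam.create_role(
    RoleName="ADFS-Production-Admin",  # example role name
    AssumeRolePolicyDocument=json.dumps(trust_policy),
    Description="Federated access for AD FS users",
)

# Attach whatever permissions the role should grant (placeholder managed policy).
iam.attach_role_policy(
    RoleName="ADFS-Production-Admin",
    PolicyArn="arn:aws:iam::aws:policy/ReadOnlyAccess",
)
```

The mapping from AD user attributes to this role happens in the AD FS claim rules, which the post walks through step by step.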

AWS Encryption SDK: How to Decide if Data Key Caching Is Right for Your Application


Today, the AWS Crypto Tools team introduced a new feature in the AWS Encryption SDK: data key caching. Data key caching lets you reuse the data keys that protect your data, instead of generating a new data key for each encryption operation.

Data key caching can reduce latency, improve throughput, reduce cost, and help you stay within service limits as your application scales. In particular, caching might help if your application is hitting the AWS Key Management Service (KMS) requests-per-second limit and raising the limit does not solve the problem.

However, these benefits come with some security tradeoffs. Encryption best practices generally discourage extensive reuse of data keys.

In this blog post, I explore those tradeoffs and provide information that can help you decide whether data key caching is a good strategy for your application. I also explain how data key caching is implemented in the AWS Encryption SDK and describe the security thresholds that you can set to limit the reuse of data keys. Finally, I provide some practical examples of using the security thresholds to meet cost, performance, and security goals.

Introducing data key caching

The AWS Encryption SDK is a client-side encryption library that makes it easier for you to implement cryptography best practices in your application. It includes secure default behavior for developers who are not encryption experts, while being flexible enough to work for the most experienced users. (more…)
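As a minimal sketch of what the caching configuration looks like in the Python version of the AWS Encryption SDK (using the current client interface; the KMS key ARN and the threshold values below are placeholders, not recommendations), you put a local cache and a caching cryptographic materials manager in front of your key provider:

```python
# Minimal sketch of data key caching with the AWS Encryption SDK for Python.
# The KMS key ARN and the security thresholds below are placeholder values.
import aws_encryption_sdk

client = aws_encryption_sdk.EncryptionSDKClient()

key_provider = aws_encryption_sdk.StrictAwsKmsMasterKeyProvider(
    key_ids=["arn:aws:kms:us-west-2:111122223333:key/example-key-id"]
)

# A local, in-memory cache that can hold up to 100 data keys.
cache = aws_encryption_sdk.LocalCryptoMaterialsCache(capacity=100)

# The caching materials manager enforces the security thresholds that
# limit how long and how widely a cached data key can be reused.
caching_cmm = aws_encryption_sdk.CachingCryptoMaterialsManager(
    master_key_provider=key_provider,
    cache=cache,
    max_age=300.0,                 # seconds a cached data key may be reused
    max_messages_encrypted=10,     # messages per cached data key
    max_bytes_encrypted=64 * 1024, # bytes per cached data key
)

# Encrypt with the caching materials manager instead of the key provider.
ciphertext, header = client.encrypt(
    source=b"example plaintext", materials_manager=caching_cmm
)
```

Tightening the thresholds trades away some of the cost and latency savings in exchange for less data key reuse, which is the central decision the post helps you make.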

The First AWS Regional Financial Services Guide Focuses on Singapore


To help financial services clients address Singapore’s regulations on financial institutions in a shared responsibility environment, AWS has published the AWS User Guide to Financial Services Regulations and Guidelines in Singapore. This first-ever AWS Financial Services guide is the culmination of the work AWS has done in the last year to help customers navigate the Monetary Authority of Singapore’s 2016 updated guidelines about cloud services.

This new guide examines Singaporean requirements and guidelines, providing information that will help you conduct due diligence on AWS with regard to IT security and risk management. The guide also shares leading practices to empower you to develop your own governance programs by using AWS.

The guide focuses on three top considerations for financial institutions operating in Singapore:

  • Outsourcing guidelines – Conduct a self-assessment of AWS services and align your governance requirements within a shared responsibility model.
  • Technology risk management – Take a deeper look at where shared responsibility exists for technology implementation and perform a self-assessment of AWS service responsibilities.
  • Cloud computing implementation – Assess additional responsibilities to ensure security and compliance with local guidelines.

We will release additional AWS Financial Services resource guides this year to help you understand the requirements in other markets around the globe. These guides will be posted on the AWS Compliance Resources page.

If you have questions or comments about this new guide, submit them in the “Comments” section below.

– Jodi

Announcing the New AWS Customer Compliance Center

AWS has the longest-running, most effective, and most customer-obsessed compliance program in the cloud market. We have always centered our program on customers, obtaining the certifications needed to give them the validated transparency they need to certify their own AWS workloads [download .pdf of AWS certifications]. We also offer a rich suite of embedded compliance tooling that enables customers and partners to manage security controls more effectively and, in turn, provide evidence of effective control operation to their auditors. Along with our customers and partners, we have the largest, most diverse, and most comprehensive compliance footprint in the industry.

Enabling customers is a core part of the AWS DNA. Today, in that spirit, I’m happy to announce that we’ve launched a new AWS Customer Compliance Center. This center is focused on the security and compliance of our customers on AWS. You can learn from other customers’ experiences and discover how your peers have solved the compliance, governance, and audit challenges present in today’s regulatory environment. You can also access our industry-first cloud Auditor Learning Path via the Customer Compliance Center. These online university learning resources are logical learning paths designed specifically for security, compliance, and audit professionals, allowing you to build on your existing IT skills and move your environment to the next generation of audit and security assurance. As we engage with our security and compliance customer colleagues on this topic, we will continue to update and improve these resources and publish new enablers in the coming months.

We are excited to continue to work with our customers on moving from the old-guard manual audit world to the new cloud-enabled, automated, “secure and compliant by default” model we’ve been leading over the past few years.

– Chad Woolf, AWS Security & Compliance

Newly Updated: Example AWS IAM Policies for You to Use and Customize

To help you grant access to specific resources and conditions, the Example Policies page in the AWS Identity and Access Management (IAM) documentation now includes more than thirty policies for you to use or customize to meet your permissions requirements. The AWS Support team developed these policies from their experiences working with AWS customers over the years. The example policies cover common permissions use cases you might encounter across services such as Amazon DynamoDB, Amazon EC2, AWS Elastic Beanstalk, Amazon RDS, Amazon S3, and IAM.

In this blog post, I introduce the updated Example Policies page and explain how to use and customize these policies for your needs.

The new Example Policies page

The Example Policies page in the IAM User Guide now provides an overview of the example policies and includes a link to view each policy on a separate page. Note that each of these policies has been reviewed and approved by AWS Support. If you would like to submit a policy that you have found to be particularly useful, post it on the IAM forum. (more…)
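To give a flavor of the kind of policies on that page, here is a hedged sketch (the policy name and bucket are placeholders, not entries from the documentation) that registers a simple S3 read-only policy as a customer managed policy with boto3:

```python
# Hedged sketch: register one example-style IAM policy as a customer managed
# policy. The policy name and bucket name are placeholders, not from the docs.
import json
import boto3

iam = boto3.client("iam")

policy_document = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": ["s3:ListBucket"],
            "Resource": "arn:aws:s3:::example-bucket",
        },
        {
            "Effect": "Allow",
            "Action": ["s3:GetObject"],
            "Resource": "arn:aws:s3:::example-bucket/*",
        },
    ],
}

iam.create_policy(
    PolicyName="ExampleS3ReadOnly",
    PolicyDocument=json.dumps(policy_document),
)
```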

How to Monitor and Visualize Failed SSH Access Attempts to Amazon EC2 Linux Instances

As part of the AWS Shared Responsibility Model, you are responsible for monitoring and managing your resources at the operating system and application level. When you monitor your application servers, for example, you can measure, visualize, react to, and improve the security of those servers. You probably already do this on premises or in other environments, and you can adapt your existing processes, tools, and methodologies for use in the AWS Cloud. For more details about best practices for monitoring your AWS resources, see the “Manage Security Monitoring, Alerting, Audit Trail, and Incident Response” section in the AWS Security Best Practices whitepaper.

This blog post focuses on how to log and create alarms on invalid Secure Shell (SSH) access attempts. Implementing live monitoring and session recording facilitates the identification of unauthorized activity and can help confirm that remote users access only those systems they are authorized to use. With SSH log information in hand (such as invalid access type, bad private keys, and remote IP addresses), you can take proactive actions to protect your servers. For example, you can use an AWS Lambda function to adjust your server’s security rules when an alarm is triggered that indicates an invalid SSH access attempt.

In this post, I demonstrate how to use Amazon CloudWatch Logs to monitor SSH access to your application servers (Amazon EC2 Linux instances) so that you can monitor rejected SSH connection requests and take action. I also show how to configure CloudWatch Logs to send SSH access logs from application servers that reside in a public subnet. Last, I demonstrate how to visualize how many attempts are made to SSH into your application servers with bad private keys and invalid user names. Using these techniques and tools can help you improve the security of your application servers.

(more…)
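The full post walks through installing and configuring the CloudWatch Logs agent on the instances; as a minimal sketch of the monitoring side only (the log group name, SNS topic ARN, threshold, and the simplified filter pattern are placeholders), you can create the metric filter and alarm with boto3:

```python
# Minimal sketch: turn failed SSH attempts in a CloudWatch Logs group into a
# metric and an alarm. Log group, topic ARN, threshold, and the simplified
# filter pattern are placeholders; the post covers the full agent setup.
import boto3

logs = boto3.client("logs")
cloudwatch = boto3.client("cloudwatch")

log_group = "/var/log/secure"  # placeholder log group name

# Count log events that contain the phrase "Invalid user".
logs.put_metric_filter(
    logGroupName=log_group,
    filterName="InvalidSSHUser",
    filterPattern='"Invalid user"',
    metricTransformations=[
        {
            "metricName": "InvalidSSHUserCount",
            "metricNamespace": "SSHMonitoring",
            "metricValue": "1",
        }
    ],
)

# Alarm when five or more invalid attempts occur within five minutes.
cloudwatch.put_metric_alarm(
    AlarmName="InvalidSSHUserAlarm",
    Namespace="SSHMonitoring",
    MetricName="InvalidSSHUserCount",
    Statistic="Sum",
    Period=300,
    EvaluationPeriods=1,
    Threshold=5,
    ComparisonOperator="GreaterThanOrEqualToThreshold",
    TreatMissingData="notBreaching",
    AlarmActions=["arn:aws:sns:us-east-1:111122223333:ssh-alerts"],  # placeholder
)
```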

How to Use AWS Organizations to Automate End-to-End Account Creation

AWS Organizations offers new capabilities for managing AWS accounts, including automated account creation via the Organizations API. For example, you can bring new development teams onboard by using the Organizations API to create an account, AWS CloudFormation templates to configure the account (such as for AWS Identity and Access Management [IAM] and networking), and service control policies (SCPs) to help enforce corporate policies.

In this blog post, I demonstrate the step-by-step process for end-to-end account creation in Organizations as well as how to automate the entire process. I also show how to move a new account into an organizational unit (OU).

Process overview

The following process flow diagram illustrates the steps required to create an account, configure the account, and then move it into an OU so that the account can take advantage of the centralized SCP functionality in Organizations. The tasks in the blue nodes occur in the master account in the organization in question, and the task in the orange node occurs in the new member account I create. In this post, I provide a script (in both Bash/CLI and Python) that you can use to automate this account creation process, and I walk through each step shown in the diagram to explain the process in detail. For the purposes of this post, I use the AWS CLI in combination with CloudFormation to create and configure an account. (more…)
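The post provides complete Bash/CLI and Python scripts; the following condensed boto3 sketch (with placeholder email address, account name, and OU ID) shows the core flow of creating the account, polling the asynchronous request, and moving the new account into an OU:

```python
# Condensed sketch of the account-creation flow: create the account, poll until
# the request finishes, then move the account into an OU. The email address,
# account name, and OU ID are placeholders.
import time
import boto3

org = boto3.client("organizations")

create = org.create_account(
    Email="dev-team@example.com",   # placeholder
    AccountName="DevTeamAccount",   # placeholder
)
request_id = create["CreateAccountStatus"]["Id"]

# Poll until the asynchronous create request succeeds or fails.
while True:
    status = org.describe_create_account_status(CreateAccountRequestId=request_id)
    state = status["CreateAccountStatus"]["State"]
    if state != "IN_PROGRESS":
        break
    time.sleep(10)

if state == "SUCCEEDED":
    account_id = status["CreateAccountStatus"]["AccountId"]
    root_id = org.list_roots()["Roots"][0]["Id"]
    # Move the new account from the root into the target OU (placeholder ID).
    org.move_account(
        AccountId=account_id,
        SourceParentId=root_id,
        DestinationParentId="ou-examplerootid-exampleouid",
    )
```

Once the account sits in the OU, any SCPs attached to that OU apply to it automatically, which is what makes the OU move the final step of the process.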

How to Use Batch References in Amazon Cloud Directory to Refer to New Objects in a Batch Request

In Amazon Cloud Directory, it’s often necessary to add new objects or add relationships between new objects and existing objects to reflect changes in a real-world hierarchy. With Cloud Directory, you can make these changes efficiently by using batch references within batch operations.

Let’s say I want to take an existing child object in a hierarchy, detach it from its parent, and reattach it to another part of the hierarchy. A simple way to do this would be to make a call to get the object’s unique identifier, another call to detach the object from its parent using the unique identifier, and a third call to attach it to a new parent. However, if I use batch references within a batch write operation, I can perform all three of these actions in the same request, greatly simplifying my code and reducing the round trips required to make such changes.

In this post, I demonstrate how to use batch references in a single write request to simplify adding and restructuring a Cloud Directory hierarchy. I have used the AWS SDK for Java for all the sample code in this post, but you can use other language SDKs or the AWS CLI in a similar way.

Using batch references

In my previous post, I demonstrated how to add AnyCompany’s North American warehouses to a global network of warehouses. As time passes and demand grows, AnyCompany launches multiple warehouses in North American cities to fulfill customer orders with continued efficiency. This requires the company to restructure the network to group warehouses in the same region so that the company can apply similar standards to them, such as delivery times, delivery areas, and types of products sold. (more…)
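The post’s sample code uses the AWS SDK for Java; as a hedged boto3 equivalent (the directory ARN, paths, and link names are placeholders), the detach and reattach can be expressed as a single batch_write call in which the AttachObject operation refers to the detached object through a batch reference:

```python
# Hedged boto3 sketch of the detach-then-reattach pattern in one batch_write.
# The directory ARN, paths, and link names are placeholders; the post's own
# samples use the AWS SDK for Java.
import boto3

cd = boto3.client("clouddirectory")

directory_arn = "arn:aws:clouddirectory:us-east-1:111122223333:directory/example"

cd.batch_write(
    DirectoryArn=directory_arn,
    Operations=[
        {
            # Detach the warehouse from its current parent and name the result
            # "detachedWarehouse" so later operations in this batch can use it.
            "DetachObject": {
                "ParentReference": {"Selector": "/NorthAmerica"},
                "LinkName": "Seattle-01",
                "BatchReferenceName": "detachedWarehouse",
            }
        },
        {
            # Reattach the same object under a new regional parent by referring
            # to the batch reference with the "#" prefix.
            "AttachObject": {
                "ParentReference": {"Selector": "/NorthAmerica/PacificNorthwest"},
                "ChildReference": {"Selector": "#detachedWarehouse"},
                "LinkName": "Seattle-01",
            }
        },
    ],
)
```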

Write and Read Multiple Objects in Amazon Cloud Directory by Using Batch Operations

Amazon Cloud Directory is a hierarchical data store that enables you to build flexible, cloud-native directories for organizing hierarchies of data along multiple dimensions. For example, you can create an organizational structure that you can navigate through multiple hierarchies for reporting structure, location, and cost center.

In this blog post, I demonstrate how you can use Cloud Directory APIs to write and read multiple objects by using batch operations. With batch write operations, you can execute a sequence of operations atomically—meaning that all of the write operations must occur, or none of them do. You also can make your application efficient by reducing the number of required round trips to read and write objects to your directory. I have used the AWS SDK for Java for all the sample code in this blog post, but you can use other language SDKs or the AWS CLI in a similar way.

Using batch write operations

To demonstrate batch write operations, let’s say that AnyCompany’s warehouses are organized to determine the fastest methods to ship orders to its customers. In North America, AnyCompany plans to open new warehouses regularly so that the company can keep up with customer demand while continuing to meet the delivery times to which they are committed.

The following diagram shows part of AnyCompany’s global network, including Asian and European warehouse networks. (more…)
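The post’s samples use the AWS SDK for Java; the following hedged boto3 sketch (the directory ARN, schema, facet, and attribute names are placeholders) creates two warehouse objects atomically with batch_write and then reads the parent’s children with batch_read:

```python
# Hedged boto3 sketch: create two warehouse objects atomically with batch_write,
# then list the children of the parent node with batch_read. The directory ARN,
# schema ARN, facet, and attribute names are placeholders.
import boto3

cd = boto3.client("clouddirectory")

directory_arn = "arn:aws:clouddirectory:us-east-1:111122223333:directory/example"
schema_arn = directory_arn + "/schema/WarehouseSchema/1.0"

def create_warehouse_op(name):
    """Build one BatchWrite CreateObject operation for a warehouse node."""
    return {
        "CreateObject": {
            "SchemaFacet": [{"SchemaArn": schema_arn, "FacetName": "Warehouse"}],
            "ObjectAttributeList": [
                {
                    "Key": {
                        "SchemaArn": schema_arn,
                        "FacetName": "Warehouse",
                        "Name": "name",
                    },
                    "Value": {"StringValue": name},
                }
            ],
            "ParentReference": {"Selector": "/NorthAmerica"},
            "LinkName": name,
        }
    }

# All of these writes succeed or fail together.
cd.batch_write(
    DirectoryArn=directory_arn,
    Operations=[create_warehouse_op("Denver-01"), create_warehouse_op("Austin-01")],
)

# Read back the children of the parent node in a single batch_read call.
children = cd.batch_read(
    DirectoryArn=directory_arn,
    ConsistencyLevel="EVENTUAL",
    Operations=[
        {"ListObjectChildren": {"ObjectReference": {"Selector": "/NorthAmerica"}}}
    ],
)
```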