AWS Partner Network (APN) Blog

Tag: Data Tokenization


How Protegrity Helps Protect PII and PHI Data at Scale on Amazon S3 with AWS Lambda

With the ever-growing need to migrate enterprise data to the cloud, and the necessity of keeping that data secure, organizations are searching for tools that enable migration while meeting regulatory requirements for data security and privacy. To meet these needs, Protegrity has introduced new solutions built on Amazon S3: the Cloud Protect for S3 products let you secure your sensitive data in S3 with Protegrity technology such as tokenization. A rough sketch of the pattern appears below.
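As a minimal illustration of this kind of pipeline, the Python sketch below shows an AWS Lambda handler triggered by S3 object uploads that tokenizes sensitive CSV columns before writing a protected copy. The bucket name, column list, and the protect() stub standing in for the Protegrity tokenization API are all assumptions made for illustration, not the product's actual interface.

```python
import csv
import io
import urllib.parse

import boto3

s3 = boto3.client("s3")

SENSITIVE_FIELDS = {"ssn", "email", "phone"}   # assumed PII columns
PROTECTED_BUCKET = "example-protected-bucket"  # assumed output bucket


def protect(value: str) -> str:
    # Placeholder for the Protegrity protector call; the real product exposes
    # its own tokenization API. This stub only marks the value it would replace.
    return f"tok_{value[::-1]}"


def handler(event, context):
    """Tokenize PII columns of each uploaded CSV and store a protected copy."""
    for record in event["Records"]:
        bucket = record["s3"]["bucket"]["name"]
        key = urllib.parse.unquote_plus(record["s3"]["object"]["key"])

        body = s3.get_object(Bucket=bucket, Key=key)["Body"].read().decode("utf-8")
        rows = list(csv.DictReader(io.StringIO(body)))
        if not rows:
            continue

        for row in rows:
            for field in SENSITIVE_FIELDS & row.keys():
                row[field] = protect(row[field])

        out = io.StringIO()
        writer = csv.DictWriter(out, fieldnames=rows[0].keys())
        writer.writeheader()
        writer.writerows(rows)

        s3.put_object(Bucket=PROTECTED_BUCKET, Key=key, Body=out.getvalue())
```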

Data Tokenization with Amazon Athena and Protegrity

Data security has always been an important consideration for organizations when complying with data protection regulations. Protegrity, an AWS ISV Partner and global leader in data security, has released a serverless User Defined Function (UDF) that adds external data tokenization capabilities to Amazon Athena. Learn how customers can use the Protegrity Athena Protector UDF to tokenize or detokenize data at scale; a short sketch of how such a UDF is invoked follows.
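Athena calls Lambda-backed UDFs through the USING EXTERNAL FUNCTION clause, which is the mechanism a connector like this builds on. The sketch below submits such a query with boto3; the function name, Lambda name, database, table, column, and results bucket are illustrative assumptions, not the names used by the Protegrity UDF.

```python
import boto3

athena = boto3.client("athena")

# Assumed names for illustration: the deployed tokenization UDF Lambda,
# the database and table being queried, and the protected column.
QUERY = """
USING EXTERNAL FUNCTION unprotect(value VARCHAR) RETURNS VARCHAR
    LAMBDA 'protegrity-athena-udf'
SELECT customer_id, unprotect(ssn) AS ssn
FROM customers
LIMIT 10
"""

response = athena.start_query_execution(
    QueryString=QUERY,
    QueryExecutionContext={"Database": "example_db"},
    ResultConfiguration={"OutputLocation": "s3://example-athena-results/"},
)
print("Query execution id:", response["QueryExecutionId"])
```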

Data Tokenization with Amazon Redshift and Protegrity

Many companies use Amazon Redshift to analyze and transform their data. As data continues to grow and become even more important, they are looking for more ways to extract valuable insights. One use case we're especially excited to support is data tokenization and masking. Amazon Redshift has collaborated with Protegrity, an AWS Advanced Technology Partner, to enable organizations with strict security requirements to protect their data while still being able to obtain powerful insights. A brief sketch of the underlying mechanism appears below.
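Amazon Redshift supports Lambda UDFs registered with CREATE EXTERNAL FUNCTION, which is the building block an integration like this relies on. The sketch below registers and calls such a function through the Redshift Data API; the Lambda name, IAM role, cluster, database, and table are assumptions for illustration only.

```python
import boto3

redshift_data = boto3.client("redshift-data")

# Assumed identifiers for illustration only.
CLUSTER = "example-cluster"
DATABASE = "dev"
DB_USER = "awsuser"

# Register a Lambda-backed UDF that hands values to a detokenization Lambda.
CREATE_UDF = """
CREATE EXTERNAL FUNCTION detokenize(VARCHAR) RETURNS VARCHAR
VOLATILE
LAMBDA 'protegrity-redshift-udf'
IAM_ROLE 'arn:aws:iam::123456789012:role/RedshiftLambdaInvokeRole';
"""

# Authorized analysts see clear text; everyone else keeps working with tokens.
QUERY = "SELECT customer_id, detokenize(ssn) AS ssn FROM customers LIMIT 10;"

# In practice you would wait for each statement to finish (describe_statement)
# before issuing the next; this sketch just submits them in order.
for sql in (CREATE_UDF, QUERY):
    resp = redshift_data.execute_statement(
        ClusterIdentifier=CLUSTER, Database=DATABASE, DbUser=DB_USER, Sql=sql
    )
    print("Statement id:", resp["Id"])
```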