AWS Partner Network (APN) Blog

Category: Amazon Simple Storage Service (Amazon S3)


How to Reduce AWS Storage Costs for Splunk Deployments Using SmartStore

It can be overwhelming for organizations to keep pace with the amount of data being generated by machines every day. There’s a great deal of meaningful information that can be extracted from that data, but companies need software vendors to develop tools that help. In this post, learn about Splunk SmartStore and how it helps customers reduce storage costs in a Splunk deployment on AWS. Many customers are using SmartStore to shrink their Amazon EBS volumes by moving data to Amazon S3.
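At a high level, SmartStore is enabled by defining a remote volume backed by an S3 bucket in indexes.conf and pointing indexes at it; a minimal sketch, in which the bucket name and index stanza are hypothetical:

```ini
# indexes.conf -- sketch of a SmartStore remote volume (bucket name is made up)
[volume:remote_store]
storageType = remote
path = s3://example-smartstore-bucket/indexes

# Route this index's warm data to the remote volume; hot data and a cache
# of recently accessed buckets stay on local (EBS) storage.
[main]
remotePath = volume:remote_store/$_index_name
```

Because warm data lives in S3 rather than on every indexer, the local EBS volumes only need to hold hot data and the cache, which is where the cost reduction comes from.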

Read More

Automatically Detect and Protect Sensitive Data in Amazon S3 Using Dataguise DgSecure

Dataguise has spent the last 10 years finding and protecting sensitive data across a range of data stores. DgSecure on AWS is Dataguise’s flagship product; it scans structured and unstructured data for sensitive elements and can optionally mask or encrypt them. Organizations can also use DgSecure to monitor access to sensitive data. In this post, we describe the steps to scan objects in Amazon S3 buckets and protect them with masking or encryption.

Read More

Using Amazon CloudFront with Multi-Region Amazon S3 Origins

By leveraging services like Amazon S3 to host content, AWS Competency Partner Cloudar has a cost-effective way to build highly available websites. If content is stored in a single Amazon S3 bucket, all of it resides in a single AWS Region. To serve content from other Regions, you need to route requests to different Amazon S3 buckets. In this post, explore how to accomplish this by using Amazon CloudFront as a content delivery network and Lambda@Edge as a router.
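To illustrate the routing idea (this is a sketch under stated assumptions, not the code from the full post), an origin-request Lambda@Edge function can rewrite the S3 origin domain based on the CloudFront-Viewer-Country header, which CloudFront adds when that header is whitelisted on the distribution. The bucket names and the country-to-origin map below are hypothetical:

```python
# Hypothetical map of viewer-country codes to regional S3 origin domains
# (bucket names are made up for illustration).
REGIONAL_ORIGINS = {
    "DE": "my-site-eu-central-1.s3.eu-central-1.amazonaws.com",
    "JP": "my-site-ap-northeast-1.s3.ap-northeast-1.amazonaws.com",
}
DEFAULT_ORIGIN = "my-site-us-east-1.s3.us-east-1.amazonaws.com"

def handler(event, context):
    """Origin-request handler: point the request at a closer S3 bucket."""
    request = event["Records"][0]["cf"]["request"]
    headers = request["headers"]
    country = headers.get("cloudfront-viewer-country", [{}])[0].get("value")
    domain = REGIONAL_ORIGINS.get(country, DEFAULT_ORIGIN)
    # Rewrite both the origin domain and the Host header so S3 accepts
    # the request for the chosen bucket.
    request["origin"]["s3"]["domainName"] = domain
    request["headers"]["host"] = [{"key": "Host", "value": domain}]
    return request
```

Because the function runs on the origin-request event, it only executes on cache misses; cached responses are served without invoking the router at all.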

Read More

How to Migrate Mainframe Batch to Cloud Microservices with Blu Age and AWS

While modernizing customer mainframes, the team at Blu Age discovered that batch processing can be a complex aspect of a mainframe migration to AWS. It’s critical to design your AWS architecture to account for batch’s stringent performance requirements, such as intensive I/O, large datasets, and short execution windows. Let’s explore how to migrate mainframe batch workloads to AWS microservices using Blu Age automated transformation technology.

Read More

Getting the Most Out of the Amazon S3 CLI

Amazon S3 makes it possible to store an unlimited number of objects, each up to 5 TB in size. Managing resources at this scale requires quality tooling. When it comes time to upload many objects, a few large objects, or a mix of both, you’ll want the right tool for the job. This post looks at an option that is sometimes overlooked: the AWS Command Line Interface (AWS CLI) for Amazon S3.
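One reason the CLI handles large objects well is that it switches to S3 multipart upload once an object crosses a size threshold (8 MB by default for both `multipart_threshold` and `multipart_chunksize`), uploading parts in parallel. A small sketch of that part-splitting arithmetic, assuming the default settings; `plan_upload` is a hypothetical helper for illustration, not a CLI function:

```python
MB = 1024 * 1024
DEFAULT_THRESHOLD = 8 * MB   # CLI default multipart_threshold
DEFAULT_CHUNKSIZE = 8 * MB   # CLI default multipart_chunksize

def plan_upload(size_bytes, chunksize=DEFAULT_CHUNKSIZE,
                threshold=DEFAULT_THRESHOLD):
    """Return the number of S3 requests a transfer of this size implies."""
    if size_bytes < threshold:
        return 1  # a single PutObject call
    # Ceiling division: the last part may be smaller than the chunk size.
    return -(-size_bytes // chunksize)
```

For example, a 100 MB file splits into 13 parts of up to 8 MB each, and a failed part can be retried without restarting the whole transfer. These thresholds are tunable in the CLI’s `s3` configuration section if the defaults don’t suit your workload.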

Read More