AWS Partner Network (APN) Blog

Category: Amazon Simple Storage Service (S3)

Turning Data into a Key Enterprise Asset with a Governed Data Lake on AWS

Data and analytics success relies on providing analysts and other data end users with quick, easy access to accurate, quality data. Enterprises need a high-performing, cost-efficient data architecture that supports demand for data access while providing the data governance and management capabilities IT requires. A governed data lake on AWS delivers this data management excellence, capturing quality data and making it available to analysts quickly and cost-effectively.

Read More

MongoDB Atlas Data Lake Lets Developers Create Value from Rich Modern Data 

With the proliferation of cost-effective storage options such as Amazon S3, there is little reason not to keep your data forever; the challenge is creating value from that much data in a timely, efficient way. MongoDB’s Atlas Data Lake enables developers to mine their data for insights with more storage options and the speed and agility of the AWS Cloud. It provides a serverless, parallelized compute platform that gives you a powerful and flexible way to analyze and explore your data on Amazon S3.
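
The post’s own code isn’t reproduced in this excerpt, but as a rough sketch of the developer experience it describes, the snippet below queries an S3-backed collection through Atlas Data Lake with pymongo. The connection string, database, collection, and field names are all placeholder assumptions; the Data Lake would need to be configured separately to map them to your Amazon S3 files.

```python
# Minimal sketch: querying S3-backed data through MongoDB Atlas Data Lake
# with pymongo. The URI, database, collection, and field names are
# hypothetical; the Data Lake must already map them to S3 objects.
from pymongo import MongoClient

client = MongoClient("mongodb://<user>:<password>@<your-data-lake-uri>/?ssl=true")
events = client["datalake_db"]["clickstream"]  # virtual collection over S3 files

# Standard MQL aggregation runs against the S3-backed collection
pipeline = [
    {"$match": {"event_type": "purchase"}},
    {"$group": {"_id": "$product_id", "total": {"$sum": "$amount"}}},
    {"$sort": {"total": -1}},
    {"$limit": 10},
]
for doc in events.aggregate(pipeline):
    print(doc)
```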

Read More

How to Create a Continually Refreshed Amazon S3 Data Lake in Just One Day

Data management architectures have evolved drastically from the traditional data warehousing model to today’s more flexible systems that use pay-as-you-go cloud computing models for big data workloads. Learn how AWS services like Amazon EMR can be used with Bryte Systems to deploy an Amazon S3 data lake in one day. We’ll also detail how AWS and the BryteFlow solution can automate modern data architecture to significantly accelerate delivery of business insights at scale.
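
BryteFlow itself is proprietary and isn’t shown here, but the Amazon EMR side of such a deployment can be sketched with boto3. Everything below (bucket, sizing, roles) is a placeholder assumption, not the post’s actual configuration.

```python
# Sketch: provisioning a transient EMR cluster that reads from and writes to
# an S3 data lake. Names, instance sizes, and roles are placeholder assumptions.
import boto3

emr = boto3.client("emr", region_name="us-east-1")

response = emr.run_job_flow(
    Name="s3-data-lake-etl",
    ReleaseLabel="emr-6.2.0",
    Applications=[{"Name": "Spark"}],
    LogUri="s3://my-data-lake-logs/emr/",  # hypothetical bucket
    Instances={
        "InstanceGroups": [
            {"Name": "Primary", "InstanceRole": "MASTER",
             "InstanceType": "m5.xlarge", "InstanceCount": 1},
            {"Name": "Core", "InstanceRole": "CORE",
             "InstanceType": "m5.xlarge", "InstanceCount": 2},
        ],
        "KeepJobFlowAliveWhenNoSteps": False,  # terminate when steps finish
    },
    JobFlowRole="EMR_EC2_DefaultRole",
    ServiceRole="EMR_DefaultRole",
)
print("Cluster:", response["JobFlowId"])
```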

Read More

How to Enable Mainframe Data Analytics on AWS Using Model9

Mainframe proprietary storage solutions such as virtual tape libraries (VTLs) hold valuable data locked in a platform with complex tools. This can lead to higher compute and storage costs, and it can make it harder to retain existing employees or train new ones. When mainframe data is stored in a cloud storage service, however, it can be accessed by a rich ecosystem of applications and analytics tools. Model9 enables mainframe customers to back up and archive directly to AWS.
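
Model9’s own tooling isn’t shown in this excerpt; purely as a generic illustration of the analytics-ecosystem point, the boto3 sketch below queries data that has landed in S3 using Amazon Athena. The database, table, and bucket names are assumptions, and an Athena table over the migrated data would need to exist first.

```python
# Generic illustration (not Model9-specific): once mainframe data is in S3,
# standard analytics services can read it. All names here are hypothetical.
import boto3

athena = boto3.client("athena", region_name="us-east-1")

query = athena.start_query_execution(
    QueryString="SELECT account_id, SUM(amount) FROM transactions GROUP BY account_id",
    QueryExecutionContext={"Database": "mainframe_archive"},
    ResultConfiguration={"OutputLocation": "s3://my-athena-results/"},
)
print("Query execution ID:", query["QueryExecutionId"])
```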

Read More

How to Reduce AWS Storage Costs for Splunk Deployments Using SmartStore

It can be overwhelming for organizations to keep pace with the amount of data being generated by machines every day. There’s a great deal of meaningful information that can be extracted from that data, but companies need software vendors to develop tools that help. In this post, learn about Splunk SmartStore and how it helps customers reduce storage costs in a Splunk deployment on AWS. Many customers use SmartStore to reduce the size of their Amazon EBS volumes and move data to Amazon S3.
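
As a hedged sketch of the storage side of this pattern (not the post’s actual walkthrough), the snippet below provisions the S3 bucket a SmartStore remote volume could point at; on the Splunk side, indexes.conf would then reference it through a remote volume stanza. The bucket name is a placeholder.

```python
# Sketch: creating the S3 bucket that backs a Splunk SmartStore remote volume.
# Bucket name is a placeholder. On the Splunk side, indexes.conf would define
# a volume stanza with storageType = remote and
# path = s3://splunk-smartstore-demo/indexes, referenced via remotePath.
import boto3

s3 = boto3.client("s3", region_name="us-east-1")
bucket = "splunk-smartstore-demo"  # hypothetical

s3.create_bucket(Bucket=bucket)

# Block public access: SmartStore buckets hold indexed machine data
s3.put_public_access_block(
    Bucket=bucket,
    PublicAccessBlockConfiguration={
        "BlockPublicAcls": True,
        "IgnorePublicAcls": True,
        "BlockPublicPolicy": True,
        "RestrictPublicBuckets": True,
    },
)
```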

Read More

Automatically Detect and Protect Sensitive Data in Amazon S3 Using Dataguise DgSecure

Dataguise has spent the last 10 years finding and protecting sensitive data across a range of data stores. DgSecure on AWS is Dataguise’s flagship product, which can scan for sensitive elements in structured and unstructured data and optionally mask or encrypt that data. Organizations can also monitor access to the sensitive data using DgSecure. In this post, we describe the steps to scan objects in Amazon S3 buckets and protect them with masking or encryption.
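
The DgSecure API itself isn’t shown in this excerpt, so the sketch below is a deliberately generic stand-in for the pattern described: scan S3 objects for a sensitive pattern and re-write matches with server-side encryption. The bucket name and regex are illustrative assumptions, and a real product would do far more (classification, masking, reporting).

```python
# Generic stand-in (not the DgSecure API): scan S3 objects for an email-like
# pattern and copy matches back with server-side encryption enabled.
# Bucket name and pattern are illustrative assumptions.
import re
import boto3

s3 = boto3.client("s3")
bucket = "example-data-bucket"  # hypothetical
email_re = re.compile(rb"[\w.+-]+@[\w-]+\.[\w.]+")

paginator = s3.get_paginator("list_objects_v2")
for page in paginator.paginate(Bucket=bucket):
    for obj in page.get("Contents", []):
        body = s3.get_object(Bucket=bucket, Key=obj["Key"])["Body"].read()
        if email_re.search(body):
            # Re-write the object in place, encrypted with SSE-KMS
            s3.copy_object(
                Bucket=bucket, Key=obj["Key"],
                CopySource={"Bucket": bucket, "Key": obj["Key"]},
                ServerSideEncryption="aws:kms",
                MetadataDirective="COPY",
            )
            print("Protected:", obj["Key"])
```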

Read More

Using Amazon CloudFront with Multi-Region Amazon S3 Origins

By leveraging services like Amazon S3 to host content, AWS Competency Partner Cloudar has a cost-effective way to build websites that are highly available. If content is stored in a single Amazon S3 bucket, all of it lives in a single AWS region; to serve content from other regions, you need to route requests to different Amazon S3 buckets. In this post, explore how to accomplish this by using Amazon CloudFront as a content delivery network and Lambda@Edge as a router.
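
A minimal sketch of the router idea, assuming two hypothetical regional buckets: Lambda@Edge supports Python handlers on CloudFront’s origin-request event, and CloudFront must be configured to forward the CloudFront-Viewer-Country header for this routing key to be available.

```python
# Sketch of a Lambda@Edge origin-request handler that routes viewers to a
# regional S3 origin. Bucket names and the country-to-region map are
# hypothetical.

BUCKETS = {
    "US": ("my-site-us-east-1.s3.amazonaws.com", "us-east-1"),
    "DE": ("my-site-eu-west-1.s3.eu-west-1.amazonaws.com", "eu-west-1"),
}
DEFAULT = BUCKETS["US"]

def handler(event, context):
    request = event["Records"][0]["cf"]["request"]
    headers = request["headers"]
    country = headers.get("cloudfront-viewer-country", [{"value": "US"}])[0]["value"]
    domain, region = BUCKETS.get(country, DEFAULT)

    # Point the S3 origin (and Host header) at the chosen regional bucket
    request["origin"]["s3"]["domainName"] = domain
    request["origin"]["s3"]["region"] = region
    headers["host"] = [{"key": "Host", "value": domain}]
    return request
```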

Read More

How to Migrate Mainframe Batch to Cloud Microservices with Blu Age and AWS

While modernizing customer mainframes, the team at Blu Age discovered that batch processing can be a complex aspect of a mainframe migration to AWS. It’s critical to design your AWS architecture to account for the stringent performance requirements of batch workloads, such as intensive I/O, large datasets, and short durations. Let’s explore how to migrate mainframe batch to AWS microservices using Blu Age automated transformation technology.

Read More

Getting the Most Out of the Amazon S3 CLI

Amazon S3 makes it possible to store an unlimited number of objects, each up to 5 TB in size. Managing resources at this scale requires quality tooling. When it comes time to upload many objects, a few large objects, or a mix of both, you’ll want to find the right tool for the job. This post looks at one option that is sometimes overlooked: the AWS Command Line Interface (AWS CLI) for Amazon S3.
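
The post’s specific walkthrough isn’t reproduced here, but a few representative invocations (bucket and paths are placeholders) show the shape of the tool:

```bash
# Copy one large object; the CLI splits it into multipart uploads automatically
aws s3 cp ./backup.tar s3://my-bucket/backups/backup.tar

# Recursively sync a directory, uploading only new or changed files
aws s3 sync ./site s3://my-bucket/site --delete

# Tune parallelism for many-object workloads (values are illustrative)
aws configure set default.s3.max_concurrent_requests 20
aws configure set default.s3.multipart_chunksize 64MB
```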

Read More