AWS Partner Network (APN) Blog
Introduction by Terry Wise, Global Vice President, Channels & Alliances at AWS
What can your data do for you? More importantly, how can insights derived from your data help you drive additional value for end customers?
Our APN partners offer services and solutions that complement what AWS has to offer. As an example, many customers are choosing to build a data lake on AWS. NorthBay is a Big Data Competency Consulting partner that helped architect and implement a data lake on AWS for Eliza Corporation. You can read details of the solution they built here. Today, I want to tell you a bit about four of our AWS Big Data Competency ISVs and what makes them unique: Alteryx, Databricks, SnapLogic, and Treasure Data.
AWS Big Data Competency Holder in Data Integration
How is your time spent when you embark on a new data analytics project? For many, the time required to gather, prepare, and process their data cuts into the time they can spend actually analyzing and learning from their data. Alteryx’s mission is to change the game for these analysts through the company’s self-service data analytics platform. “Alteryx Analytics provides analysts the unique ability to easily prep, blend, and analyze all of their data using a repeatable workflow, then deploy and share analytics at scale for deeper insights in hours, not weeks. Analysts love the Alteryx Analytics platform because they can connect to and cleanse data from data warehouses, cloud applications, spreadsheets, and other sources, easily join this data together, then perform analytics – predictive, statistical, and spatial – using the same intuitive user interface, without writing any code,” says Bob Laurent, VP of product marketing at Alteryx. The company’s products are used by a number of AWS customers, including Chick-fil-A, Marketo, and The National Trust.
Alteryx integrates with Amazon Redshift and provides support for Amazon Aurora and Amazon S3. Using Alteryx on AWS, users can blend data stored in the AWS Cloud, such as data stored in Redshift, with data from other sources using Alteryx’s advanced analytic workflow. Earlier this year, the company virtualized its Alteryx Server platform to make it easy for users to deploy on AWS through the AWS Marketplace. “Organizations can deploy our Alteryx Server platform in the AWS Cloud within minutes, while maintaining the enterprise-class security and scalability of our popular on-premises solution. This gives organizations a choice for how they want to quickly share critical business insights with others in their organization,” explains Laurent.
See Alteryx in action by downloading a free 14-day trial of Alteryx Designer here, or launch Alteryx Server from the AWS Marketplace here. If you’re interested in becoming an Alteryx Partner, click here. To learn more about Alteryx, visit the company’s AWS-dedicated site.
AWS Big Data Competency Holder in Advanced Analytics
Are you looking for an efficient way to run Apache® Spark™ as you seek to create value from your data and build a sophisticated analytics solution on AWS? Then take a look at Databricks, founded by the team who created the Apache Spark project. “Databricks provides a just-in-time data platform, to simplify data integration, real-time experimentation, and robust deployment of production applications,” says John Tripier, Senior Director, Business Development at Databricks. The company’s mission is to help users of all types within an organization, from data scientists to data engineers to architects to business analysts, harness and maximize the power of Spark. Users can also take advantage of a wide range of BI tools and systems that integrate with the platform, including Tableau, Looker, and Alteryx. The company works with companies across a wide range of industries, including Capital One, 3M, NBC Universal, Edmunds.com, Viacom, and LendUp.
Databricks is hosted on AWS, and takes advantage of Amazon EC2 and Amazon S3. “Databricks is a cloud-native platform that deploys Spark clusters within the AWS accounts of our 500+ customers. We leverage the compute, storage, and security resources offered by AWS. We find AWS is a reliable and secure environment and enables fast implementation of infrastructure in regions all over the world,” says Tripier.
Want to give Databricks a spin? The company offers a free trial of their software here. Learn more about the Databricks platform here. And if you’re a Consulting Partner interested in learning more about becoming a Databricks Partner, click here. Databricks deploys in all regions, including AWS GovCloud, and is also an AWS Public Sector Partner.
AWS Big Data Competency Holder in Data Integration
Where does your data come from? For most companies, particularly enterprises, the answer is: a lot of places. SnapLogic is focused on helping enterprises easily connect applications, data, and things between on-premises, cloud, and hybrid environments through its Enterprise Integration Cloud (EIC). True to its name, the company provides Snaps, which are modular collections of integration components built for a specific data source, business application, or technology. “We help customers automate business processes, accelerate analytics, and drive digital transformation,” says Ray Hines, director of strategic partners and ISVs at SnapLogic. The company works with hundreds of customers, including Adobe, Box, and Earth Networks.
The SnapLogic Enterprise Integration Cloud integrates with Amazon Redshift, Amazon DynamoDB, and Amazon RDS. “We provided pre-built integrations with these services because our customers are rapidly adopting them for their cloud data warehousing needs,” explains Hines. The company’s solution can help simplify the onboarding process for Redshift, DynamoDB, and RDS customers. For instance, Snap Patterns provide pre-built data integrations for common use cases and a number of other features (learn more here).
Care to try out SnapLogic for AWS? Click here for a 30-day free trial of SnapLogic Integration Cloud for Redshift or download the data sheet here. You can request a custom demo here. Consulting Partners, learn more about becoming a SnapLogic Partner here.
AWS Big Data Competency Holder in Data Management
Are you a marketer looking for the ability to use data to provide great experiences to end customers? Are you in sales operations and are you looking to create a centralized dashboard for real-time sales data? Give life to your customer data through Treasure Data. “Treasure Data simplifies data management. Our Live Customer Data platform keeps data connected, current, and easily accessible to the people and algorithms that drive business success,” says Stephen Lee, vice president of business development at Treasure Data. “We provide a turnkey solution that collects data from 300+ sources, stores the data at scale, and provides users the tools to analyze and activate their data in their application of choice.” The company works with customers across industries including Grindr, Warner Brothers, and Dentsu.
“We deployed our solution on AWS because of the scalability, reliability, and global footprint of the AWS Cloud and the ability to deploy without having capital expenditures. With AWS, we can easily deploy our solution in new regions. We’ve also found there to be a strong support ecosystem,” says Lee. Treasure Data’s Live Customer Data Platform integrates with Amazon Redshift, Amazon S3, and Amazon Kinesis, along with many other solutions including Tableau, Chartio, Qlik, Looker, and Heroku (see all integrations and learn some nifty integration recipes). Getting started with Treasure Data is easy. “Our Solution Architects work with our new customers to get their initial data sources set up, after which our customers can be up and running in minutes,” explains Lee.
You can request a custom demo here, or simply email the team directly at firstname.lastname@example.org. Consulting Partners interested in becoming a Treasure Data partner can visit the company’s partner page here.
This blog is intended for educational purposes and is not an endorsement of the third-party products. Please contact the firms for details regarding performance and functionality.
This is a guest post by Kelly Boeckman. Kelly is a Partner Manager at AWS who is focused on the Storage segment.
Benjamin Franklin once famously said, “In this world nothing can be said to be certain, except death and taxes.” It’s worth noting, however, that Ben was living in the late 1700s, and was unable to conceive of a third certainty: data growth.
Data deduplication is a critical solution for runaway data growth, as it can reduce the amount of data that needs to be stored or backed up. It also shortens backup windows, dramatically lowers infrastructure costs, and enables cost- and space-efficient data clones for testing, QA, and more. This unlocks a door to the Shangri-La of “Do-More-With-Less.”
So you’ve got your data safely deduped and stored in Amazon Simple Storage Service (Amazon S3), but surely that data can provide more value than just sitting there? Other teams across your organization could undoubtedly derive value from copies of that data. Maybe your developers could create new features or QA teams could more accurately test against your code base if they can test against a whole set of your cloud data.
Today, we want to tell you about an innovative solution in this space from APN Technology Partner StorReduce that can help you do more with your data for less. StorReduce has announced support for cloning on object storage such as S3, including the S3 Standard-Infrequent Access (Standard-IA) storage class. This support allows users and administrators to make copy-on-write copies of objects stored with StorReduce in a rapid, space-efficient manner at petabyte scale.
Who is StorReduce?
StorReduce helps enterprises that store unstructured data in S3 or Amazon Glacier significantly reduce both the amount and the cost of that storage. It also offers enterprises another way to migrate backup appliance data and large tape archives to AWS.
StorReduce’s scale-out deduplication software runs in the cloud or in a data center and scales to petabytes of data. Its variable-block-length deduplication removes redundant blocks before data is stored, so only one copy of each unique block is kept. StorReduce provides throughput of more than 1 gigabyte per second per server for both reads and writes, and a StorReduce cluster can scale to tens or hundreds of servers to provide throughputs of tens to hundreds of gigabytes per second. StorReduce is suitable for deduplicating most data workloads, particularly backup, archive, Hadoop cluster backups, and general unstructured file data. Because StorReduce has read-only endpoints and a cloud-native interface, data that it migrates to the AWS Cloud can be reused with cloud services.
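To make the idea concrete, here is a minimal Python sketch of block-level deduplication with content-defined (variable-length) boundaries. It is purely illustrative; StorReduce's actual chunking and storage engine are far more sophisticated, and every name below is ours, not theirs:

```python
import hashlib

WINDOW = 16           # rolling-window width in bytes
MASK = (1 << 11) - 1  # boundary condition; yields roughly 2 KiB average chunks
MIN_CHUNK = 64        # avoid degenerate tiny chunks

def chunk(data: bytes):
    """Split data at content-defined boundaries using a rolling window sum,
    so chunk edges realign even after insertions earlier in the stream."""
    chunks, start, acc = [], 0, 0
    for i, byte in enumerate(data):
        acc += byte
        if i >= WINDOW:
            acc -= data[i - WINDOW]  # keep acc = sum of the last WINDOW bytes
        if i - start >= MIN_CHUNK and (acc & MASK) == 0:
            chunks.append(data[start:i + 1])
            start = i + 1
    if start < len(data):
        chunks.append(data[start:])  # final partial chunk
    return chunks

class DedupStore:
    """Store each unique block once; objects are lists of block references."""
    def __init__(self):
        self.blocks = {}   # digest -> block bytes (one copy per unique block)
        self.objects = {}  # object key -> ordered list of digests

    def put(self, key, data):
        refs = []
        for blk in chunk(data):
            digest = hashlib.sha256(blk).hexdigest()
            self.blocks.setdefault(digest, blk)  # redundant blocks cost nothing
            refs.append(digest)
        self.objects[key] = refs

    def get(self, key):
        return b"".join(self.blocks[d] for d in self.objects[key])
```

Writing the same data under two keys stores the underlying blocks only once; reading either key reassembles the original bytes from the shared block store.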
Object storage cloning
Storage volume cloning is a long-standing feature of most on-premises NAS and SAN devices—a well-known example is NetApp FlexClone. In these environments, cloning enables fast snapshots of data, creating many additional copies of the data. Depending on the data change rate, deduplication can dramatically reduce the footprint of those extra copies, freeing up storage resources.
StorReduce’s Object Clone brings this feature to S3 and Standard-IA. Leveraging deduplication, a clone can be created simply by copying the references to the unique blocks that have been stored with StorReduce, while leaving the data stored in S3 unchanged.
Copying large amounts of object data may take hours per copy, and without data reduction you pay for every additional copy, regardless of how much of it is unique.
With StorReduce Object Clone, you need only copy a small amount of metadata. Copying petabyte-scale buckets becomes nearly instantaneous, and multiple clones can be created in rapid succession. Additionally, secondary and subsequent copies of the data are low cost, even at petabyte scale.
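Here is a toy Python model (again, illustrative only, not StorReduce's actual design) of why a copy-on-write clone is nearly instantaneous: cloning copies only each object's list of block references, so the work is proportional to the metadata, not the data:

```python
import copy
import hashlib

class DedupBucket:
    """Toy model: each bucket maps object keys to lists of block digests;
    a single shared block store holds each unique block exactly once."""
    block_store = {}  # digest -> bytes, shared by every bucket and clone

    def __init__(self, manifests=None):
        self.manifests = manifests if manifests is not None else {}

    def put(self, key, data, block_size=4096):
        refs = []
        for i in range(0, len(data), block_size):
            blk = data[i:i + block_size]
            digest = hashlib.sha256(blk).hexdigest()
            DedupBucket.block_store.setdefault(digest, blk)
            refs.append(digest)
        self.manifests[key] = refs

    def get(self, key):
        return b"".join(DedupBucket.block_store[d] for d in self.manifests[key])

    def clone(self):
        # Copy-on-write: duplicate only the references (metadata).
        # No object data moves, so the cost is independent of bucket size.
        return DedupBucket(copy.deepcopy(self.manifests))
```

Overwriting an object in the clone adds new blocks and rewrites only the clone's manifest; the source bucket's manifest, and therefore its data, is untouched.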
How StorReduce Object Clone works
Using the StorReduce dashboard to perform a “Clone Bucket” request is as easy as creating a bucket in StorReduce or S3. Simply navigate to the source bucket you want to clone, choose Clone Bucket from the Actions dropdown, and specify the target bucket name; StorReduce handles the rest. It creates a duplicate bucket that is owned by the initiator of the request and that inherits the source bucket’s data.
Clones can also be created programmatically with two API calls, and you can always trace the name of the source bucket. If the source is deleted, this does not affect the operation, the contents of the clone, or access to it.
StorReduce Object Clone provides a variety of benefits, including:
- Increased flexibility
  - Provide cloud-based object storage with the same level of protection available to on-premises file or block data
  - Clone entire datasets at petabyte scale as many times as needed, quickly and affordably
  - Enable efficient, isolated application development and testing plus research and development against full datasets
  - Simplify the cloning of data in object stores, regardless of data deduplication ratios
- Reduced management complexity
  - Administrators can quickly provide clones and reduce human error
  - Subsets of data can be made easily administrable for unbiased application testing
  - Protect against human error or malicious deletion of data in object stores
  - Make time-based clones of entire buckets and assign read-only access to “lock in” protected data at critical points in time
- Reduced expense
  - Space-efficient clones consume less space and require less infrastructure
  - Meet compliance requirements quickly where R&D or development activities require separate pools of data
  - Clone only small amounts of index and metadata, which reduces the clone size
  - Smaller clone size reduces the cost of the infrastructure required to store second and subsequent copies, even at petabyte scale
StorReduce Object Clone provides anyone who is working with a large dataset in S3 (whether that data deduplicates or not) the ability to try experiments and to fail and iterate quickly. Common use cases include:
- Big data, Internet of Things, research – Clone petabyte-scale datasets so that researchers can work independently of each other in their own scratch areas.
- IT operations – Test new versions of software against your dataset in isolation.
- Developers, software testers – Clone your test datasets so that developers and testers can work with the whole dataset in isolation, not just a small subset. Roll the state back after failures and retest rapidly.
- Software quality assurance – Take a lightweight clone of an entire system at the point that it fails a test and hand it back to the developer.
- Hadoop, big data – Take point-in-time snapshots of the state of your cluster and roll back after problems.
To learn more about how AWS can help with your storage and backup needs, see the Storage and Backup details page.
This blog is not an endorsement of a third-party solution. It is intended for informational purposes.
Come meet AWS team members and fellow APN Partners at the third annual AWS India Partner Summit, hosted in Mumbai. The event is free and exclusive to APN Partners.
The event begins with breakfast, registration, and networking opportunities at 10 am. The keynote kicks off at 11 am and features Terry Wise, Vice President, Global Alliances, Ecosystem and Channels, AWS. Following the keynote are a number of sessions running until 4:15 pm. The day closes with APN excellence awards and another opportunity to network with your industry peers and with AWS team members.
Click here to learn more. We can’t wait to see you there!
By Carmen Puccio and Mandus Momberg. Carmen and Mandus are AWS Partner Solutions Architects focused on Migration.
It’s no secret that migrating software and services from an on-premises environment to the cloud entails unique considerations and requirements. To provide confidence in the outcome of your migration, your migration strategy needs to scale easily. This means that a large part of your workflow must be automated. There is no shortage of documentation on why automation in the cloud is important. In this post, we will show you how to perform an automated migration using a solution from AWS Advanced Technology Partner CloudEndure, with a focus on incorporating automated tests so you can be confident that your application is working as expected post-migration.
The migration of a workload from on-premises to AWS requires careful planning and precise execution. There are many different strategies for moving to the cloud, and there are also numerous tools that help facilitate migration. All migration tools share common goals: to facilitate a migration to AWS by minimizing downtime and application workload impact, and to ensure that data loss is minimized.
Customers who want to quickly move their workloads to the cloud typically follow the rehost method, i.e. lift and shift. One of the challenges when executing a rehost is the amount of time it takes to manually confirm that a migrated application is performing as expected. Migrations that incorporate automation and rapid testing pipelines to validate proper migration are not only more likely to succeed but also improve efficiency as you take advantage of repeatable processes and decrease manual verification times.
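As a sketch of the kind of automated post-migration validation we are describing, the harness below probes service endpoints and reports any that don't return the expected status. The check names, URLs, and expected codes are placeholders you would replace with values recorded from your own application before migration:

```python
import urllib.error
import urllib.request

# Hypothetical checks: (name, url, expected HTTP status). Replace these with
# the endpoints and statuses observed from the application pre-migration.
CHECKS = [
    ("web front end", "https://app.example.com/health", 200),
    ("api", "https://api.example.com/v1/status", 200),
]

def probe(url, timeout=5):
    """Return the HTTP status for url (error responses return their code)."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status
    except urllib.error.HTTPError as err:
        return err.code

def failures(results):
    """Given (name, observed, expected) tuples, return the names that failed."""
    return [name for name, observed, expected in results if observed != expected]

# Wiring it together (not executed here, since the endpoints are placeholders):
# results = [(name, probe(url), want) for name, url, want in CHECKS]
# assert not failures(results), f"post-migration checks failed: {failures(results)}"
```

Running this after each cutover turns "manually confirm the application works" into a repeatable, scriptable step in the migration pipeline.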
By Aaron Friedman and Dario Rivera. Aaron is a Healthcare and Life Sciences Partner Solutions Architect with AWS. Dario is a Healthcare and Life Sciences Solutions Architect with AWS.
Changing lives for the better through technology is one of the most compelling reasons why both of us decided to focus our careers on merging software with healthcare and life sciences at AWS. Being able to work with AWS Partner Network (APN) partners and customers who build healthcare solutions that will affect the everyday lives of people is a task that is equal parts challenging and rewarding. And today, we want to tell you about a new challenge that lets you take the next step toward positively impacting human health as well: The Alexa Diabetes Challenge, sponsored by Merck & Co., Inc., Kenilworth, New Jersey, U.S.A., an APN Advanced Technology Partner.
If you’re reading this post, then there is a high probability that you know someone who has been affected by type 2 diabetes. According to the International Diabetes Federation, an estimated 415 million people around the world live with diabetes, approximately 29.1 million of them in the United States. According to the Centers for Disease Control and Prevention, 1.4 million Americans are diagnosed with type 2 diabetes every year. Learning to manage this complex disease can be an overwhelming and isolating experience, especially for those newly diagnosed.
The Alexa Diabetes Challenge, sponsored by Merck & Co., calls on innovators to experiment with creating holistic voice-enabled solutions that are designed to improve the lives of those with type 2 diabetes. This pioneering challenge presents innovators with a unique opportunity to help navigate uncharted territory: experimenting with and developing new solutions for those managing a chronic condition.
Voice-powered experiences, such as those developers build for Amazon Alexa or Amazon Lex, can dramatically improve user experience by giving you the ability to interact with people at a more personal level. Communicating through voice, rather than just on a screen, has the potential to make it easier to empower patients to change habits and improve their overall well-being. We encourage entrants to explore a variety of voice experiences, ranging from Alexa skills to solutions built using the ecosystem of supporting AWS services, such as Amazon Lex, Amazon Polly, and AWS IoT, and other third-party technologies to develop patient-centric solutions that consider the possible impact on the entire healthcare experience.
This challenge will allow innovators like yourself to draw on both Merck’s expertise in drug development, epidemiology, medication adherence, observational research, and patient education programs and Amazon’s expertise in connected devices and voice interfaces.
Five finalists will be chosen to receive $25,000 each and advance to the Virtual Accelerator. At this stage, the finalists will receive expert mentorship as they develop their concepts into working prototypes. In addition, we will host an in-person boot camp for the finalists at Amazon’s Seattle headquarters. At the end of the Virtual Accelerator, finalists will present their prototypes in person to the judges at Demo Day at our AWS Pop-up Loft in NYC. Following Demo Day, the judges will select a grand prize winner to receive $125,000.
How you can take part
Technology is changing the landscape of healthcare, and we’re very excited to work with Merck to advance the care of those with diabetes. We hope you enter the Alexa Diabetes Challenge, sponsored by Merck & Co., and use your creativity and expertise to bring ideas forward that stand to have a great impact on diabetes care for years to come.
Visit the Alexa Diabetes Challenge website to learn more and submit a concept. All submissions for this pioneering work in voice technology are due by May 22, 2017. We’re looking forward to seeing who gets chosen as a finalist and introduces a new voice in diabetes management!
The Alexa Diabetes Challenge is powered by Luminary Labs.
The AWS Service Delivery Program launched in November 2016 with one simple goal at heart: to help customers easily identify APN Partners with a proven track record of delivering specific AWS services and demonstrated expertise in a specific service or skill area. Looking for help migrating a database to Amazon Aurora? Simply visit the Partner Solutions Finder, filter by Product, and you’ll find all of the APN Partners who’ve been validated by the AWS Service Delivery Program as holding expertise in Aurora.
A number of AWS services are included in the program, including many database services, compute services, content delivery services, security services, serverless computing services, and analytics services. The program also highlights APN Partners who deliver workloads in the AWS GovCloud (US) Region. Today, we’re excited to announce the addition of AWS Service Catalog to the program.
AWS Service Catalog + AWS Service Delivery Program = customer success
AWS Service Catalog allows organizations to create and manage catalogs of IT services that are approved for use on AWS. These IT services can include everything from virtual machine images, servers, software, and databases to complete multi-tier application architectures. With the addition of Service Catalog to the Service Delivery Program, customers can easily identify and connect with qualified AWS Consulting Partners who’ve demonstrated expertise with Service Catalog. “AWS Service Catalog is a fantastic tool for companies that want to enable developers to easily launch new AWS environments, but also need to control how environments are launched and configured,” says Stephanie Tayengco, CIO of Logicworks, a Premier APN Consulting Partner and AWS Service Catalog Delivery Partner. “Logicworks helps customers build the AWS CloudFormation stack for the products in their Service Catalog, applying the right constraints and controls so their developers can take advantage of a self-service, fully-automated, and secured environment.”
Why should APN Consulting Partners with expertise in Service Catalog join?
Joining the program provides you the ability to promote your firm as an AWS-validated expert in Service Catalog. By becoming an AWS Service Catalog Service Delivery Partner, you can increase your firm’s visibility to customers seeking your type of expertise through a number of channels, such as the AWS Service Delivery website. Additionally, you’ll be distinguished as an expert in Service Catalog in the Partner Solutions Finder and will be featured on the AWS Marketplace website. “Wipro is excited to achieve the AWS Service Delivery Partner status for the AWS Service Catalog service. With the help of AWS Service Catalog, customers can centrally manage commonly deployed IT services to improve service re-use and enhance developer productivity. Additionally, the central management of the IT Service Lifecycle can help save time and dollars which can be funneled to drive the cloud innovation agenda for digital enterprises,” says Varun Dube, General Manager – AWS practice, Wipro Ltd, a Premier APN Consulting Partner and AWS Service Catalog Delivery Partner.
What are the requirements?
In addition to meeting the minimum requirements of the program, which you can find here, your firm must pass service-specific verification of customer references and a technical review, so that customers can be confident they are working with partners that have recent and relevant experience.
Want to learn more?
Click here to learn more about the partners participating in the program, and visit the Service Delivery Program homepage to learn more about the broader program. If you would like to apply for membership, contact email@example.com.
Join APN launch partners Wipro, Flux7, Cloudticity, Logicworks, and BizCloud Experts and apply to be a part of the AWS Service Catalog Service Delivery program.
Since the inception of AWS Marketplace, we’ve prioritized customer and seller feedback as our starting point for driving Marketplace improvements. In response to customers and sellers pointing to software as a service (SaaS) as their preferred software delivery mechanism, we launched AWS Marketplace SaaS Subscriptions in November 2016. SaaS Subscriptions enables sellers to offer their SaaS solutions directly to AWS customers, with all charges consolidated on the customer’s bill alongside other services bought directly from AWS or through AWS Marketplace.
Our goal is to continue to enable SaaS sellers and drive additional value for customers. Today, we’re excited to announce the launch of AWS Marketplace SaaS Contracts, which allows sellers to offer monthly, one-year, two-year, and three-year contracts for SaaS and application programming interface (API) products.
What’s the benefit of SaaS Contracts for customers?
This new capability gives AWS customers more options and flexibility in how they procure software through AWS Marketplace. Customers can use a shopping-cart-like experience to determine the number of included units and the duration of their contract. Customers can take advantage of potential cost savings from longer-term contracts and can expand their subscriptions at any time. Customers that purchase monthly contracts can move to a one-, two-, or three-year contract term as needed, and can take advantage of automatic, configurable renewals. Customers can now subscribe to over 70 SaaS products, giving them even greater selection.
How does this impact sellers?
SaaS Contracts provides sellers even more options for monetizing their solutions for AWS customers. In addition to the pay-as-you-go options provided by SaaS Subscriptions, sellers can now provide services that require up-front payment or offer discounts for committed usage amounts. Sellers can offer customers a monthly option, which is good for customers that want to test software before making a longer commitment; from there, a buyer can easily upgrade to a one-, two-, or three-year contract term. Simple auto-renewal options make it easier to manage your ongoing relationship with customers. After creating a new contract, Marketplace buyers are passed to the seller’s website along with an encrypted token containing their customer identifier and product code. This experience is identical to the registration process for AWS Marketplace SaaS Subscriptions. Sellers use the customer identifier to check the customer’s entitlement by calling the AWS Marketplace Entitlement Service at any time, meaning sellers can rely on AWS Marketplace to serve as their primary entitlement store.
How do I get started as a seller?
We’ve made it simple for you to deliver your solution as a SaaS offering through AWS Marketplace. Once you have established your AWS Marketplace Seller account, you’ll need to select your billing dimension. You can choose from the existing options (users, hosts, data, units, tiers, bandwidth or requests) or request an additional dimension. You can also define multiple price points (called variants) within this dimension (for example, admin, power, and read-only users within the user category). To get started with your listing, log into the AWS Marketplace Management Portal and navigate to the “Listings” tab. To create a new SaaS listing, download and fill out the product load form for SaaS Contracts. Define your category, variants, pricing, and other listing data and submit it to AWS Marketplace once you are ready. You will receive a limited, preview version of your listing to test against before the listing is published live to the AWS Marketplace website.
Next, you’ll need to modify your registration page to receive the token containing the customer identifier and product code. You’ll also have to modify your application to call the new AWS Marketplace Entitlement Service to check the size and duration of your customer’s contract. You can download the AWS software development kit (SDK), which will help you format your metering records, in any of the AWS-supported languages. As a final step, you can choose to listen for notifications on an Amazon Simple Notification Service (Amazon SNS) topic for when your customers modify their contract. You can find more information about the steps necessary to modify your application in the AWS Marketplace SaaS Seller Integration guide, or reach out to your AWS Category Manager to connect with a solutions architect who can help you with the process.
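To sketch what the entitlement check might look like on the seller side, the helper below sums a customer's unexpired units for one pricing dimension. The record layout here is a simplified stand-in for what the Entitlement Service returns; consult the AWS Marketplace SaaS Seller Integration guide for the authoritative request and response shapes:

```python
from datetime import datetime, timezone

def active_units(entitlements, dimension, now=None):
    """Sum the units a customer currently holds for one pricing dimension,
    skipping entitlements whose contract term has lapsed. The dict layout
    is a simplified stand-in for an entitlement record."""
    now = now or datetime.now(timezone.utc)
    total = 0
    for ent in entitlements:
        if ent["Dimension"] != dimension:
            continue  # a different pricing dimension (e.g. admins vs. users)
        if ent["ExpirationDate"] <= now:
            continue  # contract term has lapsed
        total += ent["Value"]["IntegerValue"]
    return total

# Example: a live 10-user contract, an expired 5-user contract, and a
# 2-admin contract, checked as of May 2017.
records = [
    {"Dimension": "users", "Value": {"IntegerValue": 10},
     "ExpirationDate": datetime(2018, 5, 1, tzinfo=timezone.utc)},
    {"Dimension": "users", "Value": {"IntegerValue": 5},
     "ExpirationDate": datetime(2017, 4, 1, tzinfo=timezone.utc)},
    {"Dimension": "admins", "Value": {"IntegerValue": 2},
     "ExpirationDate": datetime(2018, 5, 1, tzinfo=timezone.utc)},
]
as_of = datetime(2017, 5, 1, tzinfo=timezone.utc)
# active_units(records, "users", now=as_of) evaluates to 10
```

Because AWS Marketplace acts as the entitlement store, a check like this can run on every login or API call rather than relying on a locally cached license.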
How do I learn more?
At launch, AWS Marketplace SaaS Contracts features products from 20 sellers: Alert Logic, AppDynamics, Box, Cloudberry Labs, CloudHealth, Davra Networks, Device Authority, Flowroute, Informatica, Lucidchart, Mnubo, NetApp, Pitney Bowes, Simularity, Splunk, SumoLogic, ThingLogix, Threat Stack, Trend Micro, and TSOLogic. Over the next few months, we expect more than a dozen additional sellers to release products. Visit here to see all the SaaS products available on AWS Marketplace.
To learn more about selling your product as a SaaS solution, or how to modify your product to become a SaaS solution, be sure to visit https://aws.amazon.com/marketplace/management/tour/.
Are you an APN Partner focused on helping government, education, or nonprofit customers around the world? Or are you just getting started on AWS and looking to learn more about AWS in the public sector?
Join us for the AWS Public Sector Summit, taking place June 12 – 14 in Washington, DC.
This three-day event features over 100 breakout sessions on a wide range of topics, two keynotes with a lineup of global speakers, a pre-day (June 12) with bootcamps and deep-dive sessions, and numerous opportunities to network with AWS customers, AWS staff, and fellow APN Partners.
Save your spot today and get ready to spend three action-packed days with the innovators who are changing the world with cloud computing. Learn more on the AWS Government, Education, & Nonprofits Blog.
Learn about Sponsorship Opportunities – Click Here
Interested in learning more about becoming an AWS Public Sector Partner? Click here!
By Mark Stephens. Mark is a Partner Solutions Architect who focuses on M&E.
The Media and Entertainment industry is made up of companies involved in the creation, production, marketing, distribution, and monetization of content. We find that Media and Entertainment companies are increasingly taking advantage of the scalability, elasticity, and security of the AWS Cloud to enable their businesses in new ways.
Today, I’m going to recap what’s been happening in the Media and Entertainment industry, some of the AWS services that customers in particular segments use, how APN Partners are driving customer success in the segment, recent announcements from AWS that impact the Media and Entertainment space, and upcoming Media and Entertainment events.
To kick off, let’s review how we look at the Media and Entertainment industry and how we segment the workloads:
AWS Digital Media Competency Partner Solutions
AWS Digital Media Competency Partners have demonstrated success in building solutions to create, manage and distribute digital media and entertainment content. These Competency Partners give you access to innovative, cloud-based solutions for digital media projects and workflow.
Content Acquisition, Creation, and Processing: Acquisition, Editing, Mastering, VFX, and Rendering
The movie and entertainment industries are shifting content production and post-production workloads to AWS to take advantage of highly scalable, elastic, and secure cloud services that reduce capital infrastructure investment. Production and post-production companies are migrating VFX rendering and other compute-intensive workloads to AWS to shorten content production times and foster collaboration with contributors from around the world. I’d like to highlight just a few ways companies can take advantage of AWS services to drive success:
- The AWS global infrastructure allows studios to globally collaborate and bring the content to the talent.
- Multi-tiered storage, including Amazon Simple Storage Service (Amazon S3), S3 Standard-Infrequent Access, and Amazon Glacier (now with retrieval options including expedited retrievals in 1-5 minutes), allows for efficient storage and cost management of large media libraries. Amazon S3 supports massively scalable ingest, and the elasticity of cloud storage helps satisfy the ever-increasing demand for storage.
- Pay-as-you-go pricing for render software, combined with Amazon EC2 Spot Instances, provides a new way to render cost-effectively in the cloud.
- In a multi-tenant environment, multiple in-house networks can share infrastructure for activities like ingest and transcode rather than each maintaining its own. Services including Amazon DynamoDB, Amazon Cognito, Amazon S3, IAM roles, and resource tagging can help companies as they build for multi-tenancy. Amazon Kinesis, Amazon EMR, Amazon Redshift, and Amazon QuickSight can be used for analytics or as the basis of a chargeback model.
- Amazon Simple Queue Service (Amazon SQS) helps companies build fault-tolerant, scalable processing pipelines. FIFO queues are a recent addition that provide first-in, first-out delivery and exactly-once processing.
- Using AWS CloudFormation helps companies automate the deployment and management of AWS infrastructure and reduce human error.
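To make the FIFO queue point above concrete, here is a minimal sketch of publishing render jobs with exactly-once semantics. The queue name `render-jobs.fifo` and the job fields are hypothetical examples (FIFO queue names must end in `.fifo`); the helper derives a content-based deduplication ID so that retried sends within the SQS deduplication window are delivered only once.

```python
import hashlib
import json


def build_fifo_message(job: dict, group_id: str) -> dict:
    """Build SendMessage parameters for an SQS FIFO queue.

    MessageGroupId preserves first-in, first-out ordering within a group;
    MessageDeduplicationId (here, a hash of the message body) lets SQS
    discard duplicate sends of the same job within the 5-minute
    deduplication window.
    """
    body = json.dumps(job, sort_keys=True)  # canonical form so equal jobs hash equally
    return {
        "MessageBody": body,
        "MessageGroupId": group_id,
        "MessageDeduplicationId": hashlib.sha256(body.encode()).hexdigest(),
    }


# With AWS credentials configured, the message would be sent like this:
# import boto3
# sqs = boto3.client("sqs")
# queue_url = sqs.get_queue_url(QueueName="render-jobs.fifo")["QueueUrl"]
# sqs.send_message(QueueUrl=queue_url, **build_fifo_message(
#     {"scene": "shot_042", "frame": 17}, group_id="shot_042"))

# The same job (regardless of key order) yields the same deduplication ID,
# so a retry of the same send is treated as a duplicate by SQS.
params_a = build_fifo_message({"scene": "shot_042", "frame": 17}, "shot_042")
params_b = build_fifo_message({"frame": 17, "scene": "shot_042"}, "shot_042")
assert params_a["MessageDeduplicationId"] == params_b["MessageDeduplicationId"]
```

Grouping by scene (one `MessageGroupId` per shot, in this sketch) keeps frames of a shot ordered while still letting SQS process different shots in parallel.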
I want to quickly call out Thinkbox Software, recently acquired by AWS. Thinkbox is extending Deadline, its render pipeline manager, to the cloud so that any studio, large or small, can seamlessly manage on-premises resources alongside cloud scale. Thinkbox’s Deadline software enables content creators to shorten production times and accelerate the creative process by automating the management and scaling of rendering jobs in the AWS Cloud. The Thinkbox Store allows for pay-as-you-go rendering using Redshift, Mental Ray, Maxwell, Nuke, and others.
APN Partners, don’t miss your opportunity to join us at the AWS Partner Summit – Seoul on April 19th, 2017! This event is free to attend, exclusive to APN Partners, and will focus on providing attendees with APN program updates and detailed information and guidance on building a successful cloud-based business.
The day kicks off at 9:00 am with two tracks to choose from: Track one is tailored to APN Partners looking to expand their AWS-based business, while Track two focuses on APN Partners just getting started. After a break from 10:20 to 10:40, the day continues with an opening address and then a keynote from Terry Wise, Vice President, Global Alliances, Ecosystem and Channels, AWS. Following the keynote are closing remarks and the APN Partner awards. The day will close with a networking lunch from 12:00 to 13:00.
Click here to learn more about the benefits of attending the AWS Partner Summit – Seoul.
Are you interested in joining us? It’s not too late to register!