AWS Official Blog
Building world-class games is a very difficult, time-consuming, and expensive process. The audience is incredibly demanding. They want engaging, social play that spans a wide variety of desktop, console, and mobile platforms. Due to the long lead time inherent in the game development and distribution process, the success or failure of the game can often be determined on launch day, when pent-up demand causes hundreds of thousands or even millions of players to sign in and take the game for a spin.
Behind the scenes, the development process must be up to this challenge. Game creators must be part of a team that includes developers with skills in storytelling, game design, physics, logic design, sound creation, graphics, visual effects, and animation. If the game is network-based, the team must also include expertise in scaling, online storage, network communication and management, and security.
With development and creative work that can take 18 to 36 months, today’s games represent a considerable financial and reputational risk for the studio. Each new game is a make-or-break affair.
New AWS Game Services
Today I would like to tell you about a pair of new AWS products that are designed for use by professional game developers building cloud-connected, cross-platform games. We started with several proven, industry leading engines and developer tools, added a considerable amount of our own code, and integrated the entire package with our Twitch video platform and community, while also mixing in access to relevant AWS messaging, identity, and storage services. Here’s what we are announcing today:
Lumberyard – A game engine and development environment designed for professional developers. A blend of new and proven technologies from CryEngine, Double Helix, and AWS, Lumberyard simplifies and streamlines game development. As a game engine, it supports development of cloud-connected and standalone 3D games, with support for asset management, character creation, AI, physics, audio, and more. On the development side, the Lumberyard IDE allows you to design indoor and outdoor environments, starting from a blank canvas. You (I just promoted you to professional game developer) can take advantage of built-in content workflows and an asset pipeline, editing game assets in Photoshop, Maya, or 3ds Max and bringing them into the IDE afterward. You can program your game in the traditional way using C++ and Visual Studio (including access to the AWS SDK for C++) or you can use our Flow Graph tool and the cool new Cloud Canvas to create cloud-connected gameplay features using visual scripting.
Amazon GameLift – Many modern games include a server or backend component that must scale in proportion to the number of active sessions. Amazon GameLift will help you to deploy and scale session-based multiplayer game servers for the games that you build using Lumberyard. You simply upload your game server image to AWS and deploy the image into a fleet of EC2 instances that scales up as players connect and play. You don’t need to invest in building, scaling, running, or monitoring your own fleet of servers. Instead, you pay a small fee per daily active user (DAU) and the usual EC2 On-Demand rates for the compute capacity, EBS storage, and bandwidth that your users consume.
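To get a feel for how the per-DAU fee adds up, here is a quick back-of-envelope sketch in Python. The $1.50-per-1,000-DAU rate matches the pricing quoted later in this post; the function itself is purely illustrative, and EC2, EBS, and bandwidth charges would come on top at the usual On-Demand rates.

```python
# Back-of-envelope Amazon GameLift DAU fee: $1.50 per 1,000 daily active
# users per month (the rate quoted in this post). EC2, EBS, and bandwidth
# charges are billed separately and are not estimated here.

def monthly_dau_fee(daily_active_users, rate_per_thousand=1.50):
    """Return the monthly GameLift DAU fee in dollars."""
    return daily_active_users / 1000 * rate_per_thousand

# A game with 50,000 daily active users pays $75/month in DAU fees.
print(monthly_dau_fee(50000))  # 75.0
```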
Twitch Integration – Modern gamers are a very connected bunch. When they are not playing themselves, they like to connect and interact with other players and gaming enthusiasts on Twitch. Professional and amateur players display their talents on Twitch and create large, loyal fan bases. In order to take this trend even further and to foster the establishment of deeper connections and stronger communities, games built with Lumberyard will be able to take advantage of two new Twitch integration features. Twitch ChatPlay allows you to build games that respond to keywords in a Twitch chat stream. For example, the audience can vote to have the player take the most desired course of action. Twitch JoinIn allows a broadcaster to invite a member of the audience into the game from within the chat channel.
These services, like many other parts of AWS, are designed to allow you to focus on the unique and creative aspects of your game, with an emphasis on rapid turnaround and easy iteration so that you can continue to hone your gameplay until it reaches the desired level of engagement and fun.
Support Services – As the icing on this cake, we are also launching a range of support options including a dedicated Lumberyard forum and a set of tutorials (both text and video). Multiple tiers of paid AWS support are also available.
Developing with Lumberyard
Lumberyard is at the heart of today’s announcement. As I mentioned earlier, it is designed for professional developers and supports development of high-quality, cross-platform games. We are launching with support for the following environments:
- Windows – Vista, Windows 7, 8, and 10.
- Console – PlayStation 4 and Xbox One.
Support for mobile devices and VR headsets is in the works and should be available within a couple of months.
The Lumberyard development environment runs on your Windows PC or laptop. You’ll need a fast, quad-core processor, at least 8 GB of memory, 200 GB of free disk space, and a high-end video card with 2 GB or more of memory and DirectX 11 compatibility. You will also need Visual Studio 2013 Update 4 (or newer) and the Visual C++ Redistributable package for Visual Studio 2013.
The Lumberyard Zip file contains the binaries, templates, assets, and configuration files for the Lumberyard Editor. It also includes binaries and source code for the Lumberyard game engine. You can use the engine as-is, you can dig in to the source code for reference purposes, or you can customize it in order to further differentiate your game. The Zip file also contains the Lumberyard Launcher. This program makes sure that you have properly installed and configured Lumberyard and the third-party runtimes, SDKs, tools, and plugins.
The Lumberyard Editor encapsulates the game under development and a suite of tools that you can use to edit the game’s assets.
The Lumberyard Editor includes a suite of editing tools (each of which could be the subject of an entire blog post) including an Asset Browser, a Layer Editor, a LOD Generator, a Texture Browser, a Material Editor, Geppetto (character and animation tools), a Mannequin Editor, Flow Graph (visual programming), an AI Debugger, a Track View Editor, an Audio Controls Editor, a Terrain Editor, a Terrain Texture Layers Editor, a Particle Editor, a Time of Day Editor, a Sun Trajectory Tool, a Composition Editor, a Database View, and a UI Editor. All of the editors (and much more) are accessible from one of the toolbars at the top.
In order to allow you to add functionality to your game in a selective, modular form, Lumberyard uses a code packaging system that we call Gems. You simply enable the desired Gems and they’ll be built and included in your finished game binary automatically. Lumberyard includes Gems for AWS access, Boids (for flocking behavior), clouds, game effects, access to GameLift, lightning, physics, rain, snow, tornadoes, user interfaces, multiplayer functions, and a collection of woodlands assets (for detailed, realistic forests).
Coding with Flow Graph and Cloud Canvas
Traditionally, logic for games was built by dedicated developers, often in C++ and with the usual turnaround time for an edit/compile/run cycle. While this option is still open to you if you use Lumberyard, you also have two other options: Lua and Flow Graph.
Flow Graph is a modern and approachable visual scripting system that allows you to implement complex game logic without writing or modifying any code. You can use an extensive library of pre-built nodes to set up gameplay, control sounds, and manage effects.
Flow graphs are made from nodes and links; a single level can contain multiple graphs and they can all be active at the same time. Nodes represent game entities or actions. Links connect the output of one node to the input of another one. Inputs have a type (Boolean, Float, Int, String, Vector, and so forth). Output ports can be connected to an input port of any type; an automatic type conversion is performed (if possible).
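As a rough illustration of how typed ports with automatic conversion might behave, here is a minimal Python sketch. The type names mirror the ones listed above, but the function and conversion table are invented for illustration; this is not Lumberyard’s actual implementation.

```python
# Minimal sketch of a flow-graph link with automatic type conversion
# between an output port and an input port. Illustrative only; not the
# actual Lumberyard Flow Graph API.

CONVERTERS = {
    ("Int", "Float"): float,
    ("Float", "Int"): int,
    ("Int", "String"): str,
    ("Float", "String"): str,
    ("String", "Float"): float,
    ("Bool", "Int"): int,
}

def convert(value, out_type, in_type):
    """Pass a value across a link, converting when the port types differ."""
    if out_type == in_type:
        return value
    conv = CONVERTERS.get((out_type, in_type))
    if conv is None:
        raise TypeError("No conversion from %s to %s" % (out_type, in_type))
    return conv(value)

print(convert(3, "Int", "Float"))      # 3.0
print(convert("2.5", "String", "Float"))  # 2.5
```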
There are over 30 distinct types of nodes, including a set (known as Cloud Canvas) that provides access to various AWS services. These include two nodes that provide access to Amazon Simple Queue Service (SQS), four nodes that provide access to Amazon Simple Notification Service (SNS), seven nodes that provide read/write access to Amazon DynamoDB, one to invoke an AWS Lambda function, and another to manage player credentials using Amazon Cognito. All of the game’s calls to AWS are made via an AWS Identity and Access Management (IAM) user that you configure in Cloud Canvas.
Here’s a node that invokes a Lambda function named DailyGiftLambda:
Here is a flow graph that uses Lambda and DynamoDB to implement a “Daily Gift” function:
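To make the idea concrete, here is a hedged Python sketch of the logic such a “Daily Gift” backend might implement. A plain dictionary stands in for the DynamoDB table and an ordinary function stands in for the Lambda; all names are invented for illustration.

```python
# Sketch of the server-side logic a "Daily Gift" Lambda might implement.
# A dict stands in for the DynamoDB table; names are illustrative.
import datetime

last_claim = {}  # player_id -> date of the last gift claim (the "table")

def claim_daily_gift(player_id, today):
    """Grant a gift only if the player has not already claimed one today."""
    if last_claim.get(player_id) == today:
        return {"granted": False, "reason": "already claimed"}
    last_claim[player_id] = today
    return {"granted": True, "gift": "100 coins"}

day = datetime.date(2016, 2, 9)
print(claim_daily_gift("player-1", day))  # granted on first claim
print(claim_daily_gift("player-1", day))  # denied on the second try
```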
As usual, I have barely scratched the surface here! To learn more, read the Cloud Canvas documentation in the Lumberyard User Guide.
Deploying With Amazon GameLift
If your game needs a scalable, cloud-based runtime environment, you should definitely take a look at Amazon GameLift.
You can use it to host many different types of shared, connected, regularly synchronized games including first-person shooters, survival & sandbox games, racing games, sports games, and MOBA (Multiplayer Online Battle Arena) games.
After you build your server-side logic, you simply upload it to Amazon GameLift. It will be converted to a Windows-based AMI (Amazon Machine Image) in a matter of minutes. Once the AMI is ready, you can create an Amazon GameLift fleet (or a new version of an existing one), point it at the AMI, and your backend will be ready to go.
Your fleets, and the game sessions running on each fleet, are visible in the Amazon GameLift Console:
Your Flow Graph code can use the GameLift Gem to create an Amazon GameLift session and to start the session service.
To learn more, consult the Amazon GameLift documentation.
Last but definitely not least, your games can integrate with Twitch via Twitch ChatPlay and Twitch JoinIn.
As I mentioned earlier, you can create games that react to keywords entered in a designated Twitch channel. For example, here’s a Flow Graph that listens for the keywords red, yellow, blue, green, orange, and violet.
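The keyword-voting idea can be sketched in a few lines of Python. The tallying logic below is purely illustrative and does not use the actual Twitch ChatPlay API:

```python
# Sketch of ChatPlay-style keyword voting: tally color keywords seen in a
# chat stream and pick the winner. Illustrative only; not the Twitch API.
from collections import Counter

KEYWORDS = {"red", "yellow", "blue", "green", "orange", "violet"}

def tally_votes(chat_lines):
    """Count occurrences of each keyword across all chat lines."""
    votes = Counter()
    for line in chat_lines:
        for word in line.lower().split():
            word = word.strip("!?.,")
            if word in KEYWORDS:
                votes[word] += 1
    return votes

chat = ["go RED go", "blue!", "red red", "I vote blue", "green"]
votes = tally_votes(chat)
print(votes.most_common(1)[0])  # ('red', 3)
```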
Pricing and Availability
Lumberyard and Amazon GameLift are available now and you can start building your games today!
You can build and run connected and standalone games using Lumberyard at no charge. You are responsible for the AWS charges for any calls made to AWS services using the IAM user configured in Cloud Canvas, or through calls made using the AWS SDK for C++, along with any charges for the use of GameLift.
Amazon GameLift is launching in the US East (Northern Virginia) and US West (Oregon) regions, and will be coming to other AWS regions as well. As part of the AWS Free Usage Tier, you can run a fleet consisting of one c3.large instance for up to 125 hours per month for a period of one year. After that, you pay the usual On-Demand rates for the EC2 instances that you use, plus the charge for 50 GB / month of EBS storage per instance, and $1.50 per month for every 1,000 daily active users.

— Jeff;
Let’s take a quick look at what happened in AWS-land last week:
New & Notable Open Source
- goad is an AWS Lambda powered, highly distributed, load testing tool.
- python-lambder lets you create and manage scheduled AWS Lambdas from the command line.
- sevenseconds is an AWS account configurator.
- Zappa implements serverless WSGI with AWS Lambda and API Gateway.
- ctl-transcode transcodes videos using AWS.
- AWS is a set of PowerShell scripts, functions, and modules for managing AWS.
- ssc-lambda is a set of SSC Lambda functions for AWS processing.
- rifactor can automatically refactor your AWS Reserved Instances to match your running instances.
- Eureka is an AWS service registry for resilient mid-tier load balancing and failover.
- lamvery is a function-based deployment and management tool for AWS Lambda.
New SlideShare Presentations
- AWS Security Day:
- AWS January 2016 Webinars:
- Introduction to Docker on AWS.
- Amazon Aurora for Enterprise Database Applications.
- Real-World Smart Applications with Amazon Machine Learning.
- Infrastructure as Code.
- Best Practices for Building IoT Backends with AWS IoT and AWS Lambda.
- Cloud Data Migration: 6 Strategies for Getting Data into AWS.
- AWS IoT – Getting Started.
- Getting Started with Big Data: Analytic Options on AWS & Common Use Cases.
- Introduction to Deploying Applications on AWS.
New Customer Success Stories
- CareerBuilder – By automating its software release process using AWS CodePipeline and AWS CodeDeploy, CareerBuilder increases update speed and assures quality code, freeing up developers to focus on the core product.
- edotco Group – edotco Group has achieved availability well in excess of its service level agreement of 99.95 percent and reduced infrastructure costs by at least 50 percent over five years using AWS.
- KeptMe – By launching its service on AWS, KeptMe was able to quickly expand to more than 4,000 schools in nine different countries.
- Open Universities Australia – By moving its collocated data center to AWS, OUA reduced the time required to deliver changes to production from three months to less than two hours, cut costs by up to AU$1 million (US$726,850) over two years, and improved the performance of its websites by up to 20 percent.
- Sokrati – By using AWS, Sokrati reduced the data in its database from 20 terabytes to 2 terabytes and reduced its infrastructure costs by 35 percent.
- Time Inc. – Time Inc. uses AWS Enterprise Support to assist with planning and executing the migration of existing and new applications to AWS.
- 91App – Using AWS has enabled 91App to create and launch digital campaigns in just 24 hours, compared to the several weeks the same processes would have required with a physical IT infrastructure.
- Air Works – By using AWS, Air Works has improved its operational performance by 84 percent and its response times by 160 percent.
- Autodesk – Autodesk can monitor and control the use of hundreds of AWS accounts from a single pane of glass.
- GENALICE – GENALICE uses AWS to run the Population Calling module of its GENALICE MAP Next-Generation Sequencing data analysis suite.
- The Guardian – Guardian News and Media increased the velocity of releases for its digital properties to 40,000 in 2015, up from 25 in 2012, by using AWS.
- Jelly Button Games – Jelly Button Games, an Israeli social gaming company, can grow its business while its AWS environment handles up to one million game server requests a minute.
- Lyft – By using Spot, the startup saves up to 75 percent monthly versus on-demand instances for routine testing processes that do not require the most current or most powerful compute resources.
New YouTube Videos
- Introduction to Amazon Route 53.
- Introduction to AWS Training & Certification.
- Introduction to AWS Snowball.
- Container Day:
- February 11 – Partner Webinar – Finding the Hidden Waste in Your AWS Infrastructure.
- AWS Loft – San Francisco.
- AWS Loft – New York.
- AWS Global Summit Series.
My colleague Mike Stroh is part of our sales training team. He wrote the guest post below to introduce you to our newest AWS training courses.

— Jeff;
We routinely tweak our 3-day AWS technical training courses to keep pace with AWS platform updates, incorporate learner feedback, and reflect the latest best practices.
Today I want to tell you about some exciting enhancements to Developing on AWS. Whether you’re moving applications to AWS or developing specifically for the cloud, this course can show you how to use the AWS SDK to create secure, scalable cloud applications that tap the full power of the platform.
We’ve made a number of updates to the course—most stem directly from the experiences and suggestions of developers who took previous versions of the course. Here are some highlights of what’s new:
- Balance of Concepts and Code – The updated course expands coverage of key concepts, best practices, and troubleshooting tips for AWS services to help students build a mental model before diving into code. Students then use an AWS SDK to develop apps that apply these concepts in hands-on labs.
- AWS SDK Labs – Practice labs are designed to emphasize the AWS SDK, reflecting how developers actually work and create solutions. Lab environments now include EC2 instances preloaded with all required programming language SDKs, developer tools, and IDEs. Students can simply log in and start learning!
- Relevant to More Developers – The additional programming language support helps make the course more useful to both startup and enterprise developers.
- Expanded Coverage of Developer-Oriented AWS Services – The updated course puts more focus on the AWS services relevant to application development. So there’s expanded coverage of Amazon DynamoDB, plus new content on AWS Lambda, Amazon Cognito, Amazon Kinesis Streams, Amazon ElastiCache, AWS CloudFormation, and others.
Here’s a map that will help you to understand how the course flows from topic to topic:
— Mike Stroh, Content & Community Manager
Regular readers of this blog will know that I am a huge fan of Amazon WorkSpaces. In fact, after checking my calendar, I verified that every blog post I have written in the last 10 months has been done from within my WorkSpace. Regardless of my location—office, home, or hotel room—performance, availability, and functionality have all been excellent. Until you have experienced a persistent, cloud-based desktop for yourself you won’t know what you are missing!
Today, I am pleased to be able to tell you about three new features for WorkSpaces, each designed to make the service even more useful:
- Audio-In – You can now make and receive calls from your WorkSpace using popular communication tools such as Lync, Skype, and WebEx.
- High DPI Device Support – You can now take advantage of High DPI displays found on devices like the Surface Pro 4 tablet and the Lenovo Yoga laptop.
- Saved Registration Codes – You can now save multiple registration codes in the same client application.
Being able to make and to receive calls from your desktop can boost your productivity. Using the newest WorkSpaces clients for Windows and Mac, you can make and receive calls using popular communication tools like Lync, Skype, and WebEx. Simply connect an analog or USB audio headset to your local client device and start making calls! This functionality is enabled for all newly launched WorkSpaces; existing WorkSpaces may need a restart. With the launch of this feature, voice communication with headsets is available to you at no additional charge in all regions where WorkSpaces are available today.
When a WorkSpace is created using a custom image, the audio-in updates are applied during the provisioning process and will take some time. To avoid this, you (or your WorkSpaces administrator) can create a new custom image after the updates have been applied to an existing WorkSpace.
High DPI Devices
To support the increasing popularity of high DPI (Full HD, Ultra HD, and QHD+) displays, we added the ability to automatically scale the in-session experience of WorkSpaces to match your local DPI settings. This means that fonts and icon sizes will match your preferred settings on high DPI devices, making the WorkSpaces experience more natural. Simply use the newest WorkSpaces clients for Windows and Mac and enjoy this enhancement immediately.
Saved Registration Codes
Many customers access multiple WorkSpaces spread across several directories and/or regions and would prefer not to have to copy and paste registration codes to make the switch. You can now save up to 10 registration codes within the client application, and switch between them with a couple of clicks. You can control all of this through the new Manage Registrations screen:
To learn more about Amazon WorkSpaces, visit the Amazon WorkSpaces page.

— Jeff;
In the early days of AWS, customers were happy to simply learn about the cloud and its benefits. As they started to learn more, the conversation shifted. It went from “what is the cloud” to “what kinds of security does the cloud offer” to “how can I use the cloud” over the course of just 6 or 7 years. As the industry begins to mature, enterprise and government customers are now interested in putting the cloud to use in a form that complies with applicable standards and recommendations.
For example, National Institute of Standards and Technology (NIST) Special Publication 800-53 (Security and Privacy Controls for Federal Information Systems and Organizations) defines a set of information and security controls that are designed to make systems more resilient to many different types of threats. This document is accompanied by a set of certifications, accreditations, and compliance processes.
New Compliance Offerings
In order to simplify the task of building a system that is in accord with compliance standards of this type, we will be publishing a series of AWS Enterprise Accelerator – Compliance Quick Starts. These documents and CloudFormation templates are designed to help Managed Service Organizations, cloud provisioning teams, developers, integrators, and information system security officers.
The new AWS Enterprise Accelerator – Compliance: Standardized Architecture for NIST 800-53 on the AWS Cloud is our first offering in this series!
The accelerator contains a set of nested CloudFormation templates. Deploying the top-level template takes about 30 minutes and creates all of the necessary AWS resources. The resources include three Virtual Private Clouds (VPCs)—Management, Development, and Production—suitable for running a multi-tier Linux-based application.
The template also creates the necessary IAM roles and custom policies, VPC security groups, and the like. It launches EC2 instances and sets up an encrypted, Multi-AZ MySQL database (using Amazon Relational Database Service (RDS)) in the Development and Production VPCs.
The architecture defined by this template makes use of AWS best practices for security and availability, including the use of a Multi-AZ architecture, isolation of instances between public and private subnets, monitoring & logging, database backup, and encryption.
You also have direct access to the templates. You can download them, customize them, and extract interesting elements for use in other projects.
You can also add the templates for this Quick Start to the AWS Service Catalog as portfolios or as products. This will allow you to institute a centrally managed model, and will help you to support consistent governance, security, and compliance.

— Jeff;
For Episode 134 of the AWS podcast, I spoke with Bob Rogers, PhD, Chief Data Scientist for Big Data Solutions at Intel Corporation. We talked about how Bob entered the field of data science, how to get value from data science projects, and some misconceptions around big data. You can listen to the podcast to learn what skills are needed to have a career as a data scientist, and you can also hear Bob’s tips for those looking to become one. Hear what Intel is doing in the big data and analytics space – from the silicon chip to the cloud, and what big data holds for the future.
That’s the last of my 2015 recordings. We’ll be back with more episodes soon. Thanks for listening!

— Jeff;
PS – Intel asked that we add the following disclaimer:
- Intel technologies’ features and benefits depend on system configuration and may require enabled hardware, software or service activation. Learn more at intel.com, or from the OEM or retailer.
- No computer system can be absolutely secure.
- Statements in this document that refer to Intel’s plans and expectations for the quarter, the year, and the future, are forward-looking statements that involve a number of risks and uncertainties. A detailed discussion of the factors that could affect Intel’s results and plans is included in Intel’s SEC filings, including the annual report on Form 10-K.
- Intel and the Intel logo are trademarks of Intel Corporation in the United States and/or other countries.
My colleague Jon Fritz wrote the blog post below to introduce you to some new features of Amazon EMR.
Today we are announcing Amazon EMR release 4.3.0, which adds support for Apache Hadoop 2.7.1, Apache Spark 1.6.0, Ganglia 3.7.2, and a new sandbox release for Presto (0.130). We have also enhanced our maximizeResourceAllocation setting for Spark and added an AWS CLI Export feature to generate a create-cluster command from the Cluster Details page in the AWS Management Console.
New Applications in Release 4.3.0
Amazon EMR provides an easy way to install and configure distributed big data applications in the Hadoop and Spark ecosystems on managed clusters of Amazon EC2 instances. You can create Amazon EMR clusters from the Amazon EMR Create Cluster Page in the AWS Management Console, the AWS Command Line Interface (CLI), or using an SDK with the EMR API. In the latest release, we added support for several new versions of the following applications:
- Spark 1.6.0 – Spark 1.6.0 was released on January 4th by the Apache Foundation, and we’re excited to include it in Amazon EMR within four weeks of open source GA. This release includes several new features like compile-time type safety using the Dataset API (SPARK-9999), machine learning pipeline persistence using the Spark ML Pipeline API (SPARK-6725), a variety of new machine learning algorithms in Spark ML, and automatic memory management between execution and cache memory in executors (SPARK-10000). View the release notes or learn more about Spark on Amazon EMR.
- Presto 0.130 – Presto is an open-source, distributed SQL query engine designed for low-latency queries on large datasets in Amazon S3 and HDFS. This is a minor version release, with optimizations to SQL operations and support for S3 server-side and client-side encryption in the PrestoS3Filesystem. View the release notes or learn more about Presto on Amazon EMR.
- Hadoop 2.7.1 – This release includes improvements to and bug fixes in YARN, HDFS, and MapReduce. Highlights include enhancements to FileOutputCommitter to increase performance of MapReduce jobs with many output files (MAPREDUCE-4814) and adding support in HDFS for truncate (HDFS-3107) and files with variable-length blocks (HDFS-3689). View the release notes or learn more about Amazon EMR.
- Ganglia 3.7.2 – This release includes new features such as building custom dashboards using Ganglia Views, setting events, and creating new aggregate graphs of metrics. Learn more about Ganglia on Amazon EMR.
Enhancements to the maximizeResourceAllocation Setting for Spark
Currently, Spark on your Amazon EMR cluster uses the Apache defaults for Spark executor settings, which are 2 executors with 1 core and 1GB of RAM each. Amazon EMR provides two easy ways to instruct Spark to utilize more resources across your cluster. First, you can enable dynamic allocation of executors, which allows YARN to programmatically scale the number of executors used by each Spark application, and adjust the number of cores and RAM per executor in your Spark configuration. Second, you can specify maximizeResourceAllocation, which automatically sets the executor size to consume all of the resources YARN allocates on a node and the number of executors to the number of nodes in your cluster (at creation time). These settings create a way for a single Spark application to consume all of the available resources on a cluster. In release 4.3.0, we have enhanced this setting by automatically increasing the Apache defaults for driver program memory based on the number of nodes and node types in your cluster (more information about configuring Spark).
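The effect of maximizeResourceAllocation can be approximated with a small sketch: one executor per node, sized to consume the node’s YARN allocation. The overhead fraction, helper name, and sample numbers below are illustrative assumptions, not EMR’s exact algorithm.

```python
# Sketch of what maximizeResourceAllocation effectively computes: one
# executor per core node, sized to that node's YARN allocation. The 10%
# memory overhead and all names/numbers here are illustrative assumptions,
# not Amazon EMR's actual logic.

def executor_settings(node_count, yarn_memory_mb_per_node, vcores_per_node,
                      overhead_fraction=0.10):
    """Derive Spark executor settings from cluster shape at creation time."""
    usable_mb = int(yarn_memory_mb_per_node * (1 - overhead_fraction))
    return {
        "spark.executor.instances": node_count,
        "spark.executor.cores": vcores_per_node,
        "spark.executor.memory": "%dm" % usable_mb,
    }

# e.g. a 4-node cluster where YARN offers 11,520 MB and 4 vcores per node
print(executor_settings(4, 11520, 4))
```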
AWS CLI Export in the EMR Console
You can now generate an EMR create-cluster command representative of an existing cluster with a 4.x release using the AWS CLI Export option on the Cluster Details page in the AWS Management Console. This allows you to quickly create a cluster using the Create Cluster experience in the console, and easily generate the AWS CLI script to recreate that cluster from the AWS CLI.
Launch an Amazon EMR Cluster with Release 4.3.0 Today
To create an Amazon EMR cluster with 4.3.0, select release 4.3.0 on the Create Cluster page in the AWS Management Console, or use the release label emr-4.3.0 when creating your cluster from the AWS CLI or using an SDK with the EMR API.
— Jon Fritz, Senior Product Manager, Amazon EMR
Early in my career I worked for several companies that developed and shipped (on actual tapes) packaged software. Back in those pre-Internet days, marketing, sales, and distribution were all done on a country-by-country basis. This often involved setting up a field office and hiring local staff, both of which were expensive, time-consuming, and somewhat speculative. Providing prospective customers with time-limited access to trial copies was also difficult for many reasons including hardware and software compatibility, procurement & licensing challenges, and all of the issues that would inevitably arise during installation and configuration.
Today, the situation is a lot different. Marketing, sales, and distribution are all a lot simpler and more efficient, thanks to the Internet. For example, AWS Marketplace has streamlined the procurement process. With ready access to a very wide variety of commercial and open source software products from ISVs, customers can find what they want, buy it, and deploy it to AWS in minutes, with just a few clicks. Because many of the products in AWS Marketplace include a free trial and/or an hourly pricing option, potential large-scale users can take the products for a spin and make sure that they will satisfy their needs.
Support for the Asia Pacific (Seoul) Region
Now that the new Asia Pacific (Seoul) Region is up and running, customers located in Korea, as well as global companies serving Korean end users, can take advantage of the AWS Marketplace. There are now more than 600 products available for 1-click deploy in categories such as Network Infrastructure, Security, Storage, and Business Intelligence.
These products are available under several different pricing plans including free, hourly, monthly, and annual. For companies that already own applicable licenses for the desired products, a BYOL (Bring Your Own License) option is also available.
As I write this, more than 150 products are available for free trials in the Asia Pacific (Seoul) Region!
Several Korean ISVs have already listed their products on AWS Marketplace. Here’s a sampling:
- TMAXSoft – Tibero5 (paid AMI).
- Gruter – Enterprise Tajo (paid AMI).
- PentaSecurity – CloudBric (SaaS).
If you are a software vendor or developer and would like to list your products in AWS Marketplace, please take a look at the Sell on AWS Marketplace information. Customers will be able to launch your products in minutes and pay for them as part of the regular AWS billing system. As a vendor of products that are available in AWS Marketplace, you will be able to discover new customers and benefit from a shorter sales cycle.
Let’s take a quick look at what happened in AWS-land last week:
New & Notable Open Source
- lambda-ec2-switch-timer can automatically stop and start Amazon EC2 instances using AWS Lambda.
- kinesis-deaggregation is a set of AWS Lambda modules for working with the Kinesis Producer Library.
- slackBot is a slackBot featuring AWS Lambda.
- aws-lambda-scala-example-product is an AWS Lambda function in Scala reading events from Amazon Kinesis and writing event counts to DynamoDB.
- popeye will generate an authorized_keys file from users stored in AWS IAM.
- AwsProxy is a proxy for the AWS SDK for PHP.
- aws-s3-mount can mount an s3 folder into a container and export it as a volume.
- backbeam-lambda is a set of development tools for creating web applications based on AWS Lambda.
- rom-dynamo is an AWS DynamoDB adapter for Ruby Object Mapper.
- lambda-refarch-iotbackend is an AWS Lambda reference architecture for creating an IoT backend.
New SlideShare Presentations
- Large-Scale AWS Migrations.
- Introduction to Cloud Computing with Amazon Web Services and Customer Case Study.
- Building an Amazon Datawarehouse and Using Business Intelligence Analytics Tools.
- Build A Website on AWS for Your First 10 Million Users.
New Customer Success Stories
- FINRA – By migrating to AWS, FINRA—the Financial Industry Regulatory Authority—has created a flexible platform that can adapt to changing market dynamics while providing its analysts with the tools to interactively query multi-petabyte data sets.
- Redfin – By using AWS, Redfin can innovate quickly and cost effectively with a small IT staff while managing billions of property records.
- Robinhood – Robinhood’s lean staff, including just two DevOps people, used AWS to create a massively scalable securities trading app with strong built-in security and compliance features that supported hundreds of thousands of users at launch.
- Zynga – By returning to AWS, Zynga is gaining greater agility, lower costs, and the freedom to experiment with new solutions to deliver world-class game experiences.
New YouTube Videos
Upcoming Events at the AWS Loft (San Francisco)
- January 26 – Behind the Scenes with Runscope.
- January 29 – Intro to AWS Lambda.
- January 29 – Container Day:
- Introduction to Docker on AWS.
- Migrating from Fleet to AWS.
- Behind the Scenes with Convox.
- February 3 – General Assembly Meet and Hire Happy Hour.
Upcoming Events at the AWS Loft (New York)
- January 27 – From Monolithic to Microservices: Evolving Architecture Patterns at Gilt.
- January 28 – Building An Automated Security Fabric In The AWS Cloud with Trend Micro.