AWS Database Blog

Amazon DynamoDB at AWS re:Invent 2016—Wrap-Up

Vrutik Ghai is a product manager at Amazon Web Services.

We wrapped up an exciting AWS re:Invent. It was great to interact with current and future Amazon DynamoDB customers and hear their feedback and suggestions.

Multiple re:Invent breakout sessions highlighted DynamoDB. These sessions included deep dives, best practices, and customer talks with real-life examples from industries such as gaming, adtech, and IoT.

In case you missed a session, following are links to the session recordings, along with abstracts to give you an idea of what each session covers. We hope you find these videos useful as you leverage the performance and flexibility of DynamoDB for your applications.

  1. AWS Database State of the Union (DAT320)

    Speaker:
    Raju Gulabani, VP Database Services, AWS
    Session Abstract:
    In this session, Raju Gulabani, vice president of AWS Database Services, discusses the evolution of database services on AWS, covers the new database services and features we launched this year, and shares our vision for continued innovation in this space. We are witnessing unprecedented growth in the amount of data collected, in many different shapes and forms. Storing, managing, and analyzing this data requires database services that scale and perform in ways not possible before. AWS offers a collection of database and data services, such as Amazon Aurora, Amazon DynamoDB, Amazon RDS, Amazon Redshift, Amazon ElastiCache, Amazon Kinesis, and Amazon EMR, to process, store, manage, and analyze data. We also provide an overview of AWS database services and discuss how our customers are using these services today.
  2. Introduction to Managed Database Services on AWS (DAT307)

    Speakers:
    Steve Hunt, Director of Infrastructure, FanDuel
    Alan Murray, Director of Architecture, FanDuel
    Robin Spira, CTO, FanDuel
    Darin Briskman, AWS Database Services
    Session Abstract:
    In this session, we look at questions such as: Which database is best suited for your use case? Should you choose a relational database, a NoSQL database, or a data warehouse for your workload? Would a managed service like Amazon RDS, Amazon DynamoDB, or Amazon Redshift work better for you, or would it be better to run your own database on Amazon EC2? FanDuel has been running its fantasy sports service on Amazon Web Services (AWS) since 2012. We learn best practices and insights from FanDuel’s successful migrations from self-managed databases on EC2 to fully managed database services.
  3. Deep Dive on Amazon DynamoDB (DAT304)

    Speaker:
    Rick Houlihan, Principal TPM, DBS NoSQL
    Session Abstract:
    In this session, we explore Amazon DynamoDB capabilities and benefits in detail and learn how to get the most out of DynamoDB. We go over best practices for schema design with DynamoDB across multiple use cases, including gaming, adtech, IoT, and others. We explore designing efficient indexes, scanning, and querying, and go into detail on a number of recently released features, including JSON document support, DynamoDB Streams, and more. We also provide lessons learned from operating DynamoDB at scale, including provisioning DynamoDB for IoT. (For a quick, hands-on taste of the kind of table and index design this session covers, see the short code sketch at the end of this post.)
  4. Migrating from RDBMS to NoSQL: How PlayStation Network Moved from MySQL to Amazon DynamoDB (DAT318)

    Speakers:
    Nate Slater, Senior Manager, AWS Solutions Architecture
    Benedikt Neuenfeldt, Architect, SIE Inc.
    Aki Kusumoto, VP of NPS Development Department, SIE Inc.
    Session Abstract:
    In this session, we talk about the key differences between a relational database management system (RDBMS) and nonrelational (NoSQL) databases like Amazon DynamoDB. You’ll learn about suitable and unsuitable use cases for NoSQL databases, as well as strategies for migrating from an RDBMS to DynamoDB through a five-phase, iterative approach. See how Sony migrated an on-premises MySQL database to the cloud with Amazon DynamoDB, and see the results of this migration.
  5. Migrating a Highly Available and Scalable Database from Oracle to Amazon DynamoDB (ARC404)

    Speaker:
    Shreekant Mandke, Software Development Manager, Amazon Marketplace
    Session Abstract:
    In this session, we share how the Amazon.com team that owns a document management platform managing billions of critical customer documents migrated from a relational database to a nonrelational one. Initially, the service was built on an Oracle database. As it grew, the team discovered the limits of the relational model and decided to migrate to a nonrelational database. They chose Amazon DynamoDB for its built-in resilience, scalability, and predictability. We provide a template that customers can use to migrate from a relational data store to DynamoDB. We also provide details about the entire process: design patterns for moving from a SQL schema to a NoSQL schema; mechanisms used to transition from an ACID (Atomicity, Consistency, Isolation, Durability) model to an eventually consistent model; migration alternatives considered; pitfalls in common migration strategies; and how to ensure service availability and consistency during migration.
  6. How Toyota Racing Development Makes Racing Decisions in Real Time with AWS (DAT311)

    Speakers:
    Jason Chambers, Toyota Racing Development
    Philip Loh, Toyota Racing Development
    Martin Sirull, AWS
    Session Abstract:
    In this session, we learn how Toyota Racing Development (TRD) developed a robust and highly performant real-time data analysis tool for professional racing. Learn how TRD structured a reliable, maintainable, decoupled architecture built around Amazon DynamoDB as both a streaming mechanism and a long-term persistent data store. In racing, milliseconds matter, and even moments of downtime can cost a race. We see how TRD used DynamoDB together with Amazon Kinesis and Amazon Kinesis Firehose to build a real-time streaming data analysis tool for competitive racing.
  7. Streaming ETL for RDS and DynamoDB (DAT315)

    Speakers:
    Greg Brandt, Data Infrastructure Engineer, Airbnb
    Liyin Tang, Data Infrastructure Engineer, Airbnb
    Session Abstract:
    In this session, Greg Brandt and Liyin Tang, data infrastructure engineers from Airbnb, discuss the design and architecture of Airbnb’s streaming ETL infrastructure, which exports data from RDS for MySQL and DynamoDB into Airbnb’s data warehouse using a system called SpinalTap. We also discuss how we leverage Apache Spark Streaming to compute derived data from tracking topics and database tables, and HBase to provide immediate data access and generate cleanly time-partitioned Hive tables.
  8. How DataXu Scaled Its Attribution System to Handle Billions of Events per Day with Amazon DynamoDB (DAT312)

    Speakers:
    Padma Malligarjunan, AWS
    Yekesa Kosuru, DataXu
    Rohit Dialani, DataXu
    Session Abstract:
    “Attribution” is the marketing term of art for allocating full or partial credit to individual advertisements that eventually lead to a purchase, sign-up, download, or other desired consumer interaction. In this session, DataXu shares how they used DynamoDB at the core of their attribution system to store terabytes of advertising history data. The system is cost-effective and dynamically scales from 0 to 300K requests per second on demand, with predictable performance and low operational overhead.
  9. Cross-Region Replication with Amazon DynamoDB Streams (DAT201)

    Speakers:
    Carl Youngblood, Lead Engineer, Under Armour
    Prahlad Rao, Solutions Architect, AWS
    Session Abstract:
    In this session, Carl Youngblood, Lead Engineer of Under Armour, shares the keys to success as Under Armour implemented cross-region replication with Amazon DynamoDB Streams. The session also includes a quick recap of DynamoDB and its features.
  10. Building Real-Time Campaign Analytics Using AWS Services (DAT310)

    Speakers:
    Radhika Ravirala, Solutions Architect, AWS
    Nabil Zaman, Software Engineer, Quantcast
    Session Abstract:
    In this session, we talk about how Quantcast used AWS services, including DynamoDB, to implement real-time campaign analytics. Quantcast provides its advertising clients the ability to run targeted ad campaigns reaching millions of online users. The real-time bidding for campaigns runs on thousands of machines across the world. When Quantcast wanted to collect and analyze campaign metrics in real time, they turned to AWS to rapidly build a scalable, resilient, and extensible framework. Quantcast used Amazon Kinesis streams to stage data, Amazon EC2 instances to shuffle and aggregate the data, and Amazon DynamoDB and Amazon ElastiCache to build scalable time-series databases. With Elastic Load Balancing and Auto Scaling groups, they are able to set up distributed microservices with minimal operational overhead. This session discusses their use case, how they architected the application with AWS technologies integrated with their existing home-grown stack, and the lessons they learned.
  11. How Fulfillment by Amazon (FBA) and Scopely Improved Results and Reduced Costs with a Serverless Architecture (DAT309)

    Speakers:
    Vlad Vlasceanu, AWS
    Ganesh Subramaniam, AWS
    Brandon Cuff, AWS
    Session Abstract:
    In this session, we share an overview of leveraging serverless architectures to support high-performance, data-intensive applications. Fulfillment by Amazon (FBA) built the Seller Inventory Authority Platform (IAP) using Amazon DynamoDB Streams, AWS Lambda functions, Amazon Elasticsearch Service, and Amazon Redshift to improve results and reduce costs. Scopely shares how they used a flexible logging system built on Amazon Kinesis, Lambda, and Amazon Elasticsearch Service to provide high-fidelity reporting on hot keys in Memcached and DynamoDB, and to drastically reduce the incidence of hot keys. Both of these customers are using managed services and serverless architectures to build scalable systems that can meet projected business growth without a corresponding increase in operational costs.
  12. 6 Million New Registrations in 30 Days: How the Chick-fil-A One App Scaled with AWS (DAT313)

    Speakers:
    Chris Taylor, Director, Customer Experience Architecture, Chick-fil-A
    Andrew Baird, Solutions Architect, AWS
    Session Abstract:
    In this session, Chris Taylor from Chick-fil-A shares how they managed to scale using AWS services. Chris leads the team providing back-end services for the massively popular Chick-fil-A One mobile app that launched in June 2016. Chick-fil-A follows AWS best practices for web services and leverages numerous AWS services, including AWS Elastic Beanstalk, Amazon DynamoDB, AWS Lambda, and Amazon S3. This was the largest technology-dependent promotion in Chick-fil-A history. To ensure their architecture would perform at unknown and massive scale, Chris worked with AWS Support through an AWS Infrastructure Event Management (IEM) engagement and leaned on automated operations to enable load testing before launch.
  13. How Telltale Games Migrated Its Story Analytics from Apache CouchDB to Amazon DynamoDB (DAT316)

    Speakers:
    Zac Litton, VP of Engineering, Telltale Games
    Greg McConnel, Solutions Architect, AWS
    Session Abstract:
    In this session, you’ll learn about Telltale Games’ migration from Apache CouchDB to Amazon DynamoDB, the challenges of adjusting capacity to handle spikes in database activity, and how Telltale Games streamlined its analytics storage to provide new perspectives on player interaction and improve its games. Every choice made in Telltale Games titles influences how your character develops and how the world responds to you. With millions of users making thousands of choices in a single episode, Telltale Games tracks this data and leverages it to build more relevant stories in real time as the season is developed.
  14. Capturing Windows of Opportunity: Real-Time Analytics for Less Than $1000? (DAT208)

    Speaker:
    Craig Stires, Head of Big Data & Analytics APAC, AWS
    Session Abstract:
    In this session, we look at how some AWS customers are using real-time analytics to capture windows of opportunity: a telco with a major promotion, an advertising retargeter with global demands, and a personal IoT provider with a lifestyle solution. We dig deeper into their architecture and look for common patterns that can be used to build a real-time analytics platform in a cost-optimized way. We even see how a light-load, real-time analytics system can be built for less than $1000.
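
If you’d like to experiment with some of the schema design and indexing concepts from the DynamoDB deep dive (session 3) while you watch, the following is a minimal sketch using the AWS SDK for Python (Boto3). The table, attribute, and index names are hypothetical, and the provisioned throughput values are placeholders you would tune for your own workload.

import boto3

# Assumes AWS credentials and a default region are already configured.
dynamodb = boto3.client("dynamodb")

# Hypothetical table from a gaming use case: partition key + sort key,
# plus a global secondary index (GSI) that supports leaderboard-style queries.
dynamodb.create_table(
    TableName="GameScores",
    AttributeDefinitions=[
        {"AttributeName": "PlayerId", "AttributeType": "S"},
        {"AttributeName": "GameTitle", "AttributeType": "S"},
        {"AttributeName": "TopScore", "AttributeType": "N"},
    ],
    KeySchema=[
        {"AttributeName": "PlayerId", "KeyType": "HASH"},    # partition key
        {"AttributeName": "GameTitle", "KeyType": "RANGE"},  # sort key
    ],
    GlobalSecondaryIndexes=[
        {
            "IndexName": "GameTitle-TopScore-index",
            "KeySchema": [
                {"AttributeName": "GameTitle", "KeyType": "HASH"},
                {"AttributeName": "TopScore", "KeyType": "RANGE"},
            ],
            "Projection": {"ProjectionType": "ALL"},
            "ProvisionedThroughput": {"ReadCapacityUnits": 5, "WriteCapacityUnits": 5},
        }
    ],
    ProvisionedThroughput={"ReadCapacityUnits": 5, "WriteCapacityUnits": 5},
)

# Query the GSI for the top 10 scores of a given title, highest score first.
response = dynamodb.query(
    TableName="GameScores",
    IndexName="GameTitle-TopScore-index",
    KeyConditionExpression="GameTitle = :title",
    ExpressionAttributeValues={":title": {"S": "Meteor Blasters"}},
    ScanIndexForward=False,
    Limit=10,
)
print(response["Items"])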