AWS Innovate Data Edition
Get the most out of your data and
drive continuous innovation today


Sessions in 4 languages • Live 1:1 Q&A • Level up your skills • Use cases • Technical demos • Better Together With Intel

 Asia Pacific & Japan

Harness data to its full potential

Today, organizations are managing more data than ever before. Each organization has different data sources, analytics needs, and governance requirements, which are dynamic and can change over time.

At this online conference, learn from AWS experts how a modern data strategy can support your present and future use cases, including the steps to build an end-to-end data solution to store, access, analyze, visualize, and predict with your data. Uncover insights from business leaders on how they are transforming with data to become data-driven organizations.

Build a data-driven organization of tomorrow

Data can be an invaluable source of potential growth. The key is recognizing its inherent value, leveraging it intelligently, and creating a culture that embraces the power of being data-driven. Find out what it takes to be a data-driven organization and get inspired by how other organizations are reinventing with data.

Empowering builders: Modernize, unify, and innovate with data

A modern data strategy gives you a comprehensive plan to manage, access, analyze, and act on data. Join us as AWS experts walk you through the steps, demos, and best practices to modernize, unify, and innovate with your data to drive actionable insights and create new customer experiences.


Get inspired and learn how you can use data to accelerate innovation and drive greater agility and efficiency for your organization. Dive deep into any of the 80+ business and technical sessions led by AWS experts as they share key concepts, business use cases, and best practices to help you save time and costs managing data, eliminate data silos, gain accurate insights faster, and build a strong data foundation for rapid innovation.

Agenda overview
 Download Agenda at a Glance »


  • Opening keynote

    Opening keynote

    Innovate faster: Reinvent your organization with data (Level 100)
    Data is dynamic and comes in different formats, which makes it challenging to extract value. A modern data strategy can help you manage, act on, and react to your data so you can make better decisions, respond faster, and uncover new opportunities. Explore the latest in databases, analytics, and AI/ML, and get insights on how organizations are harnessing the power of data to accelerate innovation. Jumpstart a modern data strategy that allows you to consolidate, store, curate, and analyze data at any scale, as well as share data insights with everyone who needs them.

  • Data-driven organizations track 1

    Data-driven organizations track 1

    About the track

    Get inspired and learn how organizations are using AWS to solve business challenges, optimize business performance, and innovate faster. Start leveraging your data as a strategic asset and reinvent your organization with data today.

    Building a data culture within an organization (Level 100)
    Organizations want to unleash the value of data to increase agility, drive innovation, and improve efficiency. While data is abundant and growing rapidly, just producing or storing a lot of it does not automatically create value. Value is realized by creating a culture and an operating model that uses data to invent on behalf of customers using actionable insights, analytics, and AI and machine learning. However, cultural challenges, outdated governance models, organizational silos, and legacy execution approaches stand in the way of realizing this vision. Join this session to hear from Dr. Chris Marshall, Associate Vice President, IDC Asia Pacific, on what a data culture is, including the components of a good data culture, the dividends from investing in one, and common challenges.

    Speaker: Dr. Chris Marshall, Associate Vice President, Analytics, Big Data and Artificial Intelligence, IDC Asia Pacific
    Duration: 30mins

    Becoming a data-driven organization of tomorrow (Level 100)
    Data is at the core of every business and organization to make informed decisions, look around corners, and take meaningful actions. Join this fireside chat to learn strategies rooted in the first-hand experience of the CXO of UnionBank of the Philippines, and how he works to create a data-driven culture and turn the abundance of available data into better business outcomes.

    John Clarke, Director, Enterprise Strategy, AWS
    David R. Hardoon, Chief Data & AI Officer, UnionBank of the Philippines

    Duration: 30mins

    Driving sustainability with clean energy, AI, and data at Amazon’s Climate Pledge Arena (Level 100)
    Amazon is the world’s biggest corporate buyer of renewable energy and needs to ensure sustainability is at the heart of all of its operations in order to meet its carbon emission targets. One such effort is its work with the Seattle Kraken to build solutions that help make Climate Pledge Arena the most progressive, responsible, and sustainable arena in the world. Join this session to learn how AWS Professional Services and Amazon sustainability teams are using AWS services to ingest and analyze energy, water, and air-quality data. Get insights on how they build real-time forecasting models with data discovery, security, and design patterns at the heart.

    Rahul Sareen, Global Practice Manager, Sustainability, AWS
    Rob Johnson, VP Sustainability & Transportation, Climate Pledge Arena

    Duration: 30mins

    Productize your data to deliver new value and revenue (Level 100)
    Data-driven organizations are well-positioned to productize data and bring new data products to market. Data is critical to any transformation to software as a service (SaaS). Learn how AWS experts partner with customer leaders across business and technology to accelerate their journey to becoming a data-driven SaaS business. In this session, we cover the AWS Data-Driven Everything (D2E) framework, which helps customers build a compelling vision of a data-driven business and accelerate their ability to productize data.

    Speaker: Jason Hunter, Analytics Platform Specialist, AWS
    Duration: 30mins

    Unlock the value of third-party data with AWS Data Exchange (Level 100)
    The ability to transform the ‘currency’ of data in all its various forms into actionable insights is critical in a rapidly changing digital world. In this session, we cover how AWS Data Exchange (ADX) makes it easy to securely find, subscribe to, and use third-party data in the cloud. With ADX, you can streamline all third-party data consumption, from existing subscriptions—which you can migrate at no additional cost to you or the provider—to future data subscriptions, all in one place. ADX enables you to easily download a data set or copy it to Amazon S3 and analyze it with a wide variety of AWS analytics and machine learning services. Learn how to create data products and make them available to millions of AWS customers via AWS Marketplace, and eliminate the need to build and maintain data delivery, licensing, or billing infrastructure.

    Speaker: Fred Groen, Analytics Sales Leader, AWS
    Duration: 30mins

    Modernize customer experience with AI and machine learning (Level 100)
    As customers demand greater choice in how they connect, facilitating a seamless experience across all fronts has become critical for success. Learn how AWS transforms customer experience design and execution by offering options to quickly and easily access disparate data sources in real time to deliver a seamless experience. In this session, we profile how data access allows a complete rethinking of customer experience (CX) by predicting why customers are reaching out. We eliminate the interactive voice response (IVR) through hyper-personalized interaction and enable automation to make customer and service communication more effective and efficient for everyone.

    Speaker: Simon Burke, Principal Connect Specialist, AWS
    Duration: 30mins

    Solving business challenges with your data (Level 100)
    Data-driven insights are delivering greater efficiencies across industries and for citizens. Discover how Intel compute powers AWS platforms to deliver better price performance for data analytics workloads while achieving faster time to insight.

    Speaker: Akanksha Balani, AWS APJ Alliance Head, Global AI & HPC GTM Lead, Intel
    Duration: 30mins

  • Data-driven organizations track 2

    Data-driven organizations track 2

    About the track

    Get inspired and learn how organizations are using AWS to solve business challenges, optimize business performance, and innovate faster. Start leveraging your data as a strategic asset and reinvent your organization with data today.

    Realizing the value of FSI data on AWS (Level 100)
    Financial services institutions are racing to transform their business, innovate, and stay ahead of their competition. They are also required to meet evolving regulatory requirements, manage risk, and build consumer confidence by preventing fraud. Building the right infrastructure to harness existing data and extract insights is essential to providing omni-channel, personalized customer experiences and meeting these challenges. Join Shivani Venkatesh, Head of Insights, Analytics & AI CoE, RBL Bank, and learn how RBL Bank has created an "Early Warning System" for its large loan accounts, as well as used a series of cloud-first digital tools to strengthen customer engagement.

    Speaker: Shivani Venkatesh, Head of Insights, Analytics & AI CoE, RBL Bank
    Duration: 30mins

    Modernize government analytics and unlock digital transformation (Level 100)
    Organizations across the globe including government, education, nonprofit, and healthcare are using data to make informed decisions. In this session, learn how organizations are using AWS cost-effectively for their growing pools of data and paving the way for innovation in world-changing projects.

    Speaker: Eric Conrad, Regional Managing Director, ASEAN Public Sector, AWS
    Duration: 30mins

    Customer 360 in retail with AWS (Level 200)
    Consumer behaviors and expectations have fundamentally changed, compelling retailers to accelerate digital transformation across the value chain. Consumers do not shop linearly; they use many different, disparate channels along their shopping journey to discover and research products—from social media and websites, to email marketing campaigns and targeted ads, as well as actually shopping in a brick-and-mortar store. In this session, learn how to get a 360-degree view of your customers, including their buying behaviors and preferences, so as to tailor their experiences along the purchase journey. Get insights on how AWS empowers you to create exceptional experiences built for the future of retail.

    Speaker: Pierre Semaan, Head of SMB Solutions, AWS
    Duration: 30mins

    Optimize industrial operations and simplify your digital twin journey (Level 200)
    Customers are seeking to deploy digital twins to improve operations, build innovative products, and enhance business value across industries such as manufacturing, utilities, smart buildings, and more. Building and managing digital twins is time-consuming and complex, requiring a scalable workflow to integrate and contextualize data from disparate sources. This session focuses on how AWS IoT TwinMaker makes it faster and easier for customers to drive business value by creating and using digital twins of real-world systems like buildings, factories, industrial equipment, and production lines. Learn how to use data from multiple sources to create virtual representations of any physical environment, and combine existing 3D models with real-world data. Get insights on how to harness digital twins to create a holistic view of your operations, products, and spaces faster and with lower effort.

    Mirela Juravle, IoT GTM Specialist, AWS
    Andra Christie, Senior Solution Architect, IoT & Digital Twin, AWS

    Duration: 30mins

    How Daikin Malaysia builds differentiated air conditioners with AWS IoT (Level 100)
    As the use of smart devices continues to grow, more and more data is being pushed to the cloud, where the latest IoT technologies are enabling new innovations for connected home applications. In this session, learn how Daikin Malaysia embarked on its digital transformation journey to build its smart air conditioner platform on AWS. Find out how AWS IoT services have enabled Daikin Malaysia to harness data from its air conditioners and deliver value for both its business and consumers. Gain insights on how Daikin Malaysia has created a new class of smart services that builds long-term trust and relationships with its customers.

    Lindon Chen, GTM Specialist, IoT & Robotics, APJ, AWS
    Teoh Kuang Yee, Senior Manager, Platform Development, Business Improvement, Daikin Malaysia

    Duration: 30mins

    Improving live streaming user experience with data analytics (Level 200)
    In the video live streaming world, it is important that audiences get the best user experience while watching their favorite content on different devices and network conditions. This session explains how we can make use of data analytics for live streaming to work backwards and improve live streaming user experience.

    Sumit Patel, Solutions Architect, AWS
    Christer Whitehorn, Lead Solutions Architect - Media Services, AWS

    Duration: 30mins

    Creating a modern data strategy to enable data sharing in regulated environments (Level 200)
    Customers that are heavily regulated with sensitive data such as financial services institutions and healthcare often have multiple entities inside a single organization. These entities need to share data internally, across departments, and even publicly. This presents both a technical and governance challenge on how to deliver business outcomes that require access to data across this complex ecosystem. In this session, we explain how to get started in building a modern data strategy that allows you to quickly deliver insights while iterating on the required governance and technical frameworks for future use cases.

    Speaker: Blair Layton, Principal Business Development Manager, AWS 
    Duration: 30mins

  • Data movement, processing, management, and governance

    Data movement, processing, management, and governance

    About the track

    Gain best practices and concepts around data movement, eliminating data silos, and analyzing diverse datasets easily while keeping your data secure. Find out how to easily capture and centralize your data in a quick, cost-effective, and secure fashion using AWS.

    From store to explore - Build a data flywheel for value (Level 100)
    From data movement to storage, management, processing, and analytics, AWS provides purpose-built services with agility, scalability, and cost-effectiveness. In this introductory session, get insights on how a data flywheel brings delightful customer experience with data-driven decisions. Learn the AWS storage and data transfer services to eliminate data silos, and how data processing and governance services help build a modern data architecture.

    Speaker: Lily Jang, Senior Storage Specialist, AWS
    Duration: 30mins

    Black belt tips: Operating at scale with your data in an Amazon S3 data lake (Level 200)
    Simplicity in access, observability, monitoring, automation, protecting data, and cost optimization are key focus areas for customers, especially when operating with data at petabyte scale. In this session, learn from an Amazon Simple Storage Service (Amazon S3) expert how to operate at scale with your data, and receive black belt tips to do more in less time. We cover topics related to your Amazon S3 data lake such as observability and monitoring through Amazon S3 Storage Lens, Amazon CloudWatch metrics and dashboards for Amazon S3, using objects in Amazon S3 to trigger automated workflows, and seamless cost optimization using Amazon S3 Intelligent-Tiering. We also discuss options to protect your data assets for different use cases, using Amazon S3 Object Lock, Amazon S3 Replication, and AWS Backup.

    Speaker: Wali Akbari, Principal Storage Solutions Architect, AWS
    Duration: 30mins
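
    The "objects trigger automated workflows" pattern mentioned above can be sketched without any AWS dependencies. The sketch below is a minimal, hypothetical event handler (the function name is illustrative; only the event shape follows the S3 event notification format) that extracts the affected bucket/key pairs, as a Lambda function reacting to S3 uploads typically would:

```python
# Minimal sketch of an S3-triggered workflow handler. The handler name is
# hypothetical; the nested event structure mirrors S3 event notifications.
def handle_s3_event(event):
    """Return (bucket, key) pairs for every object in an S3 event notification."""
    objects = []
    for record in event.get("Records", []):
        s3 = record["s3"]
        objects.append((s3["bucket"]["name"], s3["object"]["key"]))
    return objects

# Sample event with the shape S3 delivers to a notification target.
sample_event = {
    "Records": [
        {"s3": {"bucket": {"name": "my-data-lake"},
                "object": {"key": "raw/2022/orders.csv"}}}
    ]
}
print(handle_s3_event(sample_event))  # [('my-data-lake', 'raw/2022/orders.csv')]
```

    In a real deployment this function body would run inside an AWS Lambda handler wired to an S3 event notification, with the actual processing (tiering, tagging, downstream ingestion) replacing the return.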

    Easy ways to migrate petabytes of data to Amazon S3 using AWS DataSync (Level 200)
    Organizations are often faced with challenges in migrating vast amounts of data efficiently and effectively from their on-premises data storage environments to AWS. Planning and moving petabytes of HDFS data is a massive task to execute. In this session, learn simple steps to migrate and ingest data at scale to Amazon S3. We share best practices to build the right architecture on AWS for the AWS DataSync service to migrate data in a faster, more secure, and cost-effective manner. We also explain how AWS DataSync can do all the heavy lifting and help ingest the data with ease to Amazon S3.

    Speaker: Ameen Khan, Storage Specialist Solutions Architect, AWS
    Duration: 30mins

    Turn streaming data into real-time insights with AWS serverless technologies (Level 200)
    There is an increasing need to deal with an ever-growing volume of streaming data and the challenges of turning it into near real-time insights. In this session, learn about the data cycle and how to gain insights from data by leveraging cloud-native architectures based on AWS serverless technologies.

    Speaker: Faraz Masood, Senior Cloud Architect, AWS
    Duration: 30mins

    Modernize Spark workloads with Amazon EKS for better price performance (Level 200)
    Customers are becoming more data-driven and leveraging the capabilities provided by containers for agility, portability, and flexibility. In this session, we cover how to run Spark workloads on Amazon Elastic Kubernetes Service (Amazon EKS) to improve resource utilization, cost, and price performance. We also demonstrate how easy it is to migrate your existing workloads to Amazon EMR on Amazon EKS without any application changes.

    Speaker: Melody Yang, Senior Big Data Architect, Amazon EMR, AWS
    Duration: 30mins

    Data governance at scale on AWS (Level 100) 
    Data governance is the key to a data-driven organization. The most successful data-driven organizations align their business strategy, data strategy, and their data governance to deliver value. Join this session to learn how AWS customers are succeeding in aligning people, process, and technology to deliver data governance at scale.

    Speaker: Francis McGregor-Macdonald, Manager, Analytics Specialist Solutions Architects, AWS
    Duration: 30mins

    Data protection fundamentals on AWS (Level 200)
    Protecting data is a fundamental requirement for every customer. In this session, learn how to apply AWS services to secure your data including encrypting data in-transit and at-rest. Find out how to configure appropriate access controls, monitor your data for compliance, and manage data discovery including classification tasks and layering defense-in-depth to protect against malicious behavior. The session also covers the data backup lifecycle and disaster recovery preparedness.

    Speaker: Michael Stringer, Principal Solutions Architect, Security, AWS
    Duration: 30mins

  • Migrate and modernize your databases

    Migrate and modernize your databases

    About the track

    In this track, find out how AWS cloud databases can help you meet your distinct use cases all while delivering operational efficiency, performance, availability, scalability, security, and compliance.

    Modernize your data infrastructure with fully managed purpose-built databases (Level 200)
    Organizations are reinventing their businesses with data, and the first step is data infrastructure modernization. AWS offers a suite of purpose-built databases such as relational, in-memory, key-value, document, time-series, and graph databases to help you at each stage of your data-driven journey. In this session, learn how you can pick the right AWS purpose-built database to meet the scale, performance, and manageability requirements when building your modern data infrastructure and applications.

    Speaker: William Wong, Senior Specialist Solutions Architect, Databases, AWS
    Duration: 30mins

    Optimize performance and scale with AWS managed databases (Level 300)
    Many organizations begin their cloud journey with a lift-and-shift or modernization of applications from on-premises to AWS. Performance gains are often the motivating factor behind a cloud migration. Organizations have been able to innovate faster and serve customers better by migrating to fully managed AWS databases. In this session, we discuss best practices on how to optimize performance and scale seamlessly with AWS managed databases. We cover Amazon DevOps Guru for RDS, an ML-powered capability designed to empower developers and DevOps engineers to quickly detect, diagnose, and remediate a wide variety of database-related issues in Amazon RDS using Amazon RDS Performance Insights.

    Speaker: Roneel Kumar, Senior Relational Databases Specialist Solutions Architect, AWS
    Duration: 30mins

    Run Oracle database workloads with AWS managed database (Level 200)
    Self-managed Oracle workloads can be time-consuming to operate. Organizations are looking at ways to free up their valuable Oracle DBA resources and shift their efforts to activities that are business enablers. The best way is to leverage the power of fully managed databases in the cloud. In this session, we share the considerations for moving self-managed Oracle databases to Amazon RDS for Oracle and explain the most efficient way to do it. We showcase the benefits of a managed database operating model, discuss transformative cost-saving techniques, and illustrate best practice migration patterns. We also demonstrate the migration of an on-premises Oracle database to Amazon RDS for Oracle with minimal downtime. To conclude, we highlight the fundamentals of a successful migration, including the tools and mechanisms available to deliver a smooth transition, plus the additional benefits of using a purpose-built database like Amazon Aurora PostgreSQL-Compatible Edition.

    Roneel Kumar, Senior Relational Databases Specialist Solutions Architect, AWS
    Manash Kalita, Senior Database Solution Architect, AWS

    Duration: 30mins

    Deploy, modernize, and optimize Microsoft SQL Server on AWS (Level 300)
    There is no one-size-fits-all approach to the migration and modernization journey for SQL Server workloads. In this session, learn about the various options to run Microsoft SQL Server on AWS. We share options to lift and shift with Amazon EC2, get a managed experience with Amazon RDS and Amazon RDS Custom, and modernize with AWS purpose-built databases. We explain the benefits of each approach and cover different architectural patterns, strategies, and best practices to migrate, optimize, and modernize SQL Server workloads on AWS. The session also showcases a demo of Babelfish for Aurora PostgreSQL, the new capability for Amazon Aurora PostgreSQL-Compatible Edition that enables Aurora to understand commands from applications written for Microsoft SQL Server.

    Speaker: Rita Ladda, Senior Database Solution Architect, AWS
    Duration: 30mins

    Optimize databases with AWS optimization & licensing assessment (Level 200)
    In complex datacenter exits and cloud migrations, a compelling business case is often required to justify the migrations. Lift-and-shift migrations alone would not give you the optimal database footprint on the cloud. AWS Optimization and Licensing Assessment (OLA) is a great way to start your cloud journey by right-sizing and right-licensing your databases. In this session, learn how OLA saves cost by optimizing your instance and license selection.

    Speaker: Sriwantha Attanayake, Senior Partner Solutions Architect, AWS
    Duration: 30mins

    Building high performance applications at any scale with Amazon DynamoDB (Level 200)
    NoSQL databases are purpose-built for specific data models and optimized for modern applications like mobile, web, and gaming applications that require scalability, low latency, and flexibility. Amazon DynamoDB offers an enterprise-ready database that helps you deliver apps with consistent single-digit millisecond performance and nearly unlimited throughput and storage. In this session, we share how you can save costs while driving the most business impact with multi-region replication using global tables, optimizing for cost with new Amazon DynamoDB table classes, on-demand capacity mode for spiky workloads, and exporting data from your continuous backups to Amazon S3.

    Speaker: Esteban Serna, Senior DynamoDB Specialist Solutions Architect, AWS
    Duration: 30mins
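
    The single-digit-millisecond access the session describes comes from addressing every item by a partition key and sort key rather than by scanning. The sketch below is a pure-Python stand-in for that access pattern (the helper names and the CUSTOMER/ORDER key scheme are illustrative, not the boto3 API); a real table would be reached through boto3's DynamoDB client or resource:

```python
from collections import defaultdict

# Illustrative stand-in for a DynamoDB-style table: items live under a
# partition key (pk) and are ordered by a sort key (sk). Function names
# and the key scheme are hypothetical; only the pattern matches DynamoDB.
table = defaultdict(dict)  # pk -> {sk: item}

def put_item(pk, sk, attrs):
    table[pk][sk] = {"pk": pk, "sk": sk, **attrs}

def query(pk, sk_prefix=""):
    """Return items in one partition whose sort key starts with a prefix,
    in sort-key order -- the shape of a DynamoDB Query with begins_with."""
    return [item for sk, item in sorted(table[pk].items())
            if sk.startswith(sk_prefix)]

put_item("CUSTOMER#42", "ORDER#2022-01-05", {"total": 99})
put_item("CUSTOMER#42", "ORDER#2022-03-17", {"total": 25})
put_item("CUSTOMER#42", "PROFILE", {"name": "Kai"})

print(len(query("CUSTOMER#42", "ORDER#")))  # 2
```

    Designing sort keys so related items share a partition and sort together is what lets a single Query fetch a customer's profile and orders without touching other partitions.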

    Building high-performance and resilient real-time applications with Amazon ElastiCache and MemoryDB for Redis (Level 200)
    Today’s modern applications demand high performance and responsiveness at any scale. In this session, we share how you can build high performance applications that impact revenue, customer experience, and satisfaction using a distributed in-memory data store with Amazon ElastiCache for Redis. Amazon ElastiCache is a fully managed in-memory caching service that accelerates application performance with microsecond latency. Discover how caching can supercharge your workloads, and how to build fast, secure, and highly available applications.

    Speaker: Orlando Andico, Senior Solutions Architect, AWS
    Duration: 30mins
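
    The way a cache "supercharges" a workload is usually the cache-aside pattern: read from the cache first, and fall back to the database only on a miss. A minimal sketch, using a plain dict with TTLs as a stand-in for a Redis cache (names and the TTL value are illustrative; a real deployment would call an ElastiCache for Redis endpoint through a Redis client):

```python
import time

# Cache-aside sketch. The dict stands in for Redis; entries carry an
# expiry timestamp so stale values are refreshed from the database.
cache = {}          # key -> (value, expires_at)
TTL_SECONDS = 60    # illustrative TTL

def slow_database_read(key):
    # Placeholder for a query against the primary database.
    return f"value-for-{key}"

def get(key, now=None):
    now = time.time() if now is None else now
    hit = cache.get(key)
    if hit and hit[1] > now:
        return hit[0]                       # cache hit: served from memory
    value = slow_database_read(key)         # cache miss: read through
    cache[key] = (value, now + TTL_SECONDS) # populate for subsequent reads
    return value

print(get("user:42"))  # value-for-user:42
```

    The TTL bounds staleness; choosing it is a trade-off between read latency savings and how quickly updates in the database become visible.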

  • Unify your data: Accelerate insights track 1

    Unify your data: Accelerate insights track 1

    About the track

    Learn the approaches, tools, and frameworks to break down data silos, unify your data, and make data accessible to everyone who needs it, so they can seamlessly discover, access, and analyze it in a secure and governed way.

    Harness the power of data and choose the right analytics for your use case (Level 100)
    With the right data strategy, organizations can control their growing data, find insights from diverse data types, and make it available to the right people and systems. To achieve this, an organization must be data-driven. In this session, learn how you can put your data to work with the best of both data lakes and purpose-built data stores on AWS. Create new innovations that can help you scale analytics through every job role, process, and application in your organization.

    Speaker: Francis McGregor-Macdonald, Manager, Analytics Specialist Solutions Architects, AWS
    Duration: 30mins

    SQL-based stream processing at scale with Apache Kafka and Apache Flink (Level 300)
    We are witnessing a rapidly growing interest in real-time data processing with streaming data infrastructures like Apache Kafka and Apache Flink. However, streaming data analysis requires specialized skills, such as knowing the Java or Scala APIs and understanding complex stream processing concepts like windows, time, and state. On the other hand, SQL is a widely used language for data processing and is easy to learn. This session discusses how you can use SQL to process real-time data from Apache Kafka using Amazon MSK and Amazon Kinesis Data Analytics for Apache Flink.

    Speaker: Masudur Rahaman Sayem, Senior Analytics Solutions Architect, AWS
    Duration: 30mins
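
    The windowed aggregations that Flink SQL expresses declaratively (for example, a tumbling-window GROUP BY) can be sketched in plain Python to show the underlying concept. This is an illustration of the windowing idea only, not the Flink API; the function name and sample events are hypothetical:

```python
from collections import defaultdict

def tumbling_window_counts(events, window_seconds):
    """Bucket (timestamp, key) events into fixed, non-overlapping windows
    and count events per key per window -- the computation a tumbling-
    window GROUP BY in streaming SQL describes declaratively."""
    counts = defaultdict(int)
    for ts, key in events:
        # Align each event to the start of its window.
        window_start = (ts // window_seconds) * window_seconds
        counts[(window_start, key)] += 1
    return dict(counts)

# Timestamps in seconds; 60-second tumbling windows.
events = [(0, "click"), (5, "click"), (12, "view"), (61, "click")]
print(tumbling_window_counts(events, 60))
# {(0, 'click'): 2, (0, 'view'): 1, (60, 'click'): 1}
```

    A streaming engine does the same grouping continuously and incrementally over unbounded input, emitting each window's result when event time passes the window boundary.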

    Observability made easy with Amazon OpenSearch Service (Level 200)
    Distributed application management can be a challenge. In this session, we introduce Amazon OpenSearch Service, which helps you visualize your trace, log, and metric data and gain comprehensive visibility into your distributed application's interactions, performance, and health in real time. We share how the Amazon OpenSearch observability suite creates the foundation required to improve mean time to detect (MTTD) and mean time to recover (MTTR), cutting through the chaos during an operational event. We also demonstrate the observability functionality in a live environment with a sample distributed microservices application.

    Speaker: Muhammad Ali, Senior Analytics Specialist Solutions Architect, AWS
    Duration: 30mins

    Build highly scalable and extensible SQL-based data pipelines using Amazon Managed Workflows for Apache Airflow and Amazon Redshift (Level 300)
    Data is a vital element of today’s innovative organizations, and it is growing in volume and complexity faster than ever. Traditional data warehouses have rigid architectures that do not scale for modern big data analytics use cases. Amazon Redshift is the most widely used cloud data warehouse; it uses SQL to analyze structured and semi-structured data across data warehouses, operational databases, and data lakes. Customers use Amazon Redshift to run scalable data pipelines in an ELT pattern, orchestrated with Amazon Managed Workflows for Apache Airflow (MWAA), a highly scalable, fully managed, and extensible orchestration solution based on open-source Apache Airflow. In this session, learn how you can set up and execute end-to-end data pipelines using Amazon MWAA and Amazon Redshift to drive higher elasticity, reduce maintenance costs, and attain near-real-time decision making.

    Praveen Kumar, Analytics Solutions Architect, AWS
    Duration: 30mins

    Accelerate your time-to-insight with Amazon Redshift streaming (Level 200)
    Today, organizations recognize that data is an important asset. The ability to act on timely data sets data-driven organizations apart from their peers. However, gaining access to real-time data by acquiring new software and hiring specialized engineering teams requires significant investment. The new Amazon Redshift streaming ingestion feature (preview) aims to democratize streaming analytics at low cost, with SQL being the only skill required to set it up. In this session, we showcase how you can build a streaming analytics application with ease using the new feature. We also demonstrate a near-real-time logistics dashboard built using Amazon Managed Grafana that provides augmented intelligence, including situational awareness, for the logistics operations team. Find out how it connects to the Amazon Redshift cluster and uses the new streaming feature to load data from Amazon Kinesis Data Streams. Instructions on how to replicate this demo in your own AWS account will also be provided.

    Speaker: Paul Villena, Specialist Solutions Architect, Analytics Acceleration Lab, AWS
    Duration: 30mins

    Unlocking analytics capability by migrating your legacy database to an Amazon Redshift data warehouse (Level 300)
    Customers are seeking to drive more value from their data and it is becoming increasingly difficult using legacy applications. Amazon Redshift, a purpose-built large-scale analytics service, provides deeper and quicker insights from your data throughout the business. In this session, we share the common scenarios we see from customers looking to migrate to Amazon Redshift and the benefits unlocked through the modernization of their analytics platform. We showcase how you can quickly and easily migrate legacy databases to Amazon Redshift for your analytics workloads.

    Sean Beath, Specialist Solutions Architect, Analytics Acceleration Lab, AWS
    Randy Chng, Specialist Solutions Architect, Analytics Acceleration Lab, AWS

    Duration: 30mins

    Unleash your organization's data analytics potential with Amazon Redshift (Level 200)
    Amazon Redshift enables you to focus on delivering your business outcomes without worrying about managing infrastructure. You can break through data silos and run real-time and predictive analytics on all your data across your operational databases, data lake, data warehouse, and thousands of third-party data sets. With features like Amazon Redshift data sharing, you can extend the ease of use, performance, and cost benefits of Amazon Redshift in a single cluster to multi-cluster deployments while being able to share data. In this session, we showcase the art of the possible with Amazon Redshift and demonstrate how to accelerate your time to insight with fast, easy, and secure analytics at scale.

    Speaker: Rick Fraser, Analytics Solutions Architect, AWS
    Duration: 30mins

  • Unify your data: Accelerate insights track 2

    Unify your data: Accelerate insights track 2

    About the track

    Learn the approaches, tools, and frameworks to break down data silos, unify your data, and make data accessible to everyone who needs it, so they can seamlessly discover, access, and analyze it in a secure and governed way.

    Harness the power of data and choose the right analytics for your use case (Level 100)
    With the right data strategy, organizations can control their growing data, find insights from diverse data types, and make it available to the right people and systems. To achieve this, an organization must be data-driven. In this session, learn how you can put your data to work with the best of both data lakes and purpose-built data stores on AWS. Create new innovations that can help you scale analytics through every job role, process, and application in your organization.
    Speaker: Francis McGregor-Macdonald, Manager, Analytics Specialist Solutions Architects, AWS
    Duration: 30mins

    Simplify data integration with AWS Glue (Level 200)
    Organizations are breaking down data silos and integrating data from different systems and components to extract the maximum value from their data. AWS Glue is a serverless data integration service that makes it easy to discover, prepare, and combine data for analytics, machine learning, and application development. In this session, dive deep into new and existing features of AWS Glue. Learn how to build and run scalable extract, transform, and load (ETL) workflows, including simplifying your data integration architecture.
    Speaker: Nico Anandito, Analytics Specialist Solutions Architect, AWS
    Duration: 30mins

    Deliver insights faster with AWS Glue Interactive Sessions and continuous delivery (Level 300)
    Data integration is essential to building an efficient business intelligence framework. In this session, we introduce the new AWS Glue Interactive Sessions and how they enhance the data engineer and developer experience to deliver insights faster. We also dive deep into how AWS Glue Interactive Sessions can play a key role in building a scalable, fault-tolerant CI/CD process for your ETL jobs.

    Speaker: Niladri Bhattacharya, Senior Analytics Specialist Solutions Architect, AWS
    Duration: 30mins

    Migrating big data workloads to AWS with Amazon EMR (Level 200)
    Thousands of customers are modernizing their infrastructure for rapid innovation, higher performance, and greater reliability as part of their modern data strategy. In this session, we discuss the benefits of modernizing legacy Hadoop by migrating your big data workloads such as Apache Spark, Hive, Presto, and more to Amazon EMR. We dive deep into Amazon EMR's interoperability with other AWS analytics services and share the flexibility it offers you. Learn more about the Amazon EMR migration initiative and how it can help you with your migration journey.

    Speaker: Quim Bellmunt, Analytics Specialist Solutions Architect, AWS
    Duration: 30mins

    Embedding insights within your applications with Amazon QuickSight (Level 200)
    Every day, the people in your organization make decisions that affect your business. When they have the right information at the right time, they can make the choices that move the company in the right direction. This session covers the different approaches you can take to embed Amazon QuickSight within your application. We explain the use cases for embedding QuickSight dashboards, the benefits of embedding the entire QuickSight console, and how to embed the QuickSight Q search bar so users can ask questions of their data.

    Speaker: Olivia Carline, QuickSight Solutions Architect, AWS
    Duration: 30mins

    Build a unified view of your data with Amazon Athena (Level 200)
    Organizations today use data stores that best fit the applications they build. However, running analytics on data spread across different applications and workloads can be complex and time consuming. Traditionally, the user often has to manually access these systems, and integrate the data locally using various data analysis or exploration tools. Learn how Amazon Athena features such as federated query and user defined functions help in building a cloud-native data virtualization layer to enable data exploration across heterogeneous sources through a centralized interface.

    Speaker: Indrajit Ghosalkar, Solutions Architect, AWS
    Duration: 30mins
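As a concrete illustration of the federated query idea, here is a hedged sketch: a single Athena query joining an S3-backed table in the data lake catalog with a table behind a connector-registered catalog. All catalog, schema, table, and column names below are hypothetical.

```python
import re

# One query, two catalogs: "awsdatacatalog" is Athena's built-in Glue Data
# Catalog; "postgres_catalog" stands in for a data source connector
# registered for a relational database. Names are illustrative only.
FEDERATED_QUERY = """
SELECT o.order_id, o.order_total, c.segment
FROM awsdatacatalog.sales_lake.orders AS o
JOIN postgres_catalog.crm.customers AS c
  ON o.customer_id = c.customer_id
WHERE o.order_date >= DATE '2022-01-01'
"""

def catalogs_referenced(sql: str) -> set[str]:
    """Tiny helper: list the catalogs a query touches (first dotted part
    of each FROM/JOIN target)."""
    return set(re.findall(r"(?:FROM|JOIN)\s+(\w+)\.", sql, flags=re.IGNORECASE))

print(catalogs_referenced(FEDERATED_QUERY))
```

The point of the sketch is that both sources appear in one SQL statement; Athena routes each part of the query to the right catalog, so no local data copying or stitching is needed.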

    Delivering new business insights with SAP on AWS (Level 200)
    A narrow view of traditional cloud value can limit organizations from achieving significant business outcomes. The value of cloud adoption transcends IT. Cloud is transformational for our customers when they can derive new business insights from a combination of both SAP and non-SAP data. SAP on AWS customers can leverage AWS analytics, artificial intelligence, and machine learning capabilities to get near real-time insights that move the needle on business performance: improving operational efficiency, increasing supply chain efficiency, generating new revenue streams, and detecting and responding to business risks at a faster clip.

    Peter Perbellini, ERP on AWS Specialist Solutions Architecture Lead, APJ
    Allison Quinn, Senior Analytics Solutions Architect, AWS

    Duration: 30mins

  • Innovate with data and machine learning

    Innovate with data and machine learning

    About the track

    Discover the various machine learning integration services available on AWS that can help you build, deploy, and innovate at scale. We also focus on how AI services are applied to common use cases such as personalized recommendations, adding intelligence to your contact center, and improving customer experience.

    Putting machine learning in the hands of every builder with AWS databases, analytics, and ML (Level 200)
    At AWS, we aim to put machine learning (ML) in the hands of all builders. In this session, learn the different ways AWS is empowering builders with ML using services such as Amazon Aurora, Amazon Redshift, Amazon Neptune, and Amazon QuickSight to build new experiences and reimagine existing processes.

    Speaker: Tom McMeekin, Enterprise Solutions Architect, AWS
    Duration: 30mins

    Analyze data and detect transaction anomalies using Amazon Athena ML (Level 200)
    Modern day organizations store business data in various data sources with multiple cross functional teams operating on the data. Amazon Athena helps developers, architects, analysts, and data engineers to analyze various data forms with standard SQL. In this session, learn how to use Amazon Athena to run machine learning inferences with SQL commands on business transactions to gain insights on the data and identify anomalous transactions.

    Speaker: Hariharan Suresh, Senior Solutions Architect, AWS
    Duration: 30mins
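The mechanism this session covers can be sketched in a few lines of SQL: Athena's `USING EXTERNAL FUNCTION` clause declares an inline function bound to a SageMaker inference endpoint, which can then be called like any other SQL function. The endpoint name, table, and columns below are hypothetical placeholders.

```python
# A minimal sketch of invoking ML inference from Athena SQL. The endpoint
# ('rcf-transactions-endpoint'), table, and columns are assumptions for
# illustration, not names from the session.
ANOMALY_QUERY = """
USING EXTERNAL FUNCTION detect_anomaly(amount DOUBLE)
    RETURNS DOUBLE
    SAGEMAKER 'rcf-transactions-endpoint'
SELECT txn_id, amount, detect_anomaly(amount) AS anomaly_score
FROM transactions
ORDER BY anomaly_score DESC
LIMIT 20
"""

print(ANOMALY_QUERY.strip())
```

Every row's `amount` is sent to the endpoint for scoring, so analysts get model-driven anomaly ranking without leaving SQL.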

    Personalize customer engagements with marketing automation (Level 200)
    When it comes to customer communications, it is not surprising that personalization is the best way to ensure engagement with customers in the long run. Customers are more likely to give their attention to content that is tailored to their needs. In this session, we showcase how to use Amazon Pinpoint journeys to provide a personalized, multi-step customer experience based on audience attributes and behaviors, and how to use Amazon Personalize to ensure that the communication content is always specific and personalized for the recipient.

    Speaker: Pierre Semaan, Head of SMB Solutions, AWS
    Duration: 30mins

    Go beyond insights to predictive analytics with Amazon Redshift ML and Amazon SageMaker Canvas (Level 200)
    Organizations are managing more data than ever before, and data use is only continuing to expand. Harnessing data to reinvent a business, while challenging, is imperative to staying relevant now and in the future. To go beyond insights to predictions, Amazon Redshift ML brings machine learning to the data in your data warehouse, with familiar SQL commands. In this session, learn how to create machine learning models with Amazon Redshift and Amazon SageMaker Canvas - a visual, no-code ML capability for data analysts - and make predictions from your data in Amazon Redshift without moving data or learning a new skill.

    Speaker: Mary Law, Senior Manager Analytics, APJ Acceleration Lab, AWS
    Duration: 30mins
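To make the "ML with familiar SQL" idea concrete, here is a hedged sketch of the Redshift ML flow: `CREATE MODEL` trains on the result of a query, and the generated SQL function is then used for predictions. The model, table, column, role, and bucket names are all hypothetical.

```python
# Training step: Redshift ML exports the query result, trains a model
# (via SageMaker behind the scenes), and registers a prediction function.
# All identifiers below are placeholders, not names from the session.
CREATE_MODEL_SQL = """
CREATE MODEL customer_churn
FROM (SELECT age, tenure_months, monthly_spend, churned
      FROM customer_activity)
TARGET churned
FUNCTION predict_churn
IAM_ROLE 'arn:aws:iam::123456789012:role/redshift-ml'
SETTINGS (S3_BUCKET 'redshift-ml-scratch')
"""

# Prediction step: the registered function is called like any SQL function,
# so no data leaves the warehouse to score it.
PREDICT_SQL = """
SELECT customer_id,
       predict_churn(age, tenure_months, monthly_spend) AS will_churn
FROM customer_activity
"""

print(CREATE_MODEL_SQL.strip())
print(PREDICT_SQL.strip())
```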

    Build an engaging audience experience with media analytics (Level 200)
    Direct-to-consumer is a business model whose success relies on a rich viewer experience. Capturing the viewer experience while delivering valuable media content is important, and tracking quality of service helps providers improve their audience experience. In this session, we review an approach to gain insights into customers' video streaming experience from both client and server perspectives using AWS real-time streaming services. In addition, we explore options to build a dynamic viewer experience using AWS AI services.

    Vikram Shitole, Prototyping Architect, AISPL
    Jeeri Deka, Associate Solutions Architect, AWS

    Duration: 30mins

    Improving customer experience using Amazon Connect with data analytics and ML (Level 200)
    Driving efficiency across customer interactions from the call center requires analysis of the data generated from conversations. In this session, we dive deep into how Amazon Connect and the wider AWS ecosystem of data and analytics services can deliver better customer experiences and valuable insights. We showcase how you can create dashboards to analyze call quality, perform advanced analytics by tapping into contact center data, and empower agents by ingesting data to solve customer concerns.

    Speaker: Sumit Patel, Solutions Architect, AWS
    Duration: 30mins

    Simplify customer purchase intent predictions with analytics and ML (Level 200)
    Companies are integrating AI/ML solutions into their business to stay ahead of the competition. However, machine learning can be hard and often requires a specialized skill set. It begins with collecting and preparing the data, followed by building and training machine learning models before deploying them. Even choosing an algorithm to build the model can be tough. Which algorithm or machine learning model should you pick? How can you reliably figure out which model will perform best for your business problem? How do you tune hyperparameters to get the best out of the model? In this session, we explain how to simplify the machine learning lifecycle for purchase intent prediction using Amazon SageMaker Autopilot combined with AWS analytics services.

    Kamal Machanda, Solutions Architect, AISPL
    Sureshkumar K V, Prototyping Architect, AISPL

    Duration: 30mins

  • Closing


    Accelerate rapid innovation with data (Level 200)
    To make decisions quickly, organizations want to store any amount of data in open formats, break down disconnected data silos, empower people to run analytics or machine learning using their preferred tool or technique, and manage who has access to specific pieces of data with the proper security and data governance controls. This session provides a recap of the day's sessions and addresses some of the commonly asked questions related to data analytics and machine learning. Learn how AWS is freeing organizations and builders to solve real-world business problems in any industry and innovate with confidence. Uncover how technologies like machine learning and analytics can unlock opportunities that were previously too difficult or impossible, enabling organizations with insights, transforming industries, and reshaping how customers consume and engage with products and services.

  • Builders Zone

    Builders Zone

    About the zone
    Dive deep into technical stacks, learn how AWS experts have helped solve real-world problems for customers, try out these demos with step-by-step guides, and walk away with the ability to implement these or similar solutions in your own organization.

    Jumpstart your data product with embedded analytics and data lake (Level 200)
    Organizations look to provide their application users more value from their data. Embedded analytics provides a quick way to create feature-rich data products with advanced analytics. A data lake helps scale data analytics workloads to accommodate the growing number of data providers, consumers, and data volume. This session demonstrates how to build a multi-tenant analytics dashboard with Amazon QuickSight, embed it in a web application, and scale the analytics workload with a data lake.

    AWS services: Amazon QuickSight, Amazon Athena, AWS Lake Formation, Amazon S3, Amazon API Gateway and AWS Lambda
    Speaker: Frank Tan, Solutions Architect, AWS

    Real-time analytics at the edge and in the cloud (Level 200)
    Machine failures can adversely impact the operational efficiency of plants and factories, but identifying critical failures and examining physical parameters poses a challenge. To improve the fault detection process, it is crucial to monitor production systems and collect performance data in real time. In this session, we discuss and demonstrate various options available to securely connect equipment and collect its data to gain real-time insights at the edge and in the cloud using the AWS IoT suite of services and analytics services. In addition, we demo a use case where data from multiple pieces of equipment is collected and critical parameters are monitored in real time at the edge. Furthermore, we showcase a centralized dashboard with consolidated data from multiple sites.

    AWS services: AWS IoT SiteWise, AWS IoT Greengrass, Amazon S3, Amazon EC2, Amazon Timestream

    Santhosh Urukonda, Prototyping Architect, AISPL
    Sakthi Srinivasan, Prototyping Manager, AISPL

    Automating claims adjudication workflow using Amazon Comprehend Medical and Amazon Textract (Level 200)
    When a medical claim is submitted, the insurance provider must process the claim to determine the correct financial responsibility of the insurance provider and the patient. This process is broadly known as claims adjudication. It involves creating a claims processing workflow that checks each claim for authenticity, correctness, and validity based on coverage. Some of the steps in this workflow involve unstructured data, which requires manual effort to extract the information buried in unstructured notes. To process volumes of claims in a cost-effective and scalable manner, healthcare payers are increasingly looking at machine learning to reduce dependency on humans and rely on automation as much as possible. Additionally, analyzing and interpreting health claim data is powerful in driving improvements in population health to address issues related to cost, quality, and outcomes. According to a CDC report, analyzing claim documents can help identify behaviors that would help prevent or delay the development of a medical condition. AWS provides a comprehensive list of machine learning and analytics services that allow developers, irrespective of their background, to start integrating machine learning and analytics technology into their applications. In this session, we demonstrate how to use two AWS AI services, Amazon Textract and Amazon Comprehend Medical, to automate the claims adjudication workflow and then run analytics on top, extracting entities using Amazon Athena.

    AWS services: Amazon Textract, Amazon Comprehend Medical and Amazon Athena
    Speaker: Joinal Ahmed, Associate Solutions Architect, AISPL

    Build an organization’s financial forecast (Level 200)
    Today, businesses from fast-paced startups to large enterprises and traditional companies generate huge amounts of sequential data points per unit of time. Organizations need a mechanism to predict patterns and future time series data by relating historical data to other variables. Machine learning can be used to forecast any time series data and serve use cases such as retail demand, manufacturing demand, travel demand, revenue and budget planning, IT capacity, logistics, price prediction, web traffic, and more. In this session, we look at how developers can build an organization's financial metrics (sales, expenses, profits, etc.) forecasting solution with the help of Amazon Forecast and other AWS technologies. Learn how developers with no prior ML experience can build a sophisticated forecasting model that uses machine learning to combine time series data with additional data variables.

    AWS services: Amazon Forecast, AWS Lambda, AWS Step Functions, Amazon S3, Amazon Athena, AWS Glue, Amazon QuickSight, Amazon API Gateway, Amazon Cognito, AWS Security Token Service, AWS Identity and Access Management (IAM), Amazon Simple Notification Service (Amazon SNS)
    Speaker: Darshit Vora, Startup Solutions Architect, AISPL

    Improving customer experience with Amazon Lex Automated Chatbot Designer (Level 200)
    By leveraging out-of-the-box machine learning services, learn how AWS is improving customer support functions and call centers by automating the transcription of recorded conversations into text log files, analyzing those logs to generate accurate conversation flows, and creating engaging chatbots with realistic interactions within a day. Generating these conversation flows helps increase customer satisfaction, automate recurring calls previously handled by employees, and provide the quickest resolution time for frequent customer issues.

    AWS services: Amazon Lex, Amazon Transcribe

    Charles Crouspeyre, Prototyping Lead, ASEAN, AWS
    Mirash Gjolaj, Prototyping Architect, AWS

    Detecting power theft using Amazon OpenSearch and smart meters data in near real-time (Level 200)
    Power distribution networks are hundreds of kilometers long, and power theft is a serious issue faced by power supply companies. It is difficult and impractical to physically inspect networks to detect power theft, yet timely detection can save substantial amounts of money. In this session, we discuss the use of smart meter data to effectively detect anomalous consumption behavior using Amazon OpenSearch. We also showcase how Amazon Kinesis is used to ingest and pre-process the data.

    AWS services: Amazon OpenSearch, Amazon Kinesis Data Firehose, Amazon Kinesis Data Streams, Amazon S3 and Amazon Simple Notification Service (Amazon SNS), AWS Glue
    Speaker: Ajinkya Chavan, Senior STAM, Analytics, AISPL

  •  Korean
  •  Japanese
  •  Bahasa Indonesia

Session levels designed for you

Level 100

Sessions are focused on providing an overview of AWS services and features, with the assumption that attendees are new to the topic.

Level 200

Sessions are focused on providing best practices, details of service features and demos with the assumption that attendees have introductory knowledge of the topics.

Level 300

Sessions dive deeper into the selected topic. Presenters assume that the audience has some familiarity with the topic, but may or may not have direct experience implementing a similar solution.

Conference timings

Featured speakers

Dean Samuels, Chief Technologist, APJ, AWS

Dean Samuels
Chief Technologist, APJ, AWS


Kris Howard, Head of Developer Relations, APJ, AWS

Kris Howard
Head of Dev Relations, APJ, AWS


Olivier Klein, Chief Technologist, APJ, AWS

Olivier Klein
Chief Technologist, APJ, AWS


Featured customer speakers

David R. Hardoon, Chief Data & AI Officer, UnionBank of the Philippines

David R. Hardoon
Chief Data & AI Officer, UnionBank of the Philippines


Shivani Venkatesh, Head of Insights, Analytics & AI CoE, RBL Bank

Shivani Venkatesh
Head of Insights, Analytics & AI CoE, RBL Bank


Teoh Kuang Yee, Senior Manager, Platform Development and Business Improvement, Daikin Malaysia

Teoh Kuang Yee
Senior Manager, Platform Development and Business Improvement, Daikin Malaysia


Learn more about Data on AWS

Leader in Gartner Magic Quadrant for Cloud Database Management Systems

3x faster with Amazon EMR than standard Apache Spark

200,000+ data lakes run on AWS

3x better price performance than other cloud data warehouses

550,000+ databases migrated to AWS

100,000+ customers use AWS for machine learning

200+ fully featured services for a wide range of technologies, industries, and use cases

99.999999999% of data durability

Frequently Asked Questions

Start building your skills with AWS Free Tier

Get familiar with AWS products and services by signing up for an AWS account and enjoy free offers for Amazon EC2, Amazon S3, Amazon Redshift, and over 100 other AWS services.
View AWS Free Tier Details »

Olivier is a hands-on technologist with more than 10 years of experience in the industry and has been helping customers build resilient, scalable, secure, and cost-effective applications and create innovative and data-driven business models. He advises on how emerging technologies in the AI, ML, and IoT spaces can help create new products, make existing processes more efficient, provide overall business insights, and leverage new engagement channels for consumers.


Kristine has twenty years of experience helping companies build as a software engineer, business analyst, and team director. She is a frequent speaker at tech events and meetups including AWS Summits and TEDx Melbourne. Kristine is dedicated to meeting and working with developers across the region, and now heads up Developer Relations for AWS in APJ.


Dean comes from an IT infrastructure background and has extensive experience in infrastructure virtualization and automation. He has been with AWS for the past ten years and has had the opportunity to work with businesses of all sizes and industries. Dean is committed to helping customers design, implement, and optimize their application environments for the public cloud to allow them to become more innovative, agile, and secure.