It's easy to get started with Amazon Kinesis. This page provides a compilation of the top resources so you can launch your streaming application quickly.


Amazon Kinesis Video Streams makes it easy to securely stream video from connected devices to AWS for analytics, machine learning (ML), and other processing. In this session, we introduce Kinesis Video Streams and its key features, and review common use cases including smart home, smart city, industrial automation, and computer vision. We also discuss how you can use the Kinesis Video Streams parser library to work with the output of video streams to power popular deep learning frameworks. Lastly, Abeja, a leading Japanese artificial intelligence (AI) solutions provider, talks about how they built a deep-learning system for the retail industry using Kinesis Video Streams to deliver a better shopping experience.

Watch session recording | Download presentation

Amazon Kinesis Analytics offers a built-in machine learning algorithm that you can use to easily detect anomalies in your VPC network traffic and improve security monitoring. Join us for an interactive discussion on how to stream your VPC Flow Logs to Amazon Kinesis Streams and identify anomalies using Kinesis Analytics.
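
As a rough sketch of the first step this session describes, the snippet below pushes VPC Flow Log lines onto a Kinesis stream for a downstream Analytics application to scan. The stream name and the assumption that lines use the default Flow Log field order are illustrative only, not part of the session material.

```python
import json
import boto3

kinesis = boto3.client("kinesis")

def send_flow_log_record(raw_line: str, stream_name: str = "vpc-flow-logs") -> None:
    """Wrap one default-format VPC Flow Log line as JSON and put it on the stream."""
    fields = raw_line.split()
    record = {
        "srcaddr": fields[3],   # field positions assume the default Flow Log format
        "dstaddr": fields[4],
        "dstport": fields[6],
        "bytes": int(fields[9]),
    }
    kinesis.put_record(
        StreamName=stream_name,                    # assumed stream name
        Data=json.dumps(record).encode("utf-8"),
        PartitionKey=record["srcaddr"],            # spread records across shards by source IP
    )
```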

Watch session recording | Download presentation

Thousands of services work in concert to deliver millions of hours of video streams to Netflix customers every day. These applications vary in size, function, and technology, but they all make use of the Netflix network to communicate. Understanding the interactions between these services is a daunting challenge, both because of the sheer volume of traffic and the dynamic nature of deployments. In this session, we first discuss why Netflix chose Kinesis Streams to address these challenges at scale. We then dive deep into how Netflix uses Kinesis Streams to enrich network traffic logs and identify usage patterns in real time. Lastly, we cover how Netflix uses this system to build comprehensive dependency maps, increase network efficiency, and improve failure resiliency. From this session, you'll learn how to build a real-time application monitoring system using network traffic logs and get real-time, actionable insights.

Watch session recording | Download presentation

Amazon Kinesis makes it easy to collect, process, and analyze real-time streaming data so you can get timely insights and react quickly to new information. In this session, we present an end-to-end streaming data solution using Kinesis Streams for data ingestion, Kinesis Analytics for real-time processing, and Kinesis Firehose for persistence. We review in detail how to write SQL queries using streaming data and discuss best practices to optimize and monitor your Kinesis Analytics applications. Lastly, we discuss how to estimate the cost of the entire system.
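
As a minimal sketch of the ingestion layer in such a solution, the snippet below batches events into a Kinesis stream with PutRecords. The stream name and event shape are assumptions; the Analytics SQL and Firehose delivery steps would be configured separately.

```python
import json
import boto3

kinesis = boto3.client("kinesis")

def put_batch(events, stream_name="clickstream-demo"):
    """Send events to the stream in PutRecords batches of up to 500 (the API limit)."""
    entries = [
        {
            "Data": json.dumps(event).encode("utf-8"),
            "PartitionKey": str(event.get("user_id", "anonymous")),
        }
        for event in events
    ]
    for start in range(0, len(entries), 500):
        response = kinesis.put_records(
            StreamName=stream_name, Records=entries[start:start + 500]
        )
        if response["FailedRecordCount"]:
            # A production producer would retry only the failed entries.
            print(f"{response['FailedRecordCount']} records failed in this batch")
```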

Watch session recording | Download presentation

Want to ramp up your knowledge of AWS big data web services and launch your first big data application on the cloud? We walk you through simplifying big data processing as a data bus comprising ingest, store, process, and visualize. You build a big data application using AWS managed services, including Amazon Athena, Amazon Kinesis, Amazon DynamoDB, and Amazon S3. Along the way, we review architecture design patterns for big data applications and give you access to a take-home lab so that you can rebuild and customize the application yourself. You should bring your own laptop and have some familiarity with AWS services to get the most from this session.

Download presentation »

In recent years, there has been an explosive growth in the number of connected devices and real-time data sources. Because of this, data is being produced continuously and its production rate is accelerating. Businesses can no longer wait for hours or days to use this data. To gain the most valuable insights, they must use this data immediately so they can react quickly to new information. In this workshop, you learn how to take advantage of streaming data sources to analyze and react in near real-time. You are presented with several requirements for a real-world streaming data scenario and you're tasked with creating a solution that successfully satisfies the requirements using services such as Amazon Kinesis, AWS Lambda and Amazon SNS.
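
One common shape for such a solution, sketched below under assumed names, is an AWS Lambda function triggered by a Kinesis stream that publishes an Amazon SNS alert when a record crosses a threshold. The topic ARN, record fields, and threshold here are hypothetical, not part of the workshop material.

```python
import base64
import json
import os

import boto3

sns = boto3.client("sns")
# Hypothetical topic; in practice this would come from the function's environment.
ALERT_TOPIC_ARN = os.environ.get(
    "ALERT_TOPIC_ARN", "arn:aws:sns:us-east-1:123456789012:stream-alerts"
)

def handler(event, context):
    """Lambda entry point: Kinesis records arrive base64-encoded in event['Records']."""
    for record in event["Records"]:
        payload = json.loads(base64.b64decode(record["kinesis"]["data"]))
        if payload.get("temperature", 0) > 100:   # assumed field and threshold
            sns.publish(
                TopicArn=ALERT_TOPIC_ARN,
                Subject="Streaming threshold exceeded",
                Message=json.dumps(payload),
            )
    return {"processed": len(event["Records"])}
```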

Download presentation »

Learn how to architect a data lake where different teams within your organization can publish and consume data in a self-service manner. As organizations aim to become more data-driven, data engineering teams have to build architectures that cater to the needs of diverse users, from developers to business analysts to data scientists. Each of these user groups employs different tools, has different data needs, and accesses data in different ways.

In this talk, we will dive deep into assembling a data lake using Amazon S3, Amazon Kinesis, Amazon Athena, Amazon EMR, and AWS Glue. The session will feature Mohit Rao, Architect and Integration lead at Atlassian, the maker of products such as JIRA, Confluence, and Stride. First, we will look at a couple of common architectures for building a data lake. Then we will show how Atlassian built a self-service data lake where any team within the company can publish a dataset to be consumed by a broad set of users.

Watch session recording | Download presentation

Today, many architects and developers are looking to build solutions that integrate batch and real-time data processing and deliver the best of both approaches. Lambda architecture (not to be confused with the AWS Lambda service) is a design pattern that leverages both batch and real-time processing within a single solution to meet the latency, accuracy, and throughput requirements of big data use cases. Come join us for a discussion on how to implement Lambda architecture (batch, speed, and serving layers) and best practices for data processing, loading, and performance tuning.

Watch session recording | Download presentation

Reducing the time to get actionable insights from data is important to all businesses, and customers who employ batch data analytics tools are exploring the benefits of streaming analytics. Learn best practices to extend your architecture from data warehouses and databases to real-time solutions. Learn how to use Amazon Kinesis to get real-time data insights and integrate them with Amazon Aurora, Amazon RDS, Amazon Redshift, and Amazon S3. The Amazon Flex team describes how they used streaming analytics in the Amazon Flex mobile app, which Amazon delivery drivers use to deliver millions of packages each month on time. They discuss the architecture that enabled the move from a batch processing system to a real-time system, how they overcame the challenges of migrating existing batch data to streaming data, and how to benefit from real-time analytics.

Watch session recording | Download presentation

To win in the marketplace and provide differentiated customer experiences, businesses need to be able to use live data in real time to facilitate fast decision making. In this session, you learn common streaming data processing use cases and architectures. First, we give an overview of streaming data and AWS streaming data capabilities. Next, we look at a few customer examples and their real-time streaming applications. Finally, we walk through common architectures and design patterns of top streaming data use cases.

Watch session recording | Download presentation

In this session, learn how Cox Automotive is using Splunk Cloud for real-time visibility into its AWS and hybrid environments to achieve near-instantaneous MTTI, reduce auction incidents by 90%, and proactively predict outages. We also introduce a highly anticipated capability that allows you to ingest, transform, and analyze data in real time using Splunk and Amazon Kinesis Firehose to gain valuable insights from your cloud resources. It's now quicker and easier than ever to gain access to analytics-driven infrastructure monitoring using Splunk Enterprise and Splunk Cloud.

Watch session recording | Download presentation


Log analytics is a common big data use case that allows you to analyze log data from websites, mobile devices, servers, sensors, and more for a wide variety of applications such as digital marketing, application monitoring, fraud detection, ad tech, gaming, and IoT. Moving your log analytics to real time can speed up your time to information, allowing you to get insights in seconds or minutes instead of hours or days. In this session, you will learn how to ingest and deliver logs with no infrastructure using Amazon Kinesis Data Firehose. We will show how Kinesis Data Analytics can be used to process log data in real time to build responsive analytics. Finally, we will show how to use Amazon Elasticsearch Service to interactively query and visualize your log data.

Learning Objectives:

  1. Understand how to easily build an end-to-end, real-time log analytics solution.
  2. Get an overview of collecting and processing data in real time using Amazon Kinesis.
  3. Learn how to interactively query and visualize your log data using Amazon Elasticsearch Service.
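
As a minimal sketch of the ingestion step in the solution described above, the snippet below sends individual log lines to a Firehose delivery stream. The stream name is an assumption, and the stream would separately be configured to deliver to Amazon Elasticsearch Service or Amazon S3.

```python
import boto3

firehose = boto3.client("firehose")

def ship_log_line(line: str, stream_name: str = "web-log-ingest") -> None:
    """Send one log line to an assumed Firehose delivery stream."""
    # Firehose concatenates records, so keep a trailing newline as the delimiter.
    firehose.put_record(
        DeliveryStreamName=stream_name,
        Record={"Data": (line.rstrip("\n") + "\n").encode("utf-8")},
    )
```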

Watch recording | Download presentation

Most applications comprise dozens of services and hundreds of servers. These applications vary in size, function, and technology, but they all communicate with each other inside of your Amazon Virtual Private Cloud (VPC). Understanding the interactions between these applications can be a challenge, both because of the volume of traffic and the dynamic nature of deployments. In this webinar, we'll discuss how Amazon Kinesis and Amazon CloudWatch can help address these challenges at scale. We'll discuss how to use CloudWatch Logs and Kinesis Data Streams to capture and enrich network traffic logs and identify usage patterns in real time.

Learning Objectives:

  • Understand how to build a real-time application monitoring system using network traffic logs.
  • Learn how to enrich and aggregate network flow logs data using Amazon Kinesis.
  • Learn how to visualize and analyze your network data to gain actionable insights.
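
One way to wire the capture step described in this webinar is a CloudWatch Logs subscription filter that forwards flow log events to a Kinesis data stream. The sketch below uses hypothetical log group, stream, and IAM role names as placeholders.

```python
import boto3

logs = boto3.client("logs")

# Forward every event in an assumed flow log group to an assumed Kinesis stream.
logs.put_subscription_filter(
    logGroupName="/vpc/flow-logs",                                   # assumed log group
    filterName="flow-logs-to-kinesis",
    filterPattern="",                                                # empty pattern matches all events
    destinationArn="arn:aws:kinesis:us-east-1:123456789012:stream/network-traffic",
    roleArn="arn:aws:iam::123456789012:role/CWLtoKinesisRole",       # role allowing CloudWatch Logs to write to the stream
)
```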

Watch recording | Download presentation

Data lakes enable your employees across the organization to access and analyze massive amounts of unstructured and structured data from disparate data sources, many of which generate data continuously and rapidly. Making this data available in a timely fashion for analysis requires a streaming solution that can durably and cost-effectively ingest this data into your data lake. Amazon Kinesis Data Firehose is a fully managed service that makes it easy to prepare and load streaming data into AWS. In this tech talk, we will provide an overview of Kinesis Data Firehose and dive deep into how you can use the service to collect, transform, batch, compress, and load real-time streaming data into your Amazon S3 data lakes.

Learning Objectives:

  • Understand key requirements for collecting, preparing, and loading streaming data into data lakes.
  • Get an overview of transmitting data using Kinesis Data Firehose.
  • Learn how to perform data transformations with Kinesis Data Firehose.
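
A hedged sketch of the loading step covered in this talk: creating a delivery stream that batches, compresses, and lands records in an S3 data lake bucket. The bucket, role, prefix, and buffering values below are placeholders, not recommendations.

```python
import boto3

firehose = boto3.client("firehose")

firehose.create_delivery_stream(
    DeliveryStreamName="datalake-ingest",
    DeliveryStreamType="DirectPut",          # producers call PutRecord directly
    ExtendedS3DestinationConfiguration={
        "RoleARN": "arn:aws:iam::123456789012:role/firehose-delivery-role",  # assumed role
        "BucketARN": "arn:aws:s3:::example-data-lake",                        # assumed bucket
        "Prefix": "raw/",
        "BufferingHints": {"SizeInMBs": 64, "IntervalInSeconds": 300},        # batch before each S3 write
        "CompressionFormat": "GZIP",                                          # compress objects on delivery
    },
)
```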

Watch recording | Download presentation

You can use the following sample code and tools to quickly build, test, and deploy your analytics applications with Amazon Kinesis.

  • Use our sample IoT analytics code to build your application. No need to start from scratch. Download here »
  • Test your Kinesis application using the Kinesis Data Generator. Learn more »
  • Try a hands-on tutorial to build a log analytics solution using Kinesis. Check it out »
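
If you prefer a scripted smoke test alongside the Kinesis Data Generator, the hypothetical sketch below writes one sample record to a stream and reads it back from the first shard. The stream name is a placeholder, and the stream must already exist.

```python
import json
import boto3

kinesis = boto3.client("kinesis")
stream = "my-test-stream"   # assumed name; create the stream before running

# Write one sample record...
kinesis.put_record(
    StreamName=stream,
    Data=json.dumps({"hello": "kinesis"}).encode("utf-8"),
    PartitionKey="smoke-test",
)

# ...then read from the start of the first shard to confirm it arrived.
shard_id = kinesis.describe_stream(StreamName=stream)["StreamDescription"]["Shards"][0]["ShardId"]
iterator = kinesis.get_shard_iterator(
    StreamName=stream, ShardId=shard_id, ShardIteratorType="TRIM_HORIZON"
)["ShardIterator"]
print(kinesis.get_records(ShardIterator=iterator, Limit=5)["Records"])
```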