The best way to get started with Kinesis Data Analytics is to get hands-on experience by building a sample application. Simply go to the Kinesis Data Analytics console and create a new application. Select the demo stream we provide as input, pick a template, and edit the SQL query. You can then view the results right there in the console, or load the output into Amazon Elasticsearch Service and visualize it using Kibana. Within a few minutes, you will be able to deploy a complete streaming data application.
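If you want a feel for the template SQL before opening the console, here is a minimal sketch of a tumbling-window aggregation. It assumes the demo stream's stock ticker schema (a TICKER_SYMBOL column) and the default in-application input stream name SOURCE_SQL_STREAM_001, so treat the names as placeholders rather than the exact template code.

-- Count records per ticker over 10-second tumbling windows.
-- TICKER_SYMBOL and SOURCE_SQL_STREAM_001 are assumptions based on the
-- console demo stream; adjust them to match your application.
CREATE OR REPLACE STREAM "DESTINATION_SQL_STREAM" (
    ticker_symbol VARCHAR(4),
    ticker_count  INTEGER
);

CREATE OR REPLACE PUMP "STREAM_PUMP" AS
    INSERT INTO "DESTINATION_SQL_STREAM"
    SELECT STREAM
        ticker_symbol,
        COUNT(*) AS ticker_count
    FROM "SOURCE_SQL_STREAM_001"
    GROUP BY
        ticker_symbol,
        STEP("SOURCE_SQL_STREAM_001".ROWTIME BY INTERVAL '10' SECOND);

Rows emitted into DESTINATION_SQL_STREAM appear in the console's real-time results view, and the same output can be delivered to Amazon Elasticsearch Service for visualization in Kibana.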



It's easy to get started with Kinesis Data Analytics. The how-to videos make it even easier by providing technical deep dives into common use cases and stream processing workflows. They also provide an in-depth overview of the key features so you can get your job done. Follow the links below to watch the recordings:


The developer guide provides walkthroughs of example streaming data applications. Some of these examples also contain step-by-step instructions so you can try them out and gain hands-on experience.

With these example walkthroughs you will:

  • Learn what you can build with Amazon Kinesis Data Analytics
  • Gain hands-on experience launching streaming applications
  • Get baseline SQL code that you can build on
Example walkthroughs:

  • Preprocessing streams
  • Basic analytics
  • Advanced analytics
  • Post-processing in-application streams
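To give a flavor of the baseline SQL those walkthroughs start from, here is a minimal preprocessing sketch: it filters and reshapes the raw input into a cleaned in-application stream before any analytics run. The column names are assumptions based on the console's demo stream, not code taken from the developer guide.

-- Hypothetical preprocessing step: drop rows with a non-positive price
-- and normalize the sector name before downstream analytics.
CREATE OR REPLACE STREAM "CLEANED_SQL_STREAM" (
    ticker_symbol VARCHAR(4),
    sector        VARCHAR(16),
    price         REAL
);

CREATE OR REPLACE PUMP "CLEANING_PUMP" AS
    INSERT INTO "CLEANED_SQL_STREAM"
    SELECT STREAM
        ticker_symbol,
        UPPER(sector) AS sector,
        price
    FROM "SOURCE_SQL_STREAM_001"
    WHERE price > 0;

Downstream queries can then read from CLEANED_SQL_STREAM instead of the raw input, which keeps the analytics SQL free of data-cleanup logic.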

by Jeff Barr | on 11 AUG 2016

As you may know, Amazon Kinesis greatly simplifies the process of working with real-time streaming data in the AWS Cloud. Instead of setting up and running your own processing and short-term storage infrastructure, you simply create a Kinesis Data Stream or Kinesis Data Firehose, arrange to pump data into it, and then build an application to process or analyze it.

While it is relatively easy to build streaming data solutions using Kinesis Data Streams and Kinesis Data Firehose, we want to make it even easier. We want you, whether you are a procedural developer, a data scientist, or a SQL developer, to be able to process voluminous clickstreams from web applications, telemetry and sensor reports from connected devices, server logs, and more using a standard query language, all in real time!
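As one concrete illustration (a sketch, not code from the post), the continuous filter below emits only server errors from a hypothetical web log stream; the PAGE_URL and STATUS_CODE columns are assumed, and a clickstream or sensor feed would be queried the same way.

-- Illustrative continuous filter: pass through only 5xx responses from a
-- hypothetical web log stream with PAGE_URL and STATUS_CODE columns.
CREATE OR REPLACE STREAM "ERROR_SQL_STREAM" (
    page_url    VARCHAR(256),
    status_code INTEGER
);

CREATE OR REPLACE PUMP "ERROR_PUMP" AS
    INSERT INTO "ERROR_SQL_STREAM"
    SELECT STREAM page_url, status_code
    FROM "SOURCE_SQL_STREAM_001"
    WHERE status_code >= 500;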


by Ryan Nienhuis | on 11 AUG 2016

This is the first of two AWS Big Data blog posts on Writing SQL on Streaming Data with Amazon Kinesis Data Analytics. In this post, I provide an overview of streaming data and key concepts like the basics of streaming SQL, and complete a walkthrough using a simple example. In the next post, I will cover more advanced stream processing concepts using Amazon Kinesis Data Analytics.
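To hint at what such streaming SQL looks like, here is a sketch of a sliding-window aggregate in the Kinesis Data Analytics SQL dialect; the TICKER_SYMBOL and PRICE columns come from the console's demo stream and are assumptions here, not the example used in the post.

-- Sketch of a sliding window: average price per ticker over the trailing
-- minute of data, re-evaluated as each new record arrives.
CREATE OR REPLACE STREAM "AVG_PRICE_STREAM" (
    ticker_symbol VARCHAR(4),
    avg_price     DOUBLE
);

CREATE OR REPLACE PUMP "AVG_PRICE_PUMP" AS
    INSERT INTO "AVG_PRICE_STREAM"
    SELECT STREAM
        ticker_symbol,
        AVG(price) OVER W AS avg_price
    FROM "SOURCE_SQL_STREAM_001"
    WINDOW W AS (
        PARTITION BY ticker_symbol
        RANGE INTERVAL '1' MINUTE PRECEDING
    );

Unlike a tumbling window, this trailing one-minute range produces an updated average with every incoming record rather than once per fixed interval.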

Most organizations use batch data processing to perform their analytics in daily or hourly intervals to inform their business decisions and improve their customer experiences. However, you can derive significantly more value from your data if you are able to process and react in real time. Indeed, the value of insights in your data can decline rapidly over time – the faster you react, the better.
