Amazon Web Services

This video demonstrates how to build ETL pipelines on AWS using various services. It covers three main scenarios: ingesting data from an SQL database into a data lake using AWS Data Pipeline, streaming IoT device data with Kinesis Firehose, and automating downloads from AWS Data Exchange. The presenter walks through the setup process for each pipeline, showcasing the AWS console interfaces and configuration options. Key services highlighted include AWS Data Pipeline, Kinesis Firehose, and the AWS Data Exchange Subscriber Coordinator. The video provides practical examples of how to leverage AWS tools to efficiently move and process data for analytics and other downstream uses.
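
As a rough illustration of the streaming scenario the video covers, the sketch below shows how an IoT producer might push JSON readings into a Kinesis Data Firehose delivery stream using boto3. The stream name, region, and record fields are placeholder assumptions; the video itself configures the pipeline through the AWS console rather than code.

# Minimal sketch: sending simulated IoT readings to a Kinesis Data Firehose
# delivery stream (stream name, region, and record fields are hypothetical).
import json
import random
import time

import boto3

firehose = boto3.client("firehose", region_name="us-east-1")
DELIVERY_STREAM = "iot-sensor-stream"  # hypothetical delivery stream name

def send_reading(device_id: str) -> None:
    record = {
        "device_id": device_id,
        "temperature": round(random.uniform(18.0, 30.0), 2),
        "timestamp": int(time.time()),
    }
    # Firehose expects raw bytes; a trailing newline keeps records
    # line-delimited once Firehose batches them into S3 objects.
    firehose.put_record(
        DeliveryStreamName=DELIVERY_STREAM,
        Record={"Data": (json.dumps(record) + "\n").encode("utf-8")},
    )

if __name__ == "__main__":
    for i in range(10):
        send_reading(f"device-{i % 3}")
        time.sleep(1)

Once records arrive, Firehose buffers and delivers them to the configured destination (for example an S3 data lake) without any servers to manage, which is the point the video makes about using Firehose for IoT ingestion.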

Tags: product-information, skills-and-how-to, data, analytics, data-integration

Up Next

T3-2 No-Code Machine Learning with Amazon SageMaker Canvas (Level 200) (30:23, Jun 27, 2025)
T2-3 Generative AI Application Development Using AWS (Level 300) (31:49, Jun 27, 2025)
T4-4: How to Prepare for the AWS Certification Exam: AWS Certified Solutions Architect – Associate, Part 2 (26:05, Jun 26, 2025)
T3-1: Your First Container Workload - First Steps with Containers on AWS (32:15, Jun 26, 2025)
BOS-09: Getting Started with Serverless - Serverless Application Development with AWS Lambda (Level 200) (29:37, Jun 26, 2025)