Amazon Web Services

In this comprehensive video, AWS expert Emily Webber explores prompt engineering and fine-tuning techniques for pre-trained foundation models. She covers zero-shot, single-shot, and few-shot prompting, as well as instruction fine-tuning and parameter-efficient methods. The video includes a hands-on demonstration using SageMaker JumpStart to fine-tune GPT-J 6B on SEC filing data, showcasing the power of these techniques for various NLP tasks like summarization, classification, and translation. Webber emphasizes the importance of using instruction-tuned models and provides practical tips for improving model performance through prompt engineering and fine-tuning. This video is an essential resource for developers and data scientists looking to leverage generative AI capabilities on AWS.
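The snippet below is a minimal sketch of what a JumpStart fine-tuning job like the one demonstrated can look like using the SageMaker Python SDK's JumpStartEstimator. It is not code from the video: the model ID, instance type, hyperparameters, S3 locations, and the sample prompt are illustrative assumptions you would replace with your own values.

```python
# Minimal sketch (illustrative, not from the video): fine-tuning GPT-J 6B
# with SageMaker JumpStart via the SageMaker Python SDK.
import sagemaker
from sagemaker.jumpstart.estimator import JumpStartEstimator

role = sagemaker.get_execution_role()  # assumes a SageMaker execution role is available

estimator = JumpStartEstimator(
    model_id="huggingface-textgeneration1-gpt-j-6b",  # hypothetical JumpStart model ID
    role=role,
    instance_type="ml.g5.12xlarge",   # illustrative; choose per quota and cost
    instance_count=1,
    hyperparameters={"epochs": "3", "learning_rate": "5e-5"},  # example values
)

# Domain-adaptation data (e.g., SEC filing text) staged in S3 (hypothetical bucket/prefix).
estimator.fit({"training": "s3://your-bucket/sec-filings/"})

# Deploy the fine-tuned model and send a simple zero-shot prompt.
predictor = estimator.deploy()
print(predictor.predict({"inputs": "Summarize the key risk factors in this filing: ..."}))
```

The same estimator-then-deploy pattern applies to other JumpStart foundation models; only the model ID, training data, and hyperparameters change.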

product-information
skills-and-how-to
generative-ai
ai-ml
gen-ai

Up Next

T3-2: No-Code Machine Learning with Amazon SageMaker Canvas (Level 200)
30:23 - Jun 27, 2025

T2-3: Generative AI Application Development with AWS (Level 300)
31:49 - Jun 27, 2025

T4-4: How to Prepare for AWS Certification Exams: AWS Certified Solutions Architect – Associate, Part 2
26:05 - Jun 26, 2025

T3-1: Getting Started with Container Workloads - A First Step Toward Using Containers on AWS
32:15 - Jun 26, 2025

BOS-09: Getting Started with Serverless - Building Serverless Applications with AWS Lambda (Level 200)
29:37 - Jun 26, 2025