Amazon Web Services

In this comprehensive video, AWS machine learning specialist Emily Webber introduces the process of pretraining foundation models on AWS. She explains when and why to create a new foundation model, comparing that path to fine-tuning an existing one, and discusses the data requirements, compute resources, and business justification needed for a pretraining project. Webber then delves into distributed training techniques on Amazon SageMaker, including data parallelism and model parallelism. The video concludes with a detailed walkthrough of pretraining a 30-billion-parameter GPT-2 model using SageMaker's distributed training capabilities. Viewers can access the accompanying notebook resources to follow along with the demonstration.
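As a rough illustration of the model-parallel setup the video covers, the sketch below builds a `distribution` configuration for the SageMaker model parallelism library (`smdistributed.modelparallel`). All degree values, the script name, and the instance counts are illustrative assumptions, not settings taken from the video or its notebooks.

```python
# Sketch: a SageMaker `distribution` config enabling the model parallelism
# library, the kind of setup used when pretraining very large models such
# as a 30-billion-parameter GPT-2 variant. Values below are assumptions.
distribution = {
    "smdistributed": {
        "modelparallel": {
            "enabled": True,
            "parameters": {
                "pipeline_parallel_degree": 2,  # split layer groups across GPUs (assumed)
                "tensor_parallel_degree": 8,    # split individual layers (assumed)
                "microbatches": 4,              # pipeline micro-batching (assumed)
                "ddp": True,                    # combine with data parallelism
            },
        }
    },
    "mpi": {
        "enabled": True,
        "processes_per_host": 8,  # one process per GPU on, e.g., ml.p4d.24xlarge
    },
}

# In an actual run this dict would be passed to a SageMaker PyTorch
# estimator, roughly like this (role, script, and S3 path are hypothetical):
#
#   from sagemaker.pytorch import PyTorch
#   estimator = PyTorch(
#       entry_point="train.py",          # hypothetical training script
#       role=role,                       # your SageMaker execution role
#       instance_type="ml.p4d.24xlarge",
#       instance_count=4,
#       framework_version="1.13",
#       py_version="py39",
#       distribution=distribution,
#   )
#   estimator.fit({"train": "s3://..."})
```

The pipeline and tensor parallel degrees multiply into the model-parallel group size; the remaining GPUs across instances form the data-parallel dimension.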

product-information
skills-and-how-to
generative-ai
ai-ml
gen-ai

Up Next

5:35

AWS WAF (Web Application Firewall): protect your web applications from common web exploits

Jun 26, 2025
16:03

A conversation with Hiếu Trần, co-founder of NAB Studio

Jun 26, 2025
18:40

Designing shared network infrastructure in a multi-account AWS environment (Level 200)

Jun 26, 2025
7:59

Deploying and operating container applications in a multi-account AWS environment (Level 300)

Jun 26, 2025
7:06

How to use Amazon S3 (Level 100)

Jun 26, 2025