Amazon Web Services

In this comprehensive video, AWS expert Emily Webber explores prompt engineering and fine-tuning techniques for pre-trained foundation models. She covers zero-shot, single-shot, and few-shot prompting, as well as instruction fine-tuning and parameter-efficient methods. The video includes a hands-on demonstration using SageMaker JumpStart to fine-tune GPT-J 6B on SEC filing data, showcasing the power of these techniques for various NLP tasks like summarization, classification, and translation. Webber emphasizes the importance of using instruction-tuned models and provides practical tips for improving model performance through prompt engineering and fine-tuning. This video is an essential resource for developers and data scientists looking to leverage generative AI capabilities on AWS.
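The zero-shot and few-shot prompting techniques mentioned above differ only in how the prompt is assembled. Below is a minimal sketch of that assembly for a sentiment-classification task; the `build_prompt` helper, labels, and example texts are illustrative assumptions, not taken from the video.

```python
def build_prompt(task, examples, query):
    """Assemble a prompt: task instruction, optional labeled examples, then the query."""
    parts = [task]
    for text, label in examples:  # zero-shot when `examples` is empty
        parts.append(f"Text: {text}\nLabel: {label}")
    parts.append(f"Text: {query}\nLabel:")  # model completes after the final "Label:"
    return "\n\n".join(parts)

task = "Classify the sentiment of each text as positive or negative."

# Zero-shot: the instruction alone, no examples
zero_shot = build_prompt(task, [], "The quarterly filing exceeded expectations.")

# Few-shot: a handful of labeled examples guide the model toward the expected format
few_shot = build_prompt(
    task,
    [("Revenue grew 20% year over year.", "positive"),
     ("The company missed its earnings target.", "negative")],
    "The quarterly filing exceeded expectations.",
)
print(few_shot)
```

The same string would then be sent to a hosted model endpoint; instruction-tuned models, as the video notes, tend to follow the task line more reliably than base models.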

Tags: product-information, skills-and-how-to, generative-ai, ai-ml, gen-ai

Up Next

- Building Intelligent Chatbots: Integrating Amazon Lex with Bedrock Knowledge Bases for Enhanced Customer Experiences (18:11, Nov 22, 2024)
- The State of Generative AI: Unlocking Trillion-Dollar Business Value Through Responsible Implementation and Workflow Reimagination (21:56, Nov 22, 2024)
- AWS Summit Los Angeles 2024: Unleashing Generative AI's Potential - Insights from Matt Wood and Industry Leaders (1:19:03, Nov 22, 2024)
- Unlocking Business Value with Generative AI: Key Use Cases and Implementation Strategies (50:05, Nov 22, 2024)
- Simplifying Graph Queries with Amazon Neptune and LangChain: Harnessing AI for Intuitive Data Exploration (15:41, Nov 22, 2024)