Event details
Hosting DeepSeek and Open Source LLMs on AWS (Vietnam)
AI
Amazon Bedrock
Generative AI
SageMaker
Online
Thi Nguyen | ISV Solutions Architect, AWS, Kien Nguyen | Startup Solutions Architect, AWS, Luc Tran | Account Representative - Startups, AWS
English
200 - Intermediate
Event will be delivered in Vietnamese
DeepSeek-R1 has become a trending topic thanks to its reasoning ability, and we have received many questions about how to deploy the model, or its distilled variants, on AWS.
This extends to a broader question: how can customers deploy open source LLMs on AWS in general? This 1.5-hour session covers exactly that. We will dive deeper into the DeepSeek-R1 models, the possible ways to deploy them on AWS, and how to get started with those models on AWS, including potential experimentation credits.
You will also learn how to deploy DeepSeek-R1 distilled models in a cost-optimized way, using Scale Down to Zero on Amazon SageMaker and AWS Graviton4 CPUs on Amazon EC2.
Join this session to learn more.
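As a rough illustration of the "Scale Down to Zero" idea mentioned above, SageMaker real-time endpoints that use inference components support managed instance scaling with a minimum instance count of zero, so idle endpoints stop incurring instance cost. The sketch below is a hypothetical endpoint-config payload only; the config name, role ARN, instance type, and counts are all placeholder assumptions, not values from this event.

```python
# Hypothetical sketch (not from the session): an endpoint-config payload
# shaped for SageMaker managed instance scaling with MinInstanceCount=0,
# i.e. the variant is allowed to scale down to zero instances when idle.
# All names, ARNs, and sizes below are illustrative placeholders.

endpoint_config = {
    "EndpointConfigName": "deepseek-r1-distill-config",                  # assumed name
    "ExecutionRoleArn": "arn:aws:iam::123456789012:role/SageMakerRole",  # placeholder
    "ProductionVariants": [
        {
            "VariantName": "AllTraffic",
            "InstanceType": "ml.g5.2xlarge",   # example GPU instance for a distilled model
            "InitialInstanceCount": 1,
            # The key setting: instances may scale between 0 and 2.
            "ManagedInstanceScaling": {
                "Status": "ENABLED",
                "MinInstanceCount": 0,
                "MaxInstanceCount": 2,
            },
        }
    ],
}

# With boto3, a payload like this would be passed to:
#   boto3.client("sagemaker").create_endpoint_config(**endpoint_config)
# (not executed here, since it requires AWS credentials and quota)
```

The session itself walks through the concrete deployment options; treat this as a sketch of the scale-to-zero configuration shape, not the exact setup presented.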
Agenda
7:00 AM UTC
Overview of DeepSeek-R1 & Open Source LLMs on AWS: Deployment options
7:45 AM UTC
Production Practices: Open Source LLMs on AWS
8:15 AM UTC
Next Steps: AWS Funding Programs and Partners