Artificial Intelligence
Tag: HuggingFace
New performance improvements in Amazon SageMaker model parallel library
Foundation models are large deep learning models trained on vast quantities of data at scale. They can be further fine-tuned to perform a variety of downstream tasks and form the core backbone of many AI applications. The most prominent category is large language models (LLMs), including auto-regressive models such as GPT variants trained to complete […]
Train gigantic models with near-linear scaling using sharded data parallelism on Amazon SageMaker
In the pursuit of superior accuracy, deep learning models in areas such as natural language processing and computer vision have grown significantly in size in the past few years, frequently measured in tens to hundreds of billions of parameters. Training these gigantic models is challenging and requires complex distribution strategies. Data scientists and machine learning […]
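The core idea behind sharded data parallelism is ZeRO-style sharding: instead of every data-parallel worker holding a full copy of the parameters and optimizer state, each worker owns only a shard, updates it locally, and the shards are all-gathered back into the full parameter vector. A minimal single-process NumPy sketch of that idea (function names are illustrative, not the SageMaker library's API):

```python
import numpy as np

def shard_params(params, num_workers):
    """Split a flat parameter vector into near-equal shards, one per worker."""
    return np.array_split(params, num_workers)

def sharded_sgd_step(params, grads, num_workers, lr=0.1):
    """One SGD step where each worker updates only its own shard,
    then the updated shards are all-gathered into the full vector."""
    param_shards = shard_params(params, num_workers)
    grad_shards = shard_params(grads, num_workers)
    # Each "worker" applies SGD to its shard only, so parameter and
    # optimizer-state memory is divided across workers rather than replicated.
    updated = [p - lr * g for p, g in zip(param_shards, grad_shards)]
    # All-gather: reassemble the full parameter vector on every worker.
    return np.concatenate(updated)

params = np.ones(10)
grads = np.full(10, 0.5)
new_params = sharded_sgd_step(params, grads, num_workers=4)
print(new_params)  # every element: 1 - 0.1 * 0.5 = 0.95
```

The result is numerically identical to an unsharded SGD step; only the memory layout changes, which is what lets the technique scale to models with hundreds of billions of parameters.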
Build a news-based real-time alert system with Twitter, Amazon SageMaker, and Hugging Face
Today, social media is a major source of news, and users rely on platforms like Facebook and Twitter to consume it. For certain industries, such as insurance companies, first responders, law enforcement, and government agencies, quickly processing news about relevant events can help them take action while those events are still unfolding. […]
Train 175+ billion parameter NLP models with model parallel additions and Hugging Face on Amazon SageMaker Distributed Training Libraries
November 2023: This post was reviewed for accuracy. The last few years have seen rapid development in the field of natural language processing (NLP). While hardware has improved, such as with the latest generation of accelerators from NVIDIA and Amazon, advanced machine learning (ML) practitioners still regularly run into issues scaling their large language models […]