AWS Machine Learning Blog

Category: Artificial Intelligence

Introduction to Amazon SageMaker Object2Vec 

In this blog post, we’re introducing the Amazon SageMaker Object2Vec algorithm, a new, highly customizable, multi-purpose algorithm that can learn low-dimensional dense embeddings of high-dimensional objects. Embeddings are an important feature engineering technique in machine learning (ML). They convert high-dimensional vectors into a low-dimensional space to make it easier to do machine learning […]
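As a conceptual illustration (not Object2Vec itself), an embedding table maps each discrete, high-cardinality object to a learned low-dimensional dense vector, and similarity between objects reduces to vector geometry. A minimal sketch, with randomly initialized vectors standing in for learned ones:

```python
import numpy as np

# Conceptual sketch of an embedding table (not the Object2Vec algorithm):
# each object ID maps to a low-dimensional dense vector; after training,
# similar objects end up with nearby vectors.
vocab = {"book_a": 0, "book_b": 1, "movie_c": 2}
rng = np.random.default_rng(0)
embeddings = rng.normal(size=(len(vocab), 4))  # 4-dim dense vectors

def embed(obj_id):
    """Look up the dense vector for a discrete object."""
    return embeddings[vocab[obj_id]]

def cosine(u, v):
    """Cosine similarity: the usual way to compare embeddings."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))
```

In a trained model the table would be learned from data rather than sampled at random; the lookup-and-compare pattern stays the same.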

Read More

K-means clustering with Amazon SageMaker

Amazon SageMaker provides several built-in machine learning (ML) algorithms that you can use for a variety of problem types. These algorithms provide high-performance, scalable machine learning optimized for speed, scale, and accuracy. Using these algorithms, you can train on petabyte-scale data. They are designed to provide up to 10x the performance of the other […]
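SageMaker’s built-in k-means is a distributed, web-scale implementation, but the underlying algorithm is simple to state. A minimal single-machine sketch of Lloyd’s algorithm, for illustration only (it uses a naive “first k points” initialization, where real implementations use random or k-means++ seeding):

```python
import numpy as np

def kmeans(X, k, n_iter=100):
    """Minimal Lloyd's algorithm: alternate assignment and update steps.
    Naive 'first k points' initialization, for illustration only."""
    centroids = X[:k].astype(float).copy()
    for _ in range(n_iter):
        # Assignment step: label each point with its nearest centroid.
        dists = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # Update step: move each centroid to the mean of its points,
        # keeping a centroid in place if it owns no points.
        new_centroids = np.array([
            X[labels == j].mean(axis=0) if np.any(labels == j) else centroids[j]
            for j in range(k)
        ])
        if np.allclose(new_centroids, centroids):
            break  # converged
        centroids = new_centroids
    return centroids, labels
```

The built-in algorithm parallelizes these two steps across instances and data shards, which is what makes petabyte-scale training practical.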

Read More

AWS expands HIPAA eligible machine learning services for healthcare customers

Today, AWS announced that Amazon Translate, Amazon Comprehend, and Amazon Transcribe are now U.S. Health Insurance Portability and Accountability Act of 1996 (HIPAA) eligible services. This announcement adds to the list of AWS artificial intelligence services that are already HIPAA eligible: Amazon Polly, Amazon SageMaker, and Amazon Rekognition. By using these services, AWS customers in […]

Read More

Now easily perform incremental learning on Amazon SageMaker

Data scientists and developers can now easily perform incremental learning on Amazon SageMaker. Incremental learning is a machine learning (ML) technique for extending the knowledge of an existing model by training it further on new data. Starting today, both of the Amazon SageMaker built-in visual recognition algorithms – Image Classification and Object Detection – will […]
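The idea can be illustrated outside SageMaker: an incremental learner keeps its parameters between training calls, so new data refines the existing model instead of replacing it. A toy online perceptron sketch (conceptual only; the class and its `partial_fit` method are illustrative and are not SageMaker’s API):

```python
import numpy as np

class Perceptron:
    """Tiny online learner illustrating incremental training: weights
    persist between calls, so new data extends rather than replaces
    what the model has already learned. Conceptual sketch only."""
    def __init__(self, n_features):
        self.w = np.zeros(n_features)
        self.b = 0.0

    def partial_fit(self, X, y, epochs=5):
        # y in {-1, +1}; standard perceptron update on mistakes only.
        for _ in range(epochs):
            for xi, yi in zip(X, y):
                if yi * (xi @ self.w + self.b) <= 0:
                    self.w += yi * xi
                    self.b += yi
        return self

    def predict(self, X):
        return np.where(X @ self.w + self.b > 0, 1, -1)
```

With the built-in visual recognition algorithms, the same principle applies at a larger scale: a new training job starts from an existing model’s weights rather than from scratch.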

Read More

Direct access to Amazon SageMaker notebooks from Amazon VPC by using an AWS PrivateLink endpoint

Amazon SageMaker now supports AWS PrivateLink for notebook instances. In this post, I will show you how to set up AWS PrivateLink to secure your connection to Amazon SageMaker notebooks. Maintaining compliance with regulations such as HIPAA or PCI may require preventing information from traversing the internet. Additionally, preventing exposure of data to the public internet reduces the likelihood […]
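Setting up the connection amounts to creating an interface VPC endpoint for the SageMaker notebook service. A hedged boto3-style sketch; the `aws.sagemaker.<region>.notebook` service-name format and all resource IDs below are assumptions to verify against the AWS documentation for your Region:

```python
def notebook_endpoint_params(vpc_id, subnet_ids, security_group_ids,
                             region="us-east-1"):
    """Build the ec2 create_vpc_endpoint request for SageMaker notebooks.
    The 'aws.sagemaker.<region>.notebook' service name is an assumption;
    confirm it for your Region before use."""
    return {
        "VpcEndpointType": "Interface",
        "VpcId": vpc_id,
        "ServiceName": f"aws.sagemaker.{region}.notebook",
        "SubnetIds": subnet_ids,
        "SecurityGroupIds": security_group_ids,
        "PrivateDnsEnabled": True,
    }

def create_notebook_endpoint(ec2_client, **kwargs):
    """Pass boto3.client('ec2') as ec2_client, e.g.:
       create_notebook_endpoint(boto3.client('ec2'),
                                vpc_id='vpc-...',
                                subnet_ids=['subnet-...'],
                                security_group_ids=['sg-...'])"""
    return ec2_client.create_vpc_endpoint(**notebook_endpoint_params(**kwargs))
```

With private DNS enabled, traffic to the notebook URL resolves to the endpoint’s private IP addresses instead of traversing the internet.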

Read More

Customize your notebook volume size, up to 16 TB, with Amazon SageMaker

Amazon SageMaker now allows you to customize the notebook storage volume when you need to store larger amounts of data. Allocating the right storage volume for your notebook instance is important when you develop machine learning models. You can use the storage volume to locally process a large dataset or to temporarily store other data to work with. […]
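When creating the instance, the volume size is a single request parameter. A hedged sketch of assembling the `create_notebook_instance` request with boto3 parameter names (the default instance type and the validation bounds reflect the 5 GB to 16 TB range described above; verify both against current AWS documentation):

```python
def notebook_instance_params(name, role_arn, volume_size_gb=100,
                             instance_type="ml.t2.medium"):
    """Request parameters for sagemaker.create_notebook_instance.
    VolumeSizeInGB accepts 5 to 16384, i.e., 5 GB up to 16 TB."""
    if not 5 <= volume_size_gb <= 16384:
        raise ValueError("volume size must be between 5 GB and 16 TB")
    return {
        "NotebookInstanceName": name,
        "InstanceType": instance_type,
        "RoleArn": role_arn,
        "VolumeSizeInGB": volume_size_gb,
    }
```

You would pass the resulting dict to `boto3.client("sagemaker").create_notebook_instance(**params)`.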

Read More

Lifecycle configuration update for Amazon SageMaker notebook instances

Amazon SageMaker now allows customers to update or disassociate lifecycle configurations for notebook instances with the updated APIs. You can associate, switch between, or disable lifecycle configurations as needed by stopping your notebook instance and calling the UpdateNotebookInstance API at any point in the notebook instance’s lifespan. Lifecycle configurations are handy when you want to organize and automate the setup that is […]
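Both operations go through the same API; which parameter you set determines whether a configuration is attached, switched, or removed. A hedged sketch of assembling the UpdateNotebookInstance request using boto3 parameter names:

```python
def lifecycle_update_params(notebook_name, lifecycle_config=None):
    """Parameters for sagemaker.update_notebook_instance: pass a config
    name to associate or switch, or None to disassociate. The notebook
    instance must be stopped before the update is applied."""
    params = {"NotebookInstanceName": notebook_name}
    if lifecycle_config is None:
        # Remove the currently attached lifecycle configuration.
        params["DisassociateLifecycleConfig"] = True
    else:
        # Attach a new configuration, or switch from the current one.
        params["LifecycleConfigName"] = lifecycle_config
    return params
```

You would pass the resulting dict to `boto3.client("sagemaker").update_notebook_instance(**params)`, then start the instance again.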

Read More

Now use Pipe mode with CSV datasets for faster training on Amazon SageMaker built-in algorithms

Amazon SageMaker built-in algorithms now support Pipe mode for fetching datasets in CSV format from Amazon Simple Storage Service (Amazon S3) into Amazon SageMaker while training machine learning (ML) models. With Pipe input mode, the data is streamed directly to the algorithm container while model training is in progress. This is unlike File mode, which downloads […]
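Conceptually, Pipe mode replaces “download the full file, then read it” with “consume records as they stream in.” A minimal Python sketch of that access pattern (an illustration of the idea, not the SageMaker implementation):

```python
import csv

def stream_csv(fileobj, batch_size=2):
    """Yield record batches as they are read, without materializing the
    whole dataset first. Conceptual illustration of Pipe-mode-style
    streaming access to CSV data."""
    reader = csv.reader(fileobj)
    batch = []
    for row in reader:
        batch.append(row)
        if len(batch) == batch_size:
            yield batch
            batch = []
    if batch:
        # Flush the final partial batch.
        yield batch
```

File mode, by contrast, corresponds to reading only after the entire object has been copied to local disk, which delays training start and requires storage for the full dataset.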

Read More

Model Server for Apache MXNet v1.0 released

AWS recently released Model Server for Apache MXNet (MMS) v1.0, featuring a new API for managing the state of the service, including the ability to load models dynamically at runtime with lower latency and higher throughput. In this post, we will explore the new features and showcase the performance gains of the […]

Read More

Using deep learning on AWS to lower property damage losses from natural disasters

Natural disasters like the 2017 Santa Rosa fires and Hurricane Harvey cost hundreds of billions of dollars in property damage every year, wreaking economic havoc on the lives of homeowners. Insurance companies do their best to evaluate affected homes, but it could take weeks before assessments are available and salvaging and protecting the homes can […]

Read More