Partner SA Roundup – July 2017

This month, Juan Villa, Pratap Ramamurthy, and Roy Rodan from the Emerging Partner SA team highlight a few of the partners they work with. They’ll be exploring Microchip, Domino, and Cohesive Networks.

Microchip Zero Touch Secure Provisioning Kit, by Juan Villa

AWS IoT is a managed cloud platform that enables connected devices to easily and securely interact with cloud applications and other devices. In order for devices to interact with AWS IoT via the Message Queuing Telemetry Transport (MQTT) protocol, they must first authenticate using Transport Layer Security (TLS) mutual authentication. This process relies on X.509 certificates on both the devices and in AWS IoT. Each certificate is tied to a public/private key pair: the certificate carries the public key, and the device must hold the corresponding private key in order to establish a TLS connection with mutual authentication.
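To make the handshake concrete, here is a minimal device-side sketch using the open source paho-mqtt library (1.x API); the endpoint, file names, and topic are hypothetical placeholders, and the AWS IoT Device SDK is another common choice.

```python
import ssl
import paho.mqtt.client as mqtt

# Hypothetical values -- replace with your own AWS IoT endpoint and credential files.
ENDPOINT = "example.iot.us-east-1.amazonaws.com"   # your AWS IoT endpoint
ROOT_CA = "root-ca.pem"                            # root CA used to validate AWS IoT
DEVICE_CERT = "device-cert.pem"                    # X.509 certificate registered with AWS IoT
PRIVATE_KEY = "device-private.key"                 # private key matching the certificate

client = mqtt.Client(client_id="example-device-01")

# TLS mutual authentication: the device presents its certificate and private key,
# and validates the AWS IoT endpoint against the root CA.
client.tls_set(ca_certs=ROOT_CA,
               certfile=DEVICE_CERT,
               keyfile=PRIVATE_KEY,
               tls_version=ssl.PROTOCOL_TLSv1_2)

client.connect(ENDPOINT, port=8883)
client.publish("devices/example-device-01/telemetry", payload='{"temp": 21.5}', qos=1)
client.loop(2)  # process network traffic so the publish completes
client.disconnect()
```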

Private keys can be difficult to store securely on IoT devices. It's easy to simply keep the key in a device's local memory, but that does little to protect it: the hardware needed to read the contents of most microcontrollers and memory components used on IoT devices is easy and affordable to obtain. Private keys used for authentication and establishing trust therefore need to be stored in a more secure manner.

This is where a secure element chip comes in! Microchip, an APN Advanced Technology Partner, is a silicon manufacturer that makes a secure element chip called the ATECC508A. This chip provides hardware-based, tamper-resistant key storage: once a private key is stored in the ATECC508A, it cannot be read out. Instead, the chip performs the necessary cryptographic operations internally, with hardware acceleration that is both fast and power efficient, so the key never has to leave the secure element. When considering the ATECC508A for your product, keep in mind that Microchip can preload certificates onto the secure element during manufacturing, before delivery. Combining this capability with AWS IoT's support for custom certificate authorities and just-in-time registration can streamline device provisioning and security.
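As a rough illustration of the custom CA side of that workflow, the sketch below uses boto3 to register a CA certificate with AWS IoT and enable auto-registration, so device certificates signed by that CA can be registered just in time. The file names are hypothetical, and the verification certificate must be created with the registration code returned by get_registration_code().

```python
import boto3

iot = boto3.client("iot")

# Hypothetical PEM files: the CA certificate, plus a verification certificate
# whose common name is the registration code from iot.get_registration_code().
with open("ca-cert.pem") as f:
    ca_cert = f.read()
with open("verification-cert.pem") as f:
    verification_cert = f.read()

response = iot.register_ca_certificate(
    caCertificate=ca_cert,
    verificationCertificate=verification_cert,
    setAsActive=True,
    allowAutoRegistration=True,  # allows just-in-time registration of device certificates
)
print("Registered CA certificate:", response["certificateId"])
```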

To make this secure element easy to try out, Microchip offers an evaluation kit called the Zero Touch Secure Provisioning Kit. The kit includes a SAM G55 Cortex-M4 microcontroller, the ATECC508A secure element, and a power-efficient ATWINC1500 802.11 b/g/n module, and it comes with instructions for getting started with AWS IoT. With this combination of silicon products, you can begin developing and testing your next IoT product in a secure fashion.

Before you work on your next IoT project, I recommend that you consider a secure element in your design. For more information on ATECC508A, please read the datasheet on the Microchip website.

 

Domino Data Science Platform, by Pratap Ramamurthy

Machine learning, artificial intelligence, and predictive analytics are all data science techniques. Data scientists analyze data, search for insights that can be extracted, and build predictive models to solve business problems. To support these tasks, a new generation of tools, such as Jupyter notebooks, has become popular, along with a wide variety of software packages ranging from deep learning frameworks like MXNet to CUDA drivers. Data science as a field is growing rapidly as companies increase their reliance on these technologies.

However, supporting a team of data scientists can be challenging. They need access to different tools and software packages, as well as a variety of servers connected to the cloud. They want to collaborate by sharing projects, not just code or results. They want to be able to publish models with minimal friction. While data scientists want flexibility, companies need to ensure security and compliance. Companies also need to understand how resources like data and compute power are being used.

Domino, an APN Advanced Technology Partner, solves these challenges by providing a convenient platform where data scientists can spin up interactive workspaces using the tools they already know and love (e.g., Jupyter, RStudio, and Zeppelin), as well as commercial languages like SAS and MATLAB, as seen in the diagram below.

Image used with permission

In the Domino platform, users can run experiments on a wide variety of instance types that mirror the latest Amazon EC2 options, as seen in the screenshot. Customers can run a notebook on instances with up to 2 TB of RAM by using the Amazon EC2 X1 instance family. If more computational power is needed, they can switch the same notebook to GPU instances or connect to a Spark cluster.

Because the software used for data science and machine learning has several layers, and new technologies are introduced and adopted rapidly, data science environments are often difficult to deploy and manage. Domino solves this problem by storing notebooks, along with their software dependencies, inside a Docker image. This allows the same code to be rerun consistently in the future, with no need to manually reconstruct the environment, which saves valuable time for data scientists.

Domino also helps data scientists share and collaborate by bringing software development practices such as code sharing, peer review, and discussions seamlessly into the data science platform.

For companies that have not yet started their cloud migration, Domino on AWS makes data science an excellent first project. Domino runs entirely on AWS and integrates with many AWS services. Customers who have stored large amounts of data in Amazon S3 can easily access it from within Domino. After training their models on this data, they can deploy a machine learning model into AWS with the click of a button and, within minutes, access it through an API. All of these features help data scientists focus on data science rather than the underlying platform.

Today, the Domino Data Science Platform is available as a SaaS offering on the Domino website. If you prefer to run the Domino software in your own virtual private cloud (VPC), you can install it by using an AWS CloudFormation template that Domino provides. Domino also offers a managed service option that runs the Data Science Platform in a dedicated VPC. Before considering those options, you can get a quick feel for the platform by signing up for a free trial.

 

Cohesive Networks, by Roy Rodan

Many AWS customers have a hybrid network topology where part of their infrastructure is on premises and part is within the AWS Cloud. Most IT experts and developers aren’t concerned with where the infrastructure resides—all they want is easy access to all their resources, remote or local, from their local networks.

So how do you manage all these networks as a single distributed network in a secure fashion? The configuration and maintenance of such a complex environment can be challenging.

Cohesive Networks, an APN Advanced Technology Partner, has a product called VNS3:vpn, which helps alleviate some of these challenges. The VNS3 product family helps you build and manage a secure, highly available, and self-healing network between multiple regions, cloud providers, and/or physical data centers. VNS3:vpn is available as an Amazon Machine Image (AMI) on the AWS Marketplace, and can be deployed on an Amazon EC2 instance inside your VPCs.
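As a rough sketch of that deployment step, the snippet below launches a VNS3 controller AMI from AWS Marketplace into a VPC subnet with boto3. The AMI ID, subnet, security group, and key pair are hypothetical placeholders, and appliances that route traffic between networks typically also need the EC2 source/destination check disabled.

```python
import boto3

ec2 = boto3.client("ec2")

# Hypothetical IDs -- use the VNS3 AMI ID for your Region from AWS Marketplace.
response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",       # hypothetical VNS3:vpn Marketplace AMI
    InstanceType="t2.medium",
    MinCount=1,
    MaxCount=1,
    SubnetId="subnet-0123456789abcdef0",   # subnet inside your VPC
    SecurityGroupIds=["sg-0123456789abcdef0"],
    KeyName="example-keypair",
)
instance_id = response["Instances"][0]["InstanceId"]

# Controllers that route or NAT traffic generally require the
# source/destination check to be turned off on the instance.
ec2.modify_instance_attribute(
    InstanceId=instance_id,
    SourceDestCheck={"Value": False},
)
```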

One of the interesting features of VNS3 is its ability to create meshed connectivity between multiple locations and run an overlay network on top. This effectively creates a single distributed network across locations by peering several remote VNS3 controllers.

Here is an example of a network architecture that uses VNS3 for peering:

The VNS3 controllers act as six machines in one, to address all your network needs:

  • Router
  • Switch
  • SSL/IPsec VPN concentrator
  • Firewall
  • Protocol redistributor
  • Extensible network functions virtualization (NFV)

The setup process is straightforward and well-documented with both how-to videos and detailed configuration guides.

Cohesive Networks also provides a web-based monitoring and management system called VNS3:ms, which runs on a separate server and lets you update your network topology, fail over between VNS3 controllers, and monitor the performance of your network and instances.

See the VNS3 family offerings from Cohesive Networks in AWS Marketplace, and start building your secure, cross-connected network. Also, be sure to head over to the Cohesive Networks website to learn more about the VNS3 product family.

Why Our Customers Love Amazon Machine Learning – A Guest Post from 47Lining

Mick is CEO of 47Lining, an AWS Advanced Consulting Partner in the AWS Partner Network (APN) with the Big Data Competency designation. He holds an AWS Solutions Architect Professional certification.

Amazon Machine Learning is a service that provides predictive capabilities, the results of which can be incorporated into a wide variety of downstream applications and business processes. At 47Lining we've had the opportunity to partner with customers in several industry verticals to apply Amazon Machine Learning. These efforts apply predictive capabilities to optimize a wide range of operational and consumer-centric processes, such as establishing supply chain delivery expectations, preventing customer churn, and identifying future consumer credit behaviors.

Amazon Machine Learning generates predictions from fused data sources to improve business results

As with any relatively new service or capability, customers want to understand how it is positioned in the market and how it compares to alternative choices that are available to them. As our customers have adopted Amazon Machine Learning, we listened to what they really like about it. Here’s what we’ve heard:

Amazon Machine Learning enables widespread application of predictive analytics. Amazon Machine Learning is easy to use. It can be applied by a broader array of contributors than has ever been the case. This is driving the democratization – and commoditization – of using predictive capabilities to optimize business processes. Our customers love the “gain” that Amazon Machine Learning provides to small agile teams. The visual tools provided in the AWS Console help diverse practitioners easily assess training datasets, review model quality and iteratively refine their approach. The service makes it easier for practitioners to apply machine learning to successfully enhance key metrics for high-value activities. Because the service makes such optimizations more repeatable and cost-effective, customers can scale their efforts to improve additional business processes.


Amazon Machine Learning covers a broad array of common business processes from many industries. While it is not a fit for all categories of learning problems, the learning approaches implemented today within Amazon Machine Learning allow our customers to increase efficiency in a wide array of business processes.  For example:

  • Accurately Predicting Customer Churn. 47Lining worked closely with AWS to deliver a predictive analytics engagement for a media and entertainment customer with investments in original content programming. The engagement focused on the business impact of predicting customer churn. We fused online video platform logs with third-party ZIP code demographics and social media sentiment, then analyzed the combined data to extract features that drive the learning process, such as viewing hours per quarter, daytime vs. nighttime viewing, and depth of long-tail viewing for each user. Amazon Machine Learning was used to predict customer churn with 71% accuracy (a rough sketch of this kind of training workflow follows this list). These predictive capabilities enabled our customer to create a "sticky" conversation with their customers through relevant offers for viewer retention. We estimate that for a typical video subscription platform, modest improvements in churn rate can yield on the order of $2-5M in annual retained subscriber revenues and avoided replacement customer acquisition costs.
  • Predicting Consumer Credit Behavior. We had the opportunity to work with RevolutionCredit, a pioneering behavioral data and analytics platform that helps creditors identify upwardly mobile customers, leading to more – and higher quality – approvals, lower delinquencies, higher retention, and more engaged consumers. Based on the behavioral data from the platform, we partnered with RevolutionCredit's Chief Scientist, Hutch Carpenter, and CTO, Rama Thamman, to apply Amazon Machine Learning to classify consumers and identify opportunities that benefit both consumers and creditors. This use case represents a "high-volume optimization" scenario – because volumes are high, even modest improvements can have a large impact. For example, identifying the likelihood of write-offs through behavioral signals can provide 12% better predictive power than traditional measures of credit scores and late payments alone. For financial institutions, this improvement can represent millions of dollars in profits.
  • Lead Scoring for Propensity to Purchase Real Estate. One of our customers develops high-end real estate properties. We are helping them develop a lead-scoring model using Amazon Machine Learning that predicts propensity to purchase high-end real estate from a combined set of public and private data sources. The goal is to increase the efficiency of the sales prospecting process, which translates directly into a higher net-present-value return on their capital-intensive construction projects. For a sizable capital project in this industry (on the order of $1 billion), shrinking the average time to close for individual transactions by just 11% through improved targeting can improve the net present value of the project by over $50 million.
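To illustrate the kind of workflow behind the churn example above, here is a minimal sketch of training a binary classification model with the Amazon Machine Learning API via boto3. All IDs, S3 paths, and the schema location are hypothetical placeholders, and real engagements involve considerably more feature engineering than shown here.

```python
import boto3

ml = boto3.client("machinelearning")

# Hypothetical training data staged in S3 as CSV, with a JSON schema that
# marks the churn column as the binary target attribute.
ml.create_data_source_from_s3(
    DataSourceId="ds-churn-training-001",
    DataSourceName="churn-training-data",
    DataSpec={
        "DataLocationS3": "s3://example-bucket/churn/training.csv",
        "DataSchemaLocationS3": "s3://example-bucket/churn/training.csv.schema",
    },
    ComputeStatistics=True,  # required for data sources used in training
)

# Binary classification: will this subscriber churn?
ml.create_ml_model(
    MLModelId="ml-churn-001",
    MLModelName="subscriber-churn-model",
    MLModelType="BINARY",
    TrainingDataSourceId="ds-churn-training-001",
)

# Evaluate against a held-out data source to get metrics such as AUC.
ml.create_evaluation(
    EvaluationId="ev-churn-001",
    EvaluationName="churn-holdout-evaluation",
    MLModelId="ml-churn-001",
    EvaluationDataSourceId="ds-churn-holdout-001",  # hypothetical held-out set
)
```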

These are just a few examples – as our customers continue to create wins through optimizing initial processes, additional opportunities to optimize their business operations or interactions with customers come into focus.

Amazon Machine Learning makes it easy to "go wide" during training. One of the most important steps in producing quality machine learning models is engineering and generating the features used for training from the underlying raw data. It is not always apparent which combinations of features will lead to superior model quality, so most practitioners brainstorm many alternatives and run a lot of empirical experiments to understand which ones produce the best results. Amazon Machine Learning provides a set of feature engineering "recipes" that minimize data preparation requirements. Practitioners can use this capability to easily specify and build many models from the same source data in parallel and quickly observe which ones work best. This conserves the precious time and energy of human practitioners by fanning out to relatively inexpensive compute resources instead, with all of the provisioning required for training handled seamlessly by the service.
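As a hedged sketch of that fan-out pattern, the loop below trains several candidate models from the same training data source, each with a different hypothetical feature-engineering recipe stored in S3; the recipe names, URIs, and data source ID are illustrative only.

```python
import boto3

ml = boto3.client("machinelearning")

# Hypothetical recipe variants, each a JSON recipe document stored in S3.
recipe_uris = {
    "baseline":      "s3://example-bucket/recipes/baseline.json",
    "quantile-bins": "s3://example-bucket/recipes/quantile_bins.json",
    "ngrams-text":   "s3://example-bucket/recipes/ngrams_text.json",
}

for name, uri in recipe_uris.items():
    ml.create_ml_model(
        MLModelId=f"ml-churn-{name}",
        MLModelName=f"churn-model-{name}",
        MLModelType="BINARY",
        TrainingDataSourceId="ds-churn-training-001",  # from the earlier sketch
        RecipeUri=uri,  # alternative feature-engineering recipe to try
    )

# The calls return immediately; Amazon Machine Learning trains the models
# asynchronously, so the candidates can be compared once their evaluations finish.
```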

Amazon Machine Learning has a really simple DevOps model. As optimizations launch to production, this simplicity saves our customers real money because the service "just works". It supports periodic batch predictions as well as elastically scaled, highly available real-time prediction services. Some of our customers require batch predictions to be generated for hundreds of millions of samples each night. Others require a highly available real-time predictor service with consistently low latency at global consumer scale. After we architect a solution with Amazon Machine Learning, our customers need not be concerned with operating and maintaining underlying clusters or other infrastructure. This becomes even more important as predictive analytics are used in an increasing number of critical business processes.
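As a rough sketch of both operating modes, the snippet below creates a real-time endpoint for scoring single records and kicks off a nightly-style batch prediction job; the model ID, record fields, scoring data source, and S3 output path are hypothetical.

```python
import boto3

ml = boto3.client("machinelearning")

# Real-time mode: create an endpoint once, then score individual records.
# (The endpoint takes a few minutes to reach the READY state before predict succeeds.)
endpoint = ml.create_realtime_endpoint(MLModelId="ml-churn-001")
endpoint_url = endpoint["RealtimeEndpointInfo"]["EndpointUrl"]

prediction = ml.predict(
    MLModelId="ml-churn-001",
    Record={"viewing_hours_per_quarter": "42", "daytime_ratio": "0.3"},  # hypothetical features
    PredictEndpointUrl=endpoint_url,
)
print(prediction["Prediction"]["predictedLabel"])

# Batch mode: score a large data source and write the results to S3.
ml.create_batch_prediction(
    BatchPredictionId="bp-churn-nightly-001",
    BatchPredictionName="nightly-churn-scoring",
    MLModelId="ml-churn-001",
    BatchPredictionDataSourceId="ds-churn-scoring-001",  # hypothetical scoring set
    OutputUri="s3://example-bucket/churn/predictions/",
)
```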

Amazon Machine Learning's predictable, elastic pricing model makes it easy to create a winning business case. The elastic pricing for Amazon Machine Learning scales with the operations of your business process. This makes business stakeholders very comfortable, since up-front implementation costs are small and the value of the resulting optimizations typically far exceeds the very predictable costs of the machine learning models and predictions that support them.

Whenever we talk with customers about the opportunities that they see to apply machine learning and predictive analytics, we grow more excited. This is a technology that we feel will reshape most business processes in most companies. We’re quite excited to be helping our customers play a leading role in this transformation and look forward to future releases of Amazon Machine Learning.

Want to learn more about 47Lining? Visit the company’s website here.


Note: The content and opinions in this blog are those of the third party author and AWS is not responsible for the content or accuracy of this post.