Machine Learning - Research (EN)

Research Guidebook: Deep Learning on AWS

Issue link: https://read.uberflip.com/i/1301626

NucleusResearch.com | Document Number: T147 | October 2019

User profile – Enterprise Software Company

A global enterprise software company that produces applications primarily for sales and service teams adopted Amazon SageMaker to manage its TensorFlow deployment. Its deep learning efforts center on sentiment analysis and classifying customer interactions in order to understand how different types of outreach affect a customer's likelihood to churn or buy again. As a large technology company, it has been ahead of the broader market with its deep learning efforts. Before SageMaker, it custom-built the bulk of its TensorFlow-based deep learning infrastructure. At the start of the year, it decided to migrate the self-managed TensorFlow deployment to SageMaker, where it could be managed as a service. While the migration is ongoing and not yet complete, the organization has so far been able to reassign three FTEs who were primarily responsible for managing the TensorFlow ecosystem. Model training is also dramatically faster, since SageMaker automatically distributes the compute load across multiple CPUs or GPUs in parallel. The company reported that deploying a new model with SageMaker takes less than 50 percent of the time needed in a self-managed environment.

User profile – Application Development Company

An application development company that specializes in creating voice-integrated games playable through Amazon Alexa, the voice assistant, built a deep learning project entirely on Amazon SageMaker that recommends games to keep user engagement high. The system uses data from previously played games to predictively recommend similar games to the user. The company built its business on the AWS platform, so it chose SageMaker for its native integration with the company's existing AWS architecture, particularly Amazon S3 and AWS Lambda for accessing stored data and serverless compute.
Additionally, since the company was already on AWS, user permissions and DevOps procedures had already been formalized, allowing it to avoid duplicating that effort. The parallelized model training and control-level UI made training and evaluating models much faster than doing so manually. The customer estimates that using SageMaker makes model training three times faster and model deployment four times faster than manually managing the system.
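Both profiles credit SageMaker's parallelized training, which is driven by how the training job is configured rather than by user-written distribution code. As a rough illustration only, the sketch below assembles the kind of keyword arguments the SageMaker Python SDK's TensorFlow estimator accepts; the script name, instance type, and framework versions are hypothetical placeholders, not details from either customer deployment.

```python
# Sketch of a distributed-training configuration for the SageMaker Python SDK.
# All concrete values (script name, instance type, versions) are assumptions.

def build_training_config(instance_count, instance_type="ml.p3.2xlarge"):
    """Return keyword arguments for sagemaker.tensorflow.TensorFlow.

    Setting instance_count > 1 is what lets SageMaker spread the training
    load across multiple GPU nodes in parallel.
    """
    config = {
        "entry_point": "train.py",      # hypothetical training script
        "instance_count": instance_count,
        "instance_type": instance_type,
        "framework_version": "2.11",
        "py_version": "py39",
    }
    if instance_count > 1:
        # Enable TensorFlow parameter-server distribution across the nodes.
        config["distribution"] = {"parameter_server": {"enabled": True}}
    return config

# Launching the job would then look roughly like (requires AWS credentials):
# from sagemaker.tensorflow import TensorFlow
# estimator = TensorFlow(role="<execution-role-arn>", **build_training_config(4))
# estimator.fit({"training": "s3://<bucket>/training-data"})
```

The point of the sketch is that moving from one instance to several is a one-parameter change, which is consistent with the speedups both customers attribute to the managed service.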
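The game-recommendation profile leans on SageMaker's native integration with S3 and Lambda. A minimal sketch of how a Lambda function might package gameplay history for a deployed SageMaker inference endpoint follows; the endpoint name and JSON payload schema are illustrative assumptions, not the company's actual API.

```python
import json

def build_inference_request(user_id, recent_games, endpoint="game-recommender"):
    """Package gameplay history as arguments for invoke_endpoint.

    The endpoint name and payload schema are hypothetical.
    """
    return {
        "EndpointName": endpoint,
        "ContentType": "application/json",
        "Body": json.dumps({"user_id": user_id, "recent_games": recent_games}),
    }

# Inside an AWS Lambda handler, the call itself would use boto3
# (requires AWS credentials, so it is only sketched here):
# import boto3
# runtime = boto3.client("sagemaker-runtime")
# resp = runtime.invoke_endpoint(**build_inference_request(uid, games))
# recommendations = json.loads(resp["Body"].read())
```

Because Lambda, S3, and the endpoint all live inside the same AWS account, the existing IAM permissions the company had already formalized apply directly, which matches its rationale for staying on one platform.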
