Realize Superior Business Outcomes, Developer Efficiency, and Accelerated Innovation
with High-Performance, Cost-Efficient, and Easy-to-Use ML Infrastructure
IDC White Paper, sponsored by Amazon Web Services
October 2021 | Doc. #US48194621
Customers are increasingly adopting cloud platforms to leverage scale, agility, and a broad choice of infrastructure and cloud services. IDC studies* confirm this trend: about 55% of respondents report that their AI/ML applications and solutions are deployed on the public cloud. Customers are also increasingly turning to hardware accelerators, with more than 55% of respondents using them for most of their AI/ML needs. Containerized deployment has become the standard method of deploying ML models, owing to the scale, consistency, and portability containers enable across platforms throughout the ML life cycle, from experimentation to production.
This white paper discusses how AWS enables customers to accelerate their AI/ML innovation by providing a breadth of price-performance-optimized infrastructure choices for machine learning (ML) and deep learning (DL) training and inference. It also provides an overview of the Amazon EC2 DL1 instances, built from the ground up for deep learning training, and offers recommendations for selecting the right AWS ML infrastructure and services for a given use case.
Sidebar: Increasing Adoption of Cloud Platforms among Customers — Nearly 55% of respondents say their AI/ML applications/solutions are deployed on the public cloud; more than 55% of respondents indicate using hardware accelerators for most of their AI/ML needs.
* AI StrategiesView 2021