Gartner Blog Network


Training versus Inference

by Paul DeBeasi  |  February 14, 2019  |  1 Comment

Few data-driven technologies provide greater opportunity to derive value from Internet of Things (IoT) initiatives than machine learning. The accelerating growth of data captured from sensors in IoT solutions, combined with the growth of machine learning capabilities, gives organizations an unparalleled opportunity to drive business value and create competitive advantage.

An important development in machine learning is the emergence of machine learning inference servers (also known as inference engines). A machine learning inference server executes the model algorithm and returns the inference output. Refer to my blog post for more information about machine learning inference servers.
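To make the request/response role of an inference server concrete, here is a minimal sketch assuming a JSON-over-HTTP contract. This is illustrative only: real inference servers differ in detail, and the endpoint path, payload shape, and stand-in `predict` function are all assumptions, not any specific product's API.

```python
# Minimal inference-server sketch: receive input over HTTP, execute the
# model, return the inference output as JSON.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

def predict(x):
    """Stand-in for a trained model: a fixed linear calibration."""
    return 2.0 * x + 1.0

class InferenceHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        # Read the request body and parse the input feature.
        length = int(self.headers["Content-Length"])
        payload = json.loads(self.rfile.read(length))
        # Execute the model and return the inference output.
        body = json.dumps({"prediction": predict(payload["x"])}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass  # silence per-request logging

# To run: HTTPServer(("0.0.0.0", 8080), InferenceHandler).serve_forever()
```

A client would POST `{"x": 3.0}` and receive `{"prediction": 7.0}`; the same contract applies whether the server sits on the device, at the edge gateway, or in the cloud.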

As the number of IoT endpoints proliferates, organizations will increasingly need to understand how to design systems that integrate machine learning inference with IoT. Because IoT solutions are distributed systems, a key design question is “Where should my organization deploy the machine learning inference server in the distributed IoT system?” Refer to my blog post for more information about the four options that form the foundation for creating a system design that integrates machine learning with IoT.

Machine Learning Training Versus Inference

However, before technical professionals can begin to design a system that integrates a machine learning inference server with IoT, they must understand how IoT data is used for machine learning model training versus inference. Refer to the figure below to compare training versus inference.

[Figure: Machine Learning Training versus Inference]

  • Training: Training refers to the process of creating a machine learning model. Training involves the use of a deep-learning framework (e.g., TensorFlow) and a training dataset (see the left-hand side of the figure). IoT data provides a source of training data that data scientists and engineers can use to train machine learning models for a variety of use cases, from failure detection to consumer intelligence.
  • Inference: Inference refers to the process of using a trained machine learning model to make a prediction. IoT data can be used as input to a trained machine learning model, enabling predictions that can guide decision logic on the device, at the edge gateway, or elsewhere in the IoT system (see the right-hand side of the figure).
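The two phases above can be sketched in a few lines of code. This is a deliberately tiny illustration in pure Python rather than a deep-learning framework like TensorFlow: the `train` and `infer` functions, the linear model, and the sensor-calibration scenario are all hypothetical, chosen only to show that training fits parameters from historical data while inference applies the frozen parameters to each new input.

```python
def train(samples, epochs=5000, lr=0.01):
    """Training: fit y = w*x + b by gradient descent on (x, y) pairs."""
    w, b = 0.0, 0.0
    n = len(samples)
    for _ in range(epochs):
        grad_w = grad_b = 0.0
        for x, y in samples:
            err = (w * x + b) - y
            grad_w += 2 * err * x / n
            grad_b += 2 * err / n
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b  # the "trained model"

def infer(model, x):
    """Inference: apply the trained parameters to a new input."""
    w, b = model
    return w * x + b

# Hypothetical IoT use case: calibrating a raw temperature-sensor value.
history = [(1.0, 3.1), (2.0, 4.9), (3.0, 7.2), (4.0, 8.8)]  # past (raw, true) pairs
model = train(history)       # training phase: offline, data- and compute-intensive
reading = infer(model, 2.5)  # inference phase: online, once per new sensor sample
```

The asymmetry is the design point: training runs occasionally over large historical datasets, while inference runs continuously on each new data point, which is why the two phases can be deployed in different places in a distributed IoT system.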

New Gartner Research

New research from Gartner helps technical professionals overcome the challenge of integrating machine learning with IoT. It analyzes four reference architectures and ML inference server technologies. IoT architects and data scientists can use this research to improve cross-domain collaboration, analyze ML integration trade-offs and accelerate system design. Each reference architecture can be used as the basis of a high-level design or can be combined to form a hybrid design.

You can view the 39-page research report here: Architecting Machine Learning With IoT.


Category: architecture  internet-of-things  iot  machine-learning  

Tags: inference  iot  machine-learning  training  

Paul DeBeasi
Research VP
13 years with Gartner
34 years in IT industry

Paul DeBeasi is a distinguished VP and Chief of Research for Gartner for Technical Professionals (GTP). Mr. DeBeasi's research focuses on machine learning and IoT technical architecture. He presents these topics at IT industry events, collaborates with technical professionals and advises executive management. Read Full Bio


Thoughts on Training versus Inference


  1. SaiAnudeep says:

    Hi ! Thanks for sharing this informative post




Comments or opinions expressed on this blog are those of the individual contributors only, and do not necessarily represent the views of Gartner, Inc. or its management. Readers may copy and redistribute blog postings on other blogs, or otherwise for private, non-commercial or journalistic purposes, with attribution to Gartner. This content may not be used for any other purposes in any other formats or media. The content on this blog is provided on an "as-is" basis. Gartner shall not be liable for any damages whatsoever arising out of the content or use of this blog.