
Machine Learning - definition & overview

In this article
What is machine learning?
How does machine learning work?
Machine learning vs. artificial intelligence
Machine learning algorithms explained
Implement machine learning for IT DevSecOps with Sumo Logic

What is machine learning?

Machine learning is a programming technique used to automate the construction of analytical models and enable applications to perform specified tasks without being explicitly programmed to do so. A machine learning system automatically learns from experience, improving its accuracy at a task as it processes more data.

Key takeaways

  • Machine learning applications are powerful because they can process large amounts of information and form complex and elaborate, yet accurate, models that would be difficult or impossible for a human operator to construct manually.
  • Machine learning algorithms vary in the types of problems they are effective at solving, their unique approach and the type of data input required to optimize their performance.
  • IT security professionals are increasingly relying on machine learning applications to facilitate real-time security monitoring of increasingly large and disparate IT infrastructure networks.
  • Sumo Logic uses machine learning and pattern recognition to analyze the millions of log files created by your technology stack, detect anomalies and outlier data, and report security issues in a timely fashion with fewer false positives.

How does machine learning work?

A machine learning application is made up of three major components:

Model - Machine learning automates the process of building analytical models. The model is the part of the system that takes inputs and generates predictions or identifications.

Parameters - If we want our analytical models to tell us anything interesting, we need data inputs for the model to process. In this framework, those data inputs are known as parameters: any type of signal, factor or data point that the model can use to make a decision. (Note that in the broader machine learning literature, the inputs are usually called features, while "parameters" more often refers to the model's adjustable internal values.)

Learner - Models and parameters are common to other areas of data science and predictive analytics, but it is the programmed learning capability that distinguishes machine learning applications. The learner is the part of the system that compares model-generated predictions to actual outcomes and adjusts the parameters, and by extension the model itself, based on the results.

The defining feature of the system is that the final predictive model was not explicitly designed or implemented by the software developers or programmers. Instead, the system "learned" how to improve the quality of its predictions by consuming actual data. Machine learning applications are powerful because they can process large amounts of information and form complex and elaborate, yet accurate, models that would be difficult or impossible for a human operator to construct manually.
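The model / parameters / learner loop described above can be sketched in a few lines. The example below fits a simple linear model with gradient descent; the data, learning rate and iteration count are all illustrative choices, not drawn from any real system.

```python
# Minimal sketch of the model / parameters / learner loop.
# The "model" predicts y = w * x + b; the learner compares predictions to
# actual outcomes and adjusts w and b (gradient descent on squared error).

# Training examples: inputs paired with actual outcomes (here, y = 2x + 1).
data = [(0.0, 1.0), (1.0, 3.0), (2.0, 5.0), (3.0, 7.0)]

w, b = 0.0, 0.0          # the model's adjustable values
learning_rate = 0.05

def predict(x):
    """The model: takes an input and generates a prediction."""
    return w * x + b

for step in range(2000):  # the learner: repeat compare-and-adjust
    grad_w = grad_b = 0.0
    for x, y_actual in data:
        error = predict(x) - y_actual
        grad_w += 2 * error * x
        grad_b += 2 * error
    w -= learning_rate * grad_w / len(data)
    b -= learning_rate * grad_b / len(data)

print(round(w, 2), round(b, 2))  # converges toward 2.0 and 1.0
```

No human specified the final values of w and b; the system arrived at them by repeatedly comparing its own predictions to actual outcomes.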

Machine learning vs. artificial intelligence

Despite advancements in computing hardware and software over the past several decades, the human mind is still the most powerful and complex technology on Earth. Machine learning and related technologies represent our best attempts at creating programs whose internal processes approximate what our own minds can do. While these technologies share some features, it is worth clarifying the differences between machine learning, deep learning and artificial intelligence.

Artificial intelligence is a broad science that includes any technological attempt to replicate or simulate human intelligence. That includes scripted chatbots and software applications that use logic and decision trees, as well as machine learning and deep learning applications. Artificial intelligence tools vary significantly in their technological underpinnings. They may use sophisticated technologies like machine learning, but they may also use basic logic trees with a narrow, pre-defined decision process and no element of learning.

Machine learning is an application of artificial intelligence that simulates the learning process by establishing a predictive model, analyzing its output given specified parameters, and progressively updating the model to increase its accuracy.

Deep learning applications process information using artificial neural networks, highly interconnected computer systems designed to mimic the biological structure of the human brain. Neural networks allow deep learning applications to process massive amounts of data in less time than conventional machine learning approaches. Complex functions like speech, handwriting and image recognition have benefited from the application of deep learning.

Machine learning algorithms explained

In the context of machine learning, there are several types of algorithms that are commonly implemented. Machine learning algorithms vary in the types of problems they are effective at solving, their unique approach and the type of data input required to optimize their performance. The three most common types of algorithms are supervised learning, unsupervised learning and reinforcement learning.

Supervised learning algorithms

Supervised learning algorithms are characterized by the use of training data, a set of training examples that each contains several inputs and the desired output. As the training data are processed by the machine learning algorithm, a function or model is optimized that can be used to predict the output for inputs that were not present in the initial training data. Classification, regression and similarity learning are three types of supervised learning algorithms.
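A nearest-neighbor classifier is one of the simplest illustrations of supervised learning: every training example pairs inputs with the desired output, and the fitted model predicts labels for inputs it has never seen. The toy data and labels below are invented purely for illustration.

```python
# A minimal supervised-learning sketch: a 1-nearest-neighbor classifier.
# Training data pairs inputs (feature vectors) with desired outputs (labels).

def euclidean(a, b):
    """Distance between two feature vectors."""
    return sum((ai - bi) ** 2 for ai, bi in zip(a, b)) ** 0.5

# Training examples: (input features, desired output)
training_data = [
    ((1.0, 1.0), "low"),
    ((1.2, 0.8), "low"),
    ((8.0, 9.0), "high"),
    ((9.0, 8.5), "high"),
]

def predict(x):
    """Classify x by the label of its nearest training example."""
    nearest = min(training_data, key=lambda ex: euclidean(ex[0], x))
    return nearest[1]

print(predict((1.1, 0.9)))  # "low"  - close to the first cluster
print(predict((8.5, 8.8)))  # "high" - close to the second cluster
```

The inputs (1.1, 0.9) and (8.5, 8.8) were not present in the training data, yet the model classifies them correctly by generalizing from the labeled examples.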

Unsupervised learning algorithms

While supervised learning algorithms refine a predictive model based on data with inputs and outputs, unsupervised learning algorithms are used to find structures, patterns and relationships in cases where no training data is present. Unsupervised algorithms such as grouping or clustering of data points are useful for analyzing data that has not yet been categorized or labeled.
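Clustering can be sketched with a bare-bones k-means loop: the algorithm receives only unlabeled points and discovers the grouping structure on its own. The data and the choice of two clusters are illustrative assumptions.

```python
# A minimal unsupervised-learning sketch: k-means clustering on unlabeled
# 1-D points. No desired outputs are given; the algorithm finds the groups.

points = [1.0, 1.5, 0.8, 9.0, 9.5, 10.2]
centroids = [points[0], points[3]]  # naive initialization: two seed points

for _ in range(10):  # alternate assignment and centroid update
    clusters = [[], []]
    for p in points:
        # Assign each point to its nearest centroid
        nearest = min(range(2), key=lambda i: abs(p - centroids[i]))
        clusters[nearest].append(p)
    # Move each centroid to the mean of its assigned points
    centroids = [sum(c) / len(c) for c in clusters]

print([round(c, 2) for c in centroids])  # roughly [1.1, 9.57]
```

The two centroids settle near the centers of the two natural groups in the data, even though no example was ever labeled.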

Reinforcement learning algorithms

Reinforcement learning algorithms are quite different from supervised and unsupervised learning. They specify an environment in which a software agent has a defined goal or objective, and they optimize a function or model that helps the agent reach that objective. In the context of an application like Google DeepMind's AlphaZero, the environment could be a chess game and the objective could be winning the game.
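On a much smaller scale than a chess engine, the idea can be sketched with tabular Q-learning in a toy corridor environment (states 0 through 4, goal at state 4). The environment, reward scheme and hyperparameters below are all invented for illustration.

```python
# A minimal reinforcement-learning sketch: tabular Q-learning.
# The agent learns action values that steer it toward the goal state.
import random

random.seed(0)                           # deterministic for this example
n_states, actions = 5, [-1, +1]          # move left or right in a corridor
q = {(s, a): 0.0 for s in range(n_states) for a in actions}
alpha, gamma, epsilon = 0.5, 0.9, 0.2    # learning rate, discount, exploration

for episode in range(200):
    s = 0
    while s != n_states - 1:             # until the goal is reached
        # epsilon-greedy choice: mostly exploit, sometimes explore
        if random.random() < epsilon:
            a = random.choice(actions)
        else:
            a = max(actions, key=lambda a: q[(s, a)])
        s_next = min(max(s + a, 0), n_states - 1)
        reward = 1.0 if s_next == n_states - 1 else 0.0
        # Q-learning update: nudge the estimate toward
        # reward + discounted best value of the next state
        best_next = max(q[(s_next, b)] for b in actions)
        q[(s, a)] += alpha * (reward + gamma * best_next - q[(s, a)])
        s = s_next

# After training, the greedy policy moves right from every non-goal state.
policy = [max(actions, key=lambda a: q[(s, a)]) for s in range(n_states - 1)]
print(policy)
```

Note that the agent was never told which moves are good; it discovered the rightward policy purely from the reward signal at the goal.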

Implement machine learning for IT DevSecOps with Sumo Logic

Today's machine learning software applications are becoming increasingly sophisticated in their approach and in the ways they can be integrated with other functionality. Machine learning is also frequently seen as a value-added feature for software products across industry verticals, including finance, healthcare and IT. IT security professionals are increasingly relying on machine learning applications to facilitate real-time security monitoring of ever larger and more disparate IT infrastructure networks.

The implementation of machine learning technology to support the security management of cloud services can reduce manual workloads for your team and streamline your incident response process. Sumo Logic uses machine learning and pattern recognition to analyze the millions of log files created by your technology stack, detect anomalies and outlier data, and report security issues in a timely fashion with fewer false positives.
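To make the anomaly-detection idea concrete, the toy sketch below flags outliers in log-event counts with a simple z-score threshold. This is purely illustrative of the general concept and is not Sumo Logic's actual method; the counts are invented.

```python
# Illustrative only: flagging outliers in per-minute log-event counts
# using a simple z-score (distance from the mean in standard deviations).

counts = [102, 98, 110, 95, 105, 99, 480, 101]  # events per minute (invented)

mean = sum(counts) / len(counts)
variance = sum((c - mean) ** 2 for c in counts) / len(counts)
std = variance ** 0.5

# Flag counts more than two standard deviations from the mean
anomalies = [c for c in counts if abs(c - mean) > 2 * std]
print(anomalies)  # the 480 spike stands out
```

Production systems refine this idea considerably (seasonality, baselining, pattern recognition across many signals), which is what reduces false positives compared with a naive fixed threshold.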
