
Configure Model for Incremental Anomaly Detection

An incremental learning model object fully specifies how functions implement incremental fitting and anomaly detection. To configure (or prepare) an incremental learning model, create one by calling the object directly, or by converting a traditionally trained model to an incremental learner model object. The following table lists the available model types, model objects for incremental anomaly detection, and conversion functions.

Model Type                             | Model Object for Incremental Anomaly Detection | Conversion Function
---------------------------------------|------------------------------------------------|--------------------
Robust random cut forest               | incrementalRobustRandomCutForest               | incrementalLearner converts a robust random cut forest model (RobustRandomCutForest).
One-class support vector machine (SVM) | incrementalOneClassSVM                         | incrementalLearner converts a one-class SVM model (OneClassSVM).

The approach you use to create an incremental model depends on the information you have and your preferences.

  • Call object — Create an incremental model to your specifications by calling the object directly. This approach is flexible, enabling you to specify most options to suit your preferences, and the resulting model provides reasonable default values. For more details, see Call Object Directly.

  • Convert model — Convert a traditionally trained model to an incremental learner to initialize a model for incremental learning by using the incrementalLearner function. The function passes information that the traditionally trained model learned from the data.

    When you use incrementalLearner, you can specify anomaly detection options and only those training, model, and data options that are unknown during conversion. For more details, see Convert Traditionally Trained Model.

Call Object Directly

Unlike when working with other machine learning model objects, you can create an incremental anomaly detection model by calling the corresponding object directly, with little knowledge about the data. For example, the following code creates a default incremental model for anomaly detection using the robust random cut forest algorithm, and a one-class SVM incremental anomaly detection model for a data stream containing 5 predictors.

MdlRRCF = incrementalRobustRandomCutForest;
MdlOCSVM = incrementalOneClassSVM(NumPredictors=5);

If you have information about the data to specify, or you want to configure model options or anomaly detection settings, use name-value argument syntax when you call the object. (All model properties are read-only; you cannot modify them using dot notation.) For example, the following pseudocode creates a robust random cut forest model for incremental anomaly detection, specifies that the first, second, and fourth predictors are categorical, and sets the score warm-up period to 500 observations.

Mdl = incrementalRobustRandomCutForest(CategoricalPredictors=[1 2 4], ...
    ScoreWarmupPeriod=500);

For more details on all options, see the Properties section of each incremental model object page.
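After you configure a model, you can update it and detect anomalies as data arrives. As a sketch (assuming X is a numeric matrix of streamed observations, with a chunk size of 100 chosen only for illustration), a typical loop calls the isanomaly function on each incoming chunk and then calls the fit function to update the model with that chunk.

Mdl = incrementalRobustRandomCutForest;
chunkSize = 100;
numChunks = floor(size(X,1)/chunkSize);
for j = 1:numChunks
    idx = (chunkSize*(j-1)+1):(chunkSize*j);
    [tf,scores] = isanomaly(Mdl,X(idx,:)); % detect anomalies with the current model
    Mdl = fit(Mdl,X(idx,:));               % then update the model with the chunk
end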

Convert Traditionally Trained Model

incrementalLearner enables you to initialize an incremental anomaly detection model using information learned from a traditionally trained model. The converted model can calculate scores and identify anomalies immediately. The converted model is also warm, which means that a score warm-up period is not required before incremental anomaly detection.

To convert a traditionally trained model to an incremental learner, pass the model and any options specified by name-value arguments to incrementalLearner. For example, the following pseudocode initializes an incremental one-class SVM model by using all information that a one-class SVM model for anomaly detection learned from a batch of data.

Mdl = ocsvm(X);
IncrementalMdl = incrementalLearner(Mdl,Name=Value);

IncrementalMdl is an incremental learning model object configured for the same machine learning objective, anomaly detection.

Ease of incremental model creation and initialization is offset by decreased flexibility. The software assumes that fitted parameters, hyperparameter values, and data characteristics learned during traditional training are appropriate for incremental learning. Therefore, you cannot set corresponding learned or tuned options when you call incrementalLearner.

This table lists notable read-only properties of IncrementalMdl that the incrementalLearner function transfers from Mdl or infers from other values. For more details, see the output argument descriptions on each incrementalLearner function page.

Model Type               | Property                  | Description
-------------------------|---------------------------|------------------------------------------------------
All                      | ContaminationFraction     | Fraction of anomalies in training data
All                      | Mu                        | Predictor variable means
All                      | PredictorNames            | Predictor variable names
All                      | ScoreThreshold            | Threshold score for anomalies
All                      | Sigma                     | Predictor variable standard deviations
Robust random cut forest | CategoricalPredictors     | Indices of categorical predictors
Robust random cut forest | NumLearners               | Number of robust random cut trees
Robust random cut forest | NumObservationsPerLearner | Number of observations for each robust random cut tree
One-class SVM            | KernelScale               | Kernel scale parameter for random feature expansion
One-class SVM            | Lambda                    | Ridge (L2) regularization term strength
One-class SVM            | NumExpansionDimensions    | Number of dimensions of the expanded space
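For example, the following sketch (assuming X is a numeric matrix of training observations) traditionally trains a robust random cut forest model by using the rrcforest function, converts it, and displays two of the transferred properties.

Mdl = rrcforest(X);
IncrementalMdl = incrementalLearner(Mdl);
IncrementalMdl.ScoreThreshold  % threshold score for anomalies, transferred from Mdl
IncrementalMdl.NumLearners     % number of robust random cut trees, transferred from Mdl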


If you specify StandardizeData=true when you traditionally train Mdl, IncrementalMdl is configured to standardize predictors during incremental learning by default.
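For example, the following sketch (assuming X is a numeric matrix) trains a one-class SVM model on standardized data and converts it. incrementalLearner transfers the learned predictor means (Mu) and standard deviations (Sigma), and the incremental model uses them to standardize incoming predictor data.

Mdl = ocsvm(X,StandardizeData=true);
IncrementalMdl = incrementalLearner(Mdl); % standardizes predictors during incremental
                                          % learning using the transferred Mu and Sigma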

The following conditions apply to the one-class SVM model only:

  • The incremental fitting function supports ridge (L2) regularization only.

  • If you solve the objective function by using standard or average SGD ("sgd" or "asgd" for the Solver name-value argument), these conditions apply when you call incrementalLearner:

    • incrementalLearner transfers the solver used to optimize Mdl to IncrementalMdl.

    • You can specify the adaptive scale-invariant solver "scale-invariant" instead, but you cannot specify a different SGD solver.

    • If you do not specify the adaptive scale-invariant solver, incrementalLearner transfers model and solver hyperparameter values to the incremental model object, such as the learning rate LearnRate, mini-batch size BatchSize, and ridge penalty Lambda. You cannot modify the transferred properties.
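For example, the following sketch (assuming X is a numeric matrix, and assuming the Solver name-value argument for traditional training as described above) trains a one-class SVM model by using standard SGD, and then specifies the adaptive scale-invariant solver during conversion. Specifying any other SGD solver during conversion is not supported.

Mdl = ocsvm(X,Solver="sgd");
IncrementalMdl = incrementalLearner(Mdl,Solver="scale-invariant");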


