updateMetricsAndFit
Update performance metrics in naive Bayes incremental learning classification model given new data and train model
Since R2021a
Description
Given streaming data, updateMetricsAndFit first evaluates the performance of a configured naive Bayes classification model for incremental learning (incrementalClassificationNaiveBayes object) by calling updateMetrics on incoming data. Then updateMetricsAndFit fits the model to that data by calling fit. In other words, updateMetricsAndFit performs prequential evaluation because it treats each incoming chunk of data as a test set, and tracks performance metrics measured cumulatively and over a specified window [1].

updateMetricsAndFit provides a simple way to update model performance metrics and train the model on each chunk of data. Alternatively, you can perform the operations separately by calling updateMetrics and then fit, which allows for more flexibility (for example, you can decide whether you need to train the model based on its performance on a chunk of data).
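For example, a minimal sketch of the separate-call pattern might look like the following, where the chunk variables Xchunk and Ychunk and the 0.1 error threshold are illustrative assumptions, not part of the function interface:

% Evaluate performance first, then decide whether to train on the chunk
% (Xchunk, Ychunk, and the 0.1 threshold are hypothetical)
Mdl = updateMetrics(Mdl,Xchunk,Ychunk);
if ~Mdl.IsWarm || Mdl.Metrics{"MinimalCost","Window"} > 0.1
    Mdl = fit(Mdl,Xchunk,Ychunk);   % train only when the model is cold or underperforming
end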
Mdl = updateMetricsAndFit(Mdl,X,Y) returns a naive Bayes classification model for incremental learning Mdl, which is the input naive Bayes classification model for incremental learning Mdl with the following modifications:
- updateMetricsAndFit measures the model performance on the incoming predictor and response data, X and Y respectively. When the input model is warm (Mdl.IsWarm is true), updateMetricsAndFit overwrites previously computed metrics, stored in the Metrics property, with the new values. Otherwise, updateMetricsAndFit stores NaN values in Metrics instead.
- updateMetricsAndFit fits the modified model to the incoming data by updating the conditional posterior mean and standard deviation of each predictor variable, given the class, and stores the new estimates, among other configurations, in the output model Mdl.
Examples
Update Performance Metrics and Train Model on Data Stream
Create a naive Bayes classification model for incremental learning by calling incrementalClassificationNaiveBayes and specifying a maximum of 5 expected classes in the data.
Mdl = incrementalClassificationNaiveBayes('MaxNumClasses',5)
Mdl = 
  incrementalClassificationNaiveBayes

                    IsWarm: 0
                   Metrics: [1x2 table]
                ClassNames: [1x0 double]
            ScoreTransform: 'none'
         DistributionNames: 'normal'
    DistributionParameters: {}
Mdl is an incrementalClassificationNaiveBayes model object. All its properties are read-only.

Mdl must be fit to data before you can use it to perform any other operations.
Load the human activity data set. Randomly shuffle the data.
load humanactivity
n = numel(actid);
rng(1) % For reproducibility
idx = randsample(n,n);
X = feat(idx,:);
Y = actid(idx);
For details on the data set, enter Description at the command line.
Implement incremental learning by performing the following actions at each iteration:
Simulate a data stream by processing a chunk of 50 observations.
Overwrite the previous incremental model with a new one fitted to the incoming observations.
Store the conditional mean of the first predictor in the first class, μ11, the cumulative metrics, and the window metrics to see how they evolve during incremental learning.
% Preallocation
numObsPerChunk = 50;
nchunk = floor(n/numObsPerChunk);
mc = array2table(zeros(nchunk,2),'VariableNames',["Cumulative" "Window"]);
mu11 = zeros(nchunk,1);

% Incremental fitting
for j = 1:nchunk
    ibegin = min(n,numObsPerChunk*(j-1) + 1);
    iend = min(n,numObsPerChunk*j);
    idx = ibegin:iend;
    Mdl = updateMetricsAndFit(Mdl,X(idx,:),Y(idx));
    mc{j,:} = Mdl.Metrics{"MinimalCost",:};
    mu11(j + 1) = Mdl.DistributionParameters{1,1}(1);
end
Now, Mdl is an incrementalClassificationNaiveBayes model object trained on all the data in the stream. During incremental learning and after the model is warmed up, updateMetricsAndFit checks the performance of the model on the incoming observations, and then fits the model to those observations.
To see how the performance metrics and μ11 evolve during training, plot them on separate tiles.
t = tiledlayout(2,1);
nexttile
plot(mu11)
ylabel('\mu_{11}')
xlim([0 nchunk])
nexttile
h = plot(mc.Variables);
xlim([0 nchunk])
ylabel('Minimal Cost')
xline(Mdl.MetricsWarmupPeriod/numObsPerChunk,'r-.')
legend(h,mc.Properties.VariableNames)
xlabel(t,'Iteration')
The plot indicates that updateMetricsAndFit performs the following actions:
Fit the model during all incremental learning iterations.
Compute the performance metrics after the metrics warm-up period only.
Compute the cumulative metrics during each iteration.
Compute the window metrics after processing 200 observations (4 iterations).
Specify Observation Weights
Train a naive Bayes classification model by using fitcnb, convert it to an incremental learner, track its performance on streaming data, and fit it to the data in one call. Specify observation weights.
Load and Preprocess Data
Load the human activity data set. Randomly shuffle the data.
load humanactivity
rng(1) % For reproducibility
n = numel(actid);
idx = randsample(n,n);
X = feat(idx,:);
Y = actid(idx);
For details on the data set, enter Description at the command line.
Suppose that the data from a stationary subject (Y <= 2) has double the quality of data from a moving subject. Create a weight variable that assigns a weight of 2 to observations from a stationary subject and 1 to a moving subject.
W = ones(n,1) + (Y <= 2);
Train Naive Bayes Classification Model
Fit a naive Bayes classification model to a random sample of half the data.
idxtt = randsample([true false],n,true);
TTMdl = fitcnb(X(idxtt,:),Y(idxtt),'Weights',W(idxtt))
TTMdl = 
  ClassificationNaiveBayes
              ResponseName: 'Y'
     CategoricalPredictors: []
                ClassNames: [1 2 3 4 5]
            ScoreTransform: 'none'
           NumObservations: 12053
         DistributionNames: {1x60 cell}
    DistributionParameters: {5x60 cell}
TTMdl is a ClassificationNaiveBayes model object representing a traditionally trained naive Bayes classification model.
Convert Trained Model
Convert the traditionally trained model to a naive Bayes classification model for incremental learning. Specify tracking the misclassification error rate during incremental learning.
IncrementalMdl = incrementalLearner(TTMdl,'Metrics',"classiferror")
IncrementalMdl = 
  incrementalClassificationNaiveBayes

                    IsWarm: 1
                   Metrics: [2x2 table]
                ClassNames: [1 2 3 4 5]
            ScoreTransform: 'none'
         DistributionNames: {1x60 cell}
    DistributionParameters: {5x60 cell}
IncrementalMdl is an incrementalClassificationNaiveBayes model. Because class names are specified in IncrementalMdl.ClassNames, labels encountered during incremental learning must be in IncrementalMdl.ClassNames.
Track Performance Metrics and Fit Model
Perform incremental learning on the rest of the data by using the updateMetricsAndFit function. At each iteration:
- Simulate a data stream by processing 50 observations at a time.
- Call updateMetricsAndFit to update the cumulative and window performance metrics of the model given the incoming chunk of observations, and then fit the model to the data. Overwrite the previous incremental model with a new one. Specify the observation weights.
- Store the misclassification error rate.
% Preallocation
idxil = ~idxtt;
nil = sum(idxil);
numObsPerChunk = 50;
nchunk = floor(nil/numObsPerChunk);
mc = array2table(zeros(nchunk,2),'VariableNames',["Cumulative" "Window"]);
Xil = X(idxil,:);
Yil = Y(idxil);
Wil = W(idxil);

% Incremental fitting
for j = 1:nchunk
    ibegin = min(nil,numObsPerChunk*(j-1) + 1);
    iend = min(nil,numObsPerChunk*j);
    idx = ibegin:iend;
    IncrementalMdl = updateMetricsAndFit(IncrementalMdl,Xil(idx,:),Yil(idx),...
        'Weights',Wil(idx));
    mc{j,:} = IncrementalMdl.Metrics{"ClassificationError",:};
end
Now, IncrementalMdl is an incrementalClassificationNaiveBayes model object trained on all the data in the stream.
Create a trace plot of the misclassification error rate.
h = plot(mc.Variables);
xlim([0 nchunk])
ylabel('Classification Error')
legend(h,mc.Properties.VariableNames)
xlabel('Iteration')
The cumulative loss initially jumps, but stabilizes around 0.05, whereas the window loss jumps throughout the training.
Input Arguments
Mdl — Naive Bayes classification model for incremental learning
incrementalClassificationNaiveBayes model object
Naive Bayes classification model for incremental learning to measure the performance of and then to fit to data, specified as an incrementalClassificationNaiveBayes model object. You can create Mdl directly or by converting a supported, traditionally trained machine learning model using the incrementalLearner function. For more details, see the corresponding reference page.

If Mdl.IsWarm is false, updateMetricsAndFit does not track the performance of the model. For more details, see Performance Metrics.
X — Chunk of predictor data
floating-point matrix
Chunk of predictor data to measure the model performance with and then fit the model to, specified as an n-by-Mdl.NumPredictors floating-point matrix.

The length of the observation labels Y and the number of observations in X must be equal; Y(j) is the label of observation j (row) in X.
Note
If Mdl.NumPredictors = 0, updateMetricsAndFit infers the number of predictors from X, and sets the corresponding property of the output model. Otherwise, if the number of predictor variables in the streaming data changes from Mdl.NumPredictors, updateMetricsAndFit issues an error.
Data Types: single | double
Y — Chunk of labels
categorical array | character array | string array | logical vector | floating-point vector | cell array of character vectors
Chunk of labels to measure the model performance with and then fit the model to, specified as a categorical, character, or string array; logical or floating-point vector; or cell array of character vectors.
The length of the observation labels Y and the number of observations in X must be equal; Y(j) is the label of observation j (row) in X.
updateMetricsAndFit issues an error when one or both of these conditions are met:
- Y contains a new label and the maximum number of classes has already been reached (see the MaxNumClasses and ClassNames arguments of incrementalClassificationNaiveBayes).
- The ClassNames property of the input model Mdl is nonempty, and the data types of Y and Mdl.ClassNames are different.
Data Types: char | string | cell | categorical | logical | single | double
Weights — Chunk of observation weights
floating-point vector of positive values
Chunk of observation weights, specified as a floating-point vector of positive values. updateMetricsAndFit weighs the observations in X with the corresponding values in Weights. The size of Weights must equal n, the number of observations in X.

By default, Weights is ones(n,1).
For more details, including normalization schemes, see Observation Weights.
Data Types: double | single
Note
If an observation (predictor or label) or weight contains at least one missing (NaN) value, updateMetricsAndFit ignores the observation. Consequently, updateMetricsAndFit uses fewer than n observations to compute the model performance and create an updated model, where n is the number of observations in X.
Output Arguments
Mdl — Updated naive Bayes classification model for incremental learning
incrementalClassificationNaiveBayes model object
Updated naive Bayes classification model for incremental learning, returned as an incremental learning model object of the same data type as the input model Mdl, incrementalClassificationNaiveBayes.
If the model is not warm, updateMetricsAndFit does not compute performance metrics. As a result, the Metrics property of Mdl remains completely composed of NaN values.
If the model is warm, updateMetricsAndFit computes the cumulative and window performance metrics on the new data X and Y, and overwrites the corresponding elements of Mdl.Metrics. For more details, see Performance Metrics.
In addition to updating distribution model parameters, updateMetricsAndFit performs the following actions when Y contains expected, but unprocessed, classes:

- If you do not specify all expected classes by using the ClassNames name-value argument when you create the input model Mdl using incrementalClassificationNaiveBayes, updateMetricsAndFit:
  - Appends any new labels in Y to the tail of Mdl.ClassNames.
  - Expands Mdl.Cost to a c-by-c matrix, where c is the number of classes in Mdl.ClassNames. The resulting misclassification cost matrix is balanced.
  - Expands Mdl.Prior to a length c vector of an updated empirical class distribution.
- If you specify all expected classes when you create the input model Mdl or convert a traditionally trained naive Bayes model using incrementalLearner, but you do not specify a misclassification cost matrix (Mdl.Cost), updateMetricsAndFit sets misclassification costs of processed classes to 1 and unprocessed classes to NaN. For example, if updateMetricsAndFit processes the first two classes of a possible three classes, Mdl.Cost is [0 1 NaN; 1 0 NaN; NaN NaN 0].
More About
Bag-of-Tokens Model
In the bag-of-tokens model, the value of predictor j is the nonnegative number of occurrences of token j in the observation. The number of categories (bins) in the multinomial model is the number of distinct tokens (number of predictors).
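As an illustration (the token counts, labels, and variable names below are hypothetical), a small bag-of-tokens data set and a multinomial incremental model might be set up as follows:

% Hypothetical bag-of-tokens data: rows are observations, columns are tokens,
% and each entry is the count of that token in the observation
Xcounts = [2 0 5; 0 1 3; 4 2 0];
Ytokens = [1; 2; 1];
MdlMN = incrementalClassificationNaiveBayes('DistributionNames','mn', ...
    'MaxNumClasses',2);
MdlMN = fit(MdlMN,Xcounts,Ytokens);   % fit the multinomial model to the counts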
Algorithms
Normal Distribution Estimators
If predictor variable j has a conditional normal distribution (see the DistributionNames property), the software fits the distribution to the data by computing the class-specific weighted mean and the biased (maximum likelihood) estimate of the weighted standard deviation. For each class k:
- The weighted mean of predictor j is

  $\bar{x}_{j|k} = \dfrac{\sum_{\{i:\,y_i = k\}} w_i x_{ij}}{\sum_{\{i:\,y_i = k\}} w_i},$

  where $w_i$ is the weight for observation i. The software normalizes weights within a class such that they sum to the prior probability for that class.
- The biased (maximum likelihood) estimate of the weighted standard deviation of predictor j is

  $s_{j|k} = \left[\dfrac{\sum_{\{i:\,y_i = k\}} w_i \left(x_{ij} - \bar{x}_{j|k}\right)^2}{\sum_{\{i:\,y_i = k\}} w_i}\right]^{1/2}.$
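The following sketch computes these class-specific estimates directly for one predictor; it is an illustration of the formulas above, not the library implementation, and the data and weights are hypothetical:

x = [1.2; 0.7; 1.9];          % hypothetical predictor values within one class
w = [0.2; 0.5; 0.3];          % hypothetical normalized observation weights
xbar = sum(w.*x)/sum(w);      % weighted class mean
s = sqrt(sum(w.*(x - xbar).^2)/sum(w));   % biased (maximum likelihood) weighted std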
Estimated Probability for Multinomial Distribution
If all predictor variables compose a conditional multinomial distribution (see the DistributionNames property), the software fits the distribution using the Bag-of-Tokens Model. The software stores the probability that token j appears in class k in the property DistributionParameters{k,j}. With additive smoothing [2], the estimated probability is

$P(\text{token } j \mid \text{class } k) = \dfrac{1 + c_{j|k}}{P + c_k},$

where:

- $c_{j|k} = n_k \dfrac{\sum_{\{i:\,y_i = k\}} x_{ij} w_i}{\sum_{\{i:\,y_i = k\}} w_i}$, which is the weighted number of occurrences of token j in class k.
- $n_k$ is the number of observations in class k.
- $w_i$ is the weight for observation i. The software normalizes weights within a class so that they sum to the prior probability for that class.
- $c_k = \sum_{j=1}^{P} c_{j|k}$, which is the total weighted number of occurrences of all tokens in class k.
- $P$ is the number of tokens (predictors).
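A short sketch of this additive-smoothing estimate for one class, consistent with the formula above and using hypothetical counts and weights, might look like this:

% Hypothetical token counts for observations in one class (rows = observations,
% columns = tokens) and their normalized weights
Xk = [2 0 5; 0 1 3];
wk = [0.6; 0.4];
nk = size(Xk,1);                % number of observations in the class
cjk = nk*(wk'*Xk)/sum(wk);      % weighted number of occurrences of each token
ck = sum(cjk);                  % total weighted occurrences of all tokens
P = size(Xk,2);                 % number of tokens (predictors)
ptoken = (1 + cjk)./(P + ck);   % smoothed probability of each token in the class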
Estimated Probability for Multivariate Multinomial Distribution
If predictor variable j has a conditional multivariate multinomial distribution (see the DistributionNames property), the software follows this procedure:

1. The software collects a list of the unique levels, stores the sorted list in CategoricalLevels, and considers each level a bin. Each combination of predictor and class is a separate, independent multinomial random variable.
2. For each class k, the software counts instances of each categorical level using the list stored in CategoricalLevels{j}.
3. The software stores the probability that predictor j in class k has level L in the property DistributionParameters{k,j}, for all levels in CategoricalLevels{j}. With additive smoothing [2], the estimated probability is

   $P(\text{predictor } j = L \mid \text{class } k) = \dfrac{1 + m_{j|k}(L)}{m_j + m_k},$

   where:

   - $m_{j|k}(L) = n_k \dfrac{\sum_{\{i:\,y_i = k\}} I\{x_{ij} = L\}\, w_i}{\sum_{\{i:\,y_i = k\}} w_i}$, which is the weighted number of observations for which predictor j equals L in class k.
   - $n_k$ is the number of observations in class k.
   - $I\{x_{ij} = L\} = 1$ if $x_{ij} = L$, and 0 otherwise.
   - $w_i$ is the weight for observation i. The software normalizes weights within a class so that they sum to the prior probability for that class.
   - $m_j$ is the number of distinct levels in predictor j.
   - $m_k$ is the weighted number of observations in class k.
Performance Metrics
updateMetricsAndFit tracks model performance metrics, specified by the row labels of the table in Mdl.Metrics, from new data only when the incremental model is warm (IsWarm property is true).

- If you create an incremental model by using incrementalLearner and MetricsWarmupPeriod is 0 (the default for incrementalLearner), the model is warm at creation.
- Otherwise, an incremental model becomes warm after an incremental fitting function, such as updateMetricsAndFit, performs both of these actions:
  - Fit the incremental model to Mdl.MetricsWarmupPeriod observations, which is the metrics warm-up period.
  - Fit the incremental model to all expected classes (see the MaxNumClasses and ClassNames arguments of incrementalClassificationNaiveBayes).
Mdl.Metrics stores two forms of each performance metric as variables (columns) of a table, Cumulative and Window, with individual metrics in rows. When the incremental model is warm, updateMetricsAndFit updates the metrics at the following frequencies:

- Cumulative — The function computes cumulative metrics since the start of model performance tracking. The function updates metrics every time you call the function and bases the calculation on the entire supplied data set.
- Window — The function computes metrics based on all observations within a window determined by the Mdl.MetricsWindowSize property. Mdl.MetricsWindowSize also determines the frequency at which the software updates Window metrics. For example, if Mdl.MetricsWindowSize is 20, the function computes metrics based on the last 20 observations in the supplied data (X((end – 20 + 1):end,:) and Y((end – 20 + 1):end)).

Incremental functions that track performance metrics within a window use the following process:
1. Store a buffer of length Mdl.MetricsWindowSize for each specified metric, and store a buffer of observation weights.
2. Populate elements of the metrics buffer with the model performance based on batches of incoming observations, and store corresponding observation weights in the weights buffer.
3. When the buffer is full, overwrite Mdl.Metrics.Window with the weighted average performance in the metrics window. If the buffer overfills when the function processes a batch of observations, the latest incoming Mdl.MetricsWindowSize observations enter the buffer, and the earliest observations are removed from the buffer. For example, suppose Mdl.MetricsWindowSize is 20, the metrics buffer has 10 values from a previously processed batch, and 15 values are incoming. To compose the length 20 window, the function uses the measurements from the 15 incoming observations and the latest 5 measurements from the previous batch.
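A rough sketch of this windowing behavior for a single metric, using hypothetical buffer variables and random stand-in measurements rather than the internal implementation, is shown below:

windowSize = 20;                              % stands in for Mdl.MetricsWindowSize
buf = rand(10,1); wbuf = ones(10,1);          % hypothetical leftover measurements and weights
newVals = rand(15,1); newWts = ones(15,1);    % hypothetical measurements from the incoming batch
buf = [buf; newVals];                         % append the incoming measurements
wbuf = [wbuf; newWts];
if numel(buf) >= windowSize
    buf = buf(end-windowSize+1:end);          % keep only the latest windowSize measurements
    wbuf = wbuf(end-windowSize+1:end);
    windowMetric = sum(wbuf.*buf)/sum(wbuf)   % weighted average over the metrics window
end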
The software omits an observation with a NaN score when computing the Cumulative and Window performance metric values.
Observation Weights
For each conditional predictor distribution, updateMetricsAndFit computes the weighted average and standard deviation.

If the prior class probability distribution is known (in other words, the prior distribution is not empirical), updateMetricsAndFit normalizes observation weights to sum to the prior class probabilities in the respective classes. This action implies that the default observation weights are the respective prior class probabilities.

If the prior class probability distribution is empirical, the software normalizes the specified observation weights to sum to 1 each time you call updateMetricsAndFit.
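As an illustration of the known-prior case, the following sketch rescales hypothetical weights within each class so that they sum to that class's prior probability; the weights, labels, and priors are made up for the example:

% Sketch of weight normalization when the prior class distribution is known
W = [1; 2; 1; 2];                 % hypothetical observation weights
Y = [1; 1; 2; 2];                 % class labels
prior = [0.3 0.7];                % hypothetical prior probabilities for classes 1 and 2
Wnorm = W;
for k = 1:numel(prior)
    inClass = (Y == k);
    Wnorm(inClass) = prior(k)*W(inClass)/sum(W(inClass));   % weights in class k sum to prior(k)
end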
References
[1] Bifet, Albert, Ricard Gavaldá, Geoffrey Holmes, and Bernhard Pfahringer. Machine Learning for Data Streams with Practical Example in MOA. Cambridge, MA: The MIT Press, 2007.
[2] Manning, Christopher D., Prabhakar Raghavan, and Hinrich Schütze. Introduction to Information Retrieval, NY: Cambridge University Press, 2008.
Version History
Introduced in R2021a

R2021b: Naive Bayes incremental fitting functions compute biased (maximum likelihood) standard deviations for conditionally normal predictor variables
Starting in R2021b, the naive Bayes incremental fitting functions fit and updateMetricsAndFit compute biased (maximum likelihood) estimates of the weighted standard deviations for conditionally normal predictor variables during training. In other words, for each class k, incremental fitting functions normalize the sum of square weighted deviations of the conditionally normal predictor x_j by the sum of the weights in class k. Before R2021b, naive Bayes incremental fitting functions computed the unbiased standard deviation, like fitcnb. The currently returned weighted standard deviation estimates differ from those computed before R2021b by a factor of

$\sqrt{1 - \dfrac{\sum_{\{i:\,y_i = k\}} w_i^2}{\left(\sum_{\{i:\,y_i = k\}} w_i\right)^2}}.$

The factor approaches 1 as the sample size increases.