
auc

Area under ROC curve or precision-recall curve

Since R2024b

Description

a = auc(rocObj) returns the area under the ROC (receiver operating characteristic) curve.

a = auc(rocObj,type) returns the area under the ROC curve when type is "roc", and returns the area under the precision-recall curve when type is "pr".


[a,lower,upper] = auc(___) additionally returns the lower and upper confidence bounds on a using any of the input argument combinations in the previous syntaxes.


Examples


Compare Precision-Recall AUC Values

Fit a tree and a tree ensemble to the covtype data, which has unbalanced classes. The precision-recall curve is better suited to unbalanced data than the ROC curve.

gunzip("https://archive.ics.uci.edu/ml/machine-learning-databases/covtype/covtype.data.gz")
load covtype.data
Y = covtype(:,end);
covtype(:,end) = [];
part = cvpartition(Y,'Holdout',0.5);
istrain = training(part); % Data for fitting
istest = test(part);
tree = fitctree(covtype(istrain,:),Y(istrain));
ens = fitcensemble(covtype(istrain,:),Y(istrain));
treeROC = rocmetrics(tree,covtype(istest,:),Y(istest),AdditionalMetrics="prec");
ensROC = rocmetrics(ens,covtype(istest,:),Y(istest),AdditionalMetrics="prec");

Compare the precision-recall AUC results for the tree and ensemble.

treePRAUC = auc(treeROC,"pr")
treePRAUC = 1×7

    0.3628    0.3865    0.3657    0.4585    0.3184    0.3840    0.2438

ensPRAUC = auc(ensROC,"pr")
ensPRAUC = 1×7

    0.7031    0.7945    0.7026    0.4995    0.2996    0.3359    0.6320

The ensemble has a better set of precision-recall AUC values.

Find Confidence Bounds on AUC

Find the AUC for a cross-validated quadratic discriminant model of the fisheriris data, and return the confidence bounds on the statistics. By default, cross-validated classification models compute confidence intervals, so auc can return the lower and upper bounds.

load fisheriris
rng default % For reproducibility
Mdl = fitcdiscr(meas,species,DiscrimType="quadratic",KFold=5);
rocObj = rocmetrics(Mdl);
[a,lower,upper] = auc(rocObj)
a = 1×3

    1.0000    0.9984    0.9984

lower = 1×3

    1.0000    0.9970    0.9970

upper = 1×3

    1.0000    0.9998    0.9998

For the Fisher iris data, the AUC values are all essentially equal to 1.

Input Arguments


rocObj — Object evaluating classification performance, specified as a rocmetrics object.

type — Type of AUC to compute, specified as "roc" (default) for the area under the ROC curve, or "pr" for the area under the precision-recall curve.

Data Types: char | string

Output Arguments


a — Area under the curve, returned as a double or single vector, where each element of a represents the area for a class.

lower — Lower confidence bounds on the AUC, returned as a double or single vector, where each element of lower represents the confidence bound for a class. The object must be created with confidence intervals for the function to return this output.

upper — Upper confidence bounds on the AUC, returned as a double or single vector, where each element of upper represents the confidence bound for a class. The object must be created with confidence intervals for the function to return this output.

Algorithms

For an ROC curve, auc calculates the area under the curve by trapezoidal integration using the trapz function. For a precision-recall curve, auc calculates the area under the curve using the trapz function, and then adds the area of the rectangle (if any) that is formed by the leftmost point on the curve and the point (0,0). For example,

load ionosphere
rng default % For reproducibility of the partition
c = cvpartition(Y,Holdout=0.25);
trainingIndices = training(c); % Indices for the training set
testIndices = test(c); % Indices for the test set
XTrain = X(trainingIndices,:);
YTrain = Y(trainingIndices);
XTest = X(testIndices,:);
YTest = Y(testIndices);
Mdl = fitcsvm(XTrain,YTrain);
rocObj = rocmetrics(Mdl,XTest,YTest,AdditionalMetrics="ppv");
r = plot(rocObj,XAxisMetric="tpr",...
    YAxisMetric="ppv",ClassNames="b"); % Plots the normal PR curve.
legend(Location="southeast")

Performance curve showing a gap between x = 0 and the leftmost point on the curve.

There is a gap between the leftmost point on the curve and the zero point of the True Positive Rate. Plot the rectangle that fills this gap, which represents the correction that auc adds to the returned AUC.

hold on
rectangle(Position=[0 0 r.XData(2) r.YData(2)],FaceColor=r.Color)
hold off

A rectangle fills the gap between the leftmost point on the curve and 0.

Technically, the rectangle is not part of the precision-recall curve. But to make comparisons easier across models (which can have different domains of definition), auc treats the area under the curve as extending all the way down to zero.
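The two calculations can be sketched numerically. The curve points below are hypothetical, not taken from the example above; they only illustrate the trapezoidal integration and the rectangle correction.

```matlab
% Sketch of the auc calculation with hypothetical curve points.
% ROC AUC: plain trapezoidal integration over (FPR, TPR) points.
fpr = [0 0.1 0.4 1];            % false positive rate, ascending
tpr = [0 0.6 0.9 1];            % true positive rate
rocAUC = trapz(fpr,tpr)         % rocAUC = 0.8250

% PR AUC: trapezoidal integration over (recall, precision) points,
% plus the rectangle from the leftmost curve point down to recall = 0.
recall    = [0.2 0.5 0.8 1];    % leftmost recall is 0.2, not 0
precision = [0.9 0.7 0.5 0.4];
prAUC = trapz(recall,precision) + recall(1)*precision(1)
% prAUC = 0.5100 + 0.1800 = 0.6900
```

Without the rectangle term, the PR AUC here would be 0.51, understating the area because the curve does not extend to recall = 0.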

If you create a rocmetrics object with confidence intervals (as described on the rocmetrics reference page), the lower and upper outputs of auc use the same confidence-interval technique as the original rocmetrics object: either bootstrapping or cross-validation.
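For instance, one way to obtain bounds is the NumBootstraps name-value argument of rocmetrics; the model and data below are illustrative only.

```matlab
% Sketch: create a rocmetrics object with bootstrap confidence
% intervals so that auc can return lower and upper bounds.
load ionosphere                   % X (predictors), Y (labels)
Mdl = fitcsvm(X,Y);
[~,scores] = resubPredict(Mdl);   % classification scores
rocObj = rocmetrics(Y,scores,Mdl.ClassNames,NumBootstraps=100);
[a,lower,upper] = auc(rocObj);    % bounds computed by bootstrapping
```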

Version History

Introduced in R2024b
