edge

Classification edge

Description

E = edge(tree,TBL,ResponseVarName) returns the classification edge for tree computed using the predictor data in table TBL and the true class labels in TBL.ResponseVarName.

E = edge(tree,X,Y) returns the classification edge for tree computed using the predictor data in matrix X and the true class labels in Y.

E = edge(___,Name,Value) computes the edge with additional options specified by one or more Name,Value pair arguments, using any of the previous syntaxes. For example, you can specify observation weights.
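
For orientation, the following is a minimal sketch of the matrix syntax on the Fisher iris data set (shipped with Statistics and Machine Learning Toolbox); the Examples section below works through the same data in more detail.

load fisheriris                  % meas is a numeric predictor matrix, species holds class labels
tree = fitctree(meas,species);   % trained classification tree
E = edge(tree,meas,species)      % E = edge(tree,X,Y)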

Input Arguments

tree

Trained classification tree, specified as a ClassificationTree or CompactClassificationTree model object. That is, tree is a trained classification model returned by fitctree or compact.

TBL

Sample data, specified as a table. Each row of TBL corresponds to one observation, and each column corresponds to one predictor variable. Optionally, TBL can contain additional columns for the response variable and observation weights. TBL must contain all the predictors used to train tree. Multicolumn variables and cell arrays other than cell arrays of character vectors are not allowed.

If TBL contains the response variable used to train tree, then you do not need to specify ResponseVarName or Y.

If you train tree using sample data contained in a table, then the input data for this method must also be in a table.

Data Types: table

X

Data to classify, specified as a numeric matrix. Each row of X represents one observation, and each column represents one predictor. X must have the same number of columns as the data used to train tree. X must have the same number of rows as the number of elements in Y.

Data Types: single | double

ResponseVarName

Response variable name, specified as the name of a variable in TBL. If TBL contains the response variable used to train tree, then you do not need to specify ResponseVarName.

If you specify ResponseVarName, then you must do so as a character vector or string scalar. For example, if the response variable is stored as TBL.Response, then specify it as 'Response'. Otherwise, the software treats all columns of TBL, including TBL.ResponseVarName, as predictors.

The response variable must be a categorical, character, or string array, logical or numeric vector, or cell array of character vectors. If the response variable is a character array, then each element must correspond to one row of the array.

Data Types: char | string
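
The following is a minimal table-based sketch combining TBL and ResponseVarName; the table variable names are illustrative choices, not part of the iris data set itself.

load fisheriris
tbl = array2table(meas, ...
    'VariableNames',{'SepalLength','SepalWidth','PetalLength','PetalWidth'});
tbl.Species = species;           % store the response in the same table
tree = fitctree(tbl,'Species');  % tree trained on a table ...
E = edge(tree,tbl,'Species')     % ... so edge must also receive a table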

Y

Class labels, specified as a categorical, character, or string array, a logical or numeric vector, or a cell array of character vectors. Y must be of the same type as the class labels used to train tree, and its number of elements must equal the number of rows of X.

Data Types: categorical | char | string | logical | single | double | cell

Name-Value Arguments

Specify optional pairs of arguments as Name1=Value1,...,NameN=ValueN, where Name is the argument name and Value is the corresponding value. Name-value arguments must appear after other arguments, but the order of the pairs does not matter.

Before R2021a, use commas to separate each name and value, and enclose Name in quotes.

Weights

Observation weights, specified as the comma-separated pair consisting of 'Weights' and a numeric vector or the name of a variable in TBL.

If you specify Weights as a numeric vector, then the size of Weights must be equal to the number of rows in X or TBL.

If you specify Weights as the name of a variable in TBL, you must do so as a character vector or string scalar. For example, if the weights are stored as TBL.W, then specify it as 'W'. Otherwise, the software treats all columns of TBL, including TBL.W, as predictors.

If you supply weights, edge computes the weighted classification edge. The software weights the observations in each row of X or TBL with the corresponding weight in Weights.

Data Types: single | double | char | string
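
A minimal sketch of the Weights option on the Fisher iris data; the weight values here are illustrative only.

load fisheriris
tree = fitctree(meas,species);
W = ones(numel(species),1);
W(strcmp(species,'setosa')) = 2;           % emphasize one class, for illustration
Ew = edge(tree,meas,species,'Weights',W)   % weighted classification edge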

Output Arguments

E

Classification edge, returned as a scalar representing the weighted average value of the margin.
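
As a quick check of this relationship, with the default equal observation weights and empirical priors the edge reduces to the plain mean of the margins:

load fisheriris
tree = fitctree(meas,species);
E = edge(tree,meas,species);
M = margin(tree,meas,species);
[E mean(M)]                      % the two values coincide under default weights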

Examples

Compute the classification edge and margins for the Fisher iris data, training a tree on only its first two predictors, and view the last 11 margin values:

load fisheriris
X = meas(:,1:2);              % use only the first two predictors
tree = fitctree(X,species);
E = edge(tree,X,species)

E =
    0.6299

M = margin(tree,X,species);
M(end-10:end)                 % margins for the last 11 observations
ans =
    0.1111
    0.1111
    0.1111
   -0.2857
    0.6364
    0.6364
    0.1111
    0.7500
    1.0000
    0.6364
    0.2000

A classification tree trained on all four predictors does better: the edge is larger, and the displayed margins are uniformly high.

tree = fitctree(meas,species);    % train on all four predictors
E = edge(tree,meas,species)

E =
    0.9384

M = margin(tree,meas,species);
M(end-10:end)
ans =
    0.9565
    0.9565
    0.9565
    0.9565
    0.9565
    0.9565
    0.9565
    0.9565
    0.9565
    0.9565
    0.9565

More About

Margin

The classification margin for an observation is the difference between the classification score for the true class and the maximal classification score for the false classes. The margin function returns one margin per row of X or TBL.

Edge

The classification edge is the weighted mean of the classification margins, using the observation weights in Weights (equal weights by default).

See Also

margin | loss | predict | fitctree