indexcrossentropy
Syntax

loss = indexcrossentropy(Y,targets)
loss = indexcrossentropy(___,Name=Value)
Description
The index cross-entropy operation computes the cross-entropy loss between network predictions and targets specified as integer class indices for single-label classification tasks.
Index cross-entropy loss, also known as sparse cross-entropy loss, is a more memory-efficient and computationally efficient alternative to the standard cross-entropy loss algorithm. It does not require binary or one-hot encoded targets; instead, the function requires targets specified as integer class indices. Index cross-entropy loss is particularly well suited to targets that span many classes, where one-hot encoded data presents unnecessary memory overhead.
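As a sketch of the underlying computation (the unweighted form, assuming normalization by the number of observations N), the loss for predicted probabilities Y and integer class indices t_1, ..., t_N can be written as

\[ \text{loss} = -\frac{1}{N}\sum_{n=1}^{N} \ln Y_{t_n,\,n} \]

where Y_{t_n,n} denotes the predicted probability that observation n belongs to its target class t_n.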
loss = indexcrossentropy(Y,targets) calculates the categorical cross-entropy loss between the formatted predictions Y and the integer class indices targets for single-label classification tasks. For unformatted input data, use the DataFormat argument.
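For example, a minimal sketch of this syntax, assuming the predictions are stored as a formatted dlarray of class probabilities with one column per observation (the class count, scores, and target indices are illustrative only):

% Random scores for 10 classes and 4 observations, formatted as
% channel (C) by batch (B).
scores = dlarray(rand(10,4),"CB");

% Convert the scores to probabilities along the channel dimension.
Y = softmax(scores);

% Specify the targets as one integer class index per observation.
targets = [3 7 1 10];

% Compute the index cross-entropy loss.
loss = indexcrossentropy(Y,targets)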
loss = indexcrossentropy(___,Name=Value) specifies options using one or more name-value arguments in addition to any combination of the input arguments from previous syntaxes. For example, DataFormat="BC" specifies that the first and second dimensions of the input data correspond to the batch and channel dimensions, respectively.
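For example, a sketch of this syntax with unformatted prediction data arranged as batch-by-channel (the sizes and values are illustrative, and the probabilities come from simple row normalization rather than a network):

% Unformatted predictions: 4 observations (batch) by 10 classes (channel).
Y = rand(4,10);
Y = Y ./ sum(Y,2);   % normalize each row so it sums to 1

% Specify the targets as one integer class index per observation.
targets = [2 5 9 1];

% Describe the layout of Y as batch (B) followed by channel (C).
loss = indexcrossentropy(Y,targets,DataFormat="BC")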
Examples
Input Arguments
Name-Value Arguments
Output Arguments
Algorithms
Extended Capabilities
Version History
Introduced in R2024b
See Also
dlarray | dlgradient | dlfeval | crossentropy | softmax | sigmoid | huber | l1loss | l2loss