For greater accuracy and link-function choices on low- through
medium-dimensional data sets, fit a generalized linear model with a lasso
penalty using lassoglm.
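A minimal sketch of this workflow, using simulated data (the data set, the 10-fold cross-validation choice, and the one-standard-error index are illustrative, not prescribed by this page):

```matlab
% Fit a binomial GLM with a lasso penalty; pick Lambda by cross-validation.
rng(1)                                    % for reproducibility
X = randn(100,5);                         % simulated predictors
y = rand(100,1) < 1./(1 + exp(-X(:,1))); % labels driven by the first predictor
[B,FitInfo] = lassoglm(X,y,'binomial','CV',10);
idx  = FitInfo.Index1SE;                  % sparsest model within 1 SE of min deviance
coef = [FitInfo.Intercept(idx); B(:,idx)];% intercept plus penalized coefficients
```

The Index1SE choice trades a small amount of deviance for a sparser, more interpretable model; use FitInfo.IndexMinDeviance instead to minimize cross-validated deviance.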
For reduced computation time on high-dimensional data sets, train a
binary, linear classification model, such as a regularized logistic
regression model, using
fitclinear. You can also
efficiently train a multiclass error-correcting output codes (ECOC) model
composed of logistic regression models using fitcecoc.
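A sketch of both calls on simulated high-dimensional data (the matrix sizes, labels, and Lambda value are illustrative assumptions):

```matlab
% Regularized (lasso) binary logistic regression via fitclinear.
rng(1)
X = sparse(randn(1000,5000));            % tall, sparse predictor matrix
y = randi([0 1],1000,1);                 % simulated binary labels
Mdl = fitclinear(X,y,'Learner','logistic', ...
    'Regularization','lasso','Lambda',1e-4);
labels = predict(Mdl,X);                 % predicted class labels

% Multiclass: wrap the same learner in a template and pass it to fitcecoc.
t  = templateLinear('Learner','logistic','Regularization','ridge');
y3 = randi([1 3],1000,1);                % simulated three-class labels
EcocMdl = fitcecoc(X,y3,'Learners',t);
```

fitcecoc decomposes the multiclass problem into binary problems, each solved by the linear learner specified in the template.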
For nonlinear classification with big data, train a binary, Gaussian
kernel classification model with regularized logistic regression using
fitckernel.
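A sketch on simulated data with a nonlinear decision boundary (the data, Lambda value, and error check are illustrative assumptions):

```matlab
% Gaussian kernel logistic regression via random feature expansion.
rng(1)
X = randn(2000,20);
y = sum(X.^2,2) > 20;                    % nonlinearly separable labels
KMdl   = fitckernel(X,y,'Learner','logistic','Lambda',1e-5);
labels = predict(KMdl,X);
trainErr = mean(labels ~= y);            % in-sample misclassification rate
```

fitckernel maps the predictors into a high-dimensional space with random feature expansion, so a linear solver can fit a nonlinear boundary without forming the full kernel matrix.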
Regularized Logistic Regression Using Linear, ECOC, or Kernel Model
fitclinear |Fit binary linear classifier to high-dimensional data
templateLinear |Linear learner template
fitcecoc |Fit multiclass models for support vector machines or other classifiers
predict (ClassificationLinear) |Predict labels for linear classification models
fitckernel |Fit binary Gaussian kernel classifier using random feature expansion
predict (ClassificationKernel) |Predict labels for Gaussian kernel classification model
ClassificationLinear |Linear model for binary classification of high-dimensional data
ClassificationECOC |Multiclass model for support vector machines (SVMs) and other classifiers
ClassificationKernel |Gaussian kernel classification model using random feature expansion
ClassificationPartitionedLinear |Cross-validated linear model for binary classification of high-dimensional data
ClassificationPartitionedLinearECOC |Cross-validated linear error-correcting output codes model for multiclass classification of high-dimensional data
- Regularize Poisson Regression
Identify and remove redundant predictors from a generalized linear model.
- Regularize Logistic Regression
Regularize binomial regression.
- Regularize Wide Data in Parallel
Regularize a model with many more predictors than observations.
- Lasso Regularization of Generalized Linear Models
The lasso algorithm produces a smaller model with fewer predictors. The related elastic net algorithm can be more accurate when predictors are highly correlated.
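The lasso-versus-elastic-net contrast can be seen directly with lassoglm, whose Alpha argument interpolates between ridge (Alpha near 0) and lasso (Alpha = 1). A sketch on simulated correlated predictors (the data and Alpha values are illustrative assumptions):

```matlab
% Compare lasso (Alpha = 1) with elastic net (Alpha = 0.5)
% when two predictors are strongly correlated.
rng(1)
Z = randn(200,1);
X = [Z + 0.1*randn(200,1), Z + 0.1*randn(200,1), randn(200,3)];
y = rand(200,1) < 1./(1 + exp(-(X(:,1) + X(:,2))));
[Blasso,infoL] = lassoglm(X,y,'binomial','CV',5,'Alpha',1);
[Benet, infoE] = lassoglm(X,y,'binomial','CV',5,'Alpha',0.5);
% Lasso often keeps only one of the two correlated predictors;
% elastic net tends to retain both with shared weight.
nnzLasso = nnz(Blasso(:,infoL.Index1SE));
nnzEnet  = nnz(Benet(:,infoE.Index1SE));
```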