To train a k-nearest neighbor model, use the Classification
Learner app. For greater flexibility, train a k-nearest neighbor
model using fitcknn in the command-line interface. After training, predict
labels or estimate posterior probabilities by passing the model and predictor data to predict.
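A minimal command-line sketch of this workflow, using the fisheriris sample data set that ships with the toolbox; the number of neighbors and the query point are illustrative choices, not recommendations:

```matlab
% Load example data: 150 iris measurements (meas) and species labels.
load fisheriris

% Fit a k-nearest neighbor classifier; NumNeighbors = 5 is illustrative.
mdl = fitcknn(meas, species, 'NumNeighbors', 5);

% Predict the label and posterior probabilities for new predictor data.
Xnew = [5.0 3.4 1.5 0.2];
[label, posterior] = predict(mdl, Xnew);
```

The second output of predict contains the posterior probability for each class, in the order given by mdl.ClassNames.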
Classification Learner | Train models to classify data using supervised machine learning
ClassificationKNN Predict | Classify observations using nearest neighbor classification model (Since R2022b)
Create Nearest Neighbor Model
Create Nearest Neighbor Searcher
Interpret Nearest Neighbor Model
crossval | Cross-validate machine learning model
kfoldEdge | Classification edge for cross-validated classification model
kfoldLoss | Classification loss for cross-validated classification model
kfoldfun | Cross-validate function for classification
kfoldMargin | Classification margins for cross-validated classification model
kfoldPredict | Classify observations in cross-validated classification model
loss | Loss of k-nearest neighbor classifier
resubLoss | Resubstitution classification loss
testcholdout | Compare accuracies of two classification models using new data
edge | Edge of k-nearest neighbor classifier
margin | Margin of k-nearest neighbor classifier
resubEdge | Resubstitution classification edge
resubMargin | Resubstitution classification margin
testckfold | Compare accuracies of two classification models by repeated cross-validation
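A hedged sketch of how the cross-validation and resubstitution functions above fit together; the data set and neighbor count are illustrative, and crossval uses its default 10-fold partition:

```matlab
% Train a k-nearest neighbor model on the iris sample data.
load fisheriris
mdl = fitcknn(meas, species, 'NumNeighbors', 5);

% Cross-validate the trained model (10 folds by default).
cvmdl = crossval(mdl);

% Generalization error estimated over the held-out folds.
cvloss = kfoldLoss(cvmdl);

% Resubstitution loss (error on the training data), for comparison.
trainloss = resubLoss(mdl);
```

The cross-validated loss is typically higher than the resubstitution loss, which underestimates the generalization error because it reuses the training data.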
Gather Properties of Nearest Neighbor Model
- Train Nearest Neighbor Classifiers Using Classification Learner App
Create and compare nearest neighbor classifiers, and export trained models to make predictions for new data.
- Visualize Decision Surfaces of Different Classifiers
This example shows how to visualize the decision surface for different classification algorithms.
- Supervised Learning Workflow and Algorithms
Understand the steps for supervised learning and the characteristics of nonparametric classification and regression functions.
- Classification Using Nearest Neighbors
Categorize data points based on their distance to points in a training data set, using a variety of distance metrics.
- Speaker Identification Using Pitch and MFCC (Audio Toolbox)
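The distance-metric flexibility described in "Classification Using Nearest Neighbors" above can be sketched as follows; the cityblock metric and query point are illustrative assumptions:

```matlab
% Train a k-nearest neighbor classifier with a non-default distance metric.
load fisheriris
mdl = fitcknn(meas, species, 'NumNeighbors', 3, 'Distance', 'cityblock');

% Classify a new observation using city block (L1) distances.
label = predict(mdl, [6.2 2.9 4.3 1.3]);
```

Other supported values for the Distance name-value argument include 'euclidean' (the default), 'chebychev', 'minkowski', and 'mahalanobis'.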