Tuning
To learn how to set options using the trainingOptions function, see Set Up Parameters and Train Convolutional Neural Network. After you identify some good starting options, you can automate sweeping of hyperparameters or try Bayesian optimization using Experiment Manager.
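For example, here is a minimal sketch of specifying a few common options with trainingOptions; the solver and hyperparameter values are placeholders to tune for your own data, not recommendations from this page:

    % Minimal sketch: specify a few common training options.
    % The solver and values are placeholders, not recommendations.
    options = trainingOptions("adam", ...
        InitialLearnRate=1e-3, ...
        MaxEpochs=20, ...
        MiniBatchSize=128, ...
        Shuffle="every-epoch", ...
        Plots="training-progress", ...
        Verbose=false);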
Analyze the network robustness by generating adversarial examples. You can then use fast gradient sign method (FGSM) adversarial training to train a network that is robust to adversarial perturbations.
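As a rough illustration, the following sketch generates one FGSM adversarial example; it assumes net is a trained dlnetwork ending in a softmax layer, X is a formatted dlarray of inputs, T contains one-hot encoded targets, and epsilon is a placeholder perturbation size:

    % Minimal FGSM sketch: perturb the input along the sign of the
    % gradient of the loss with respect to the input.
    % Assumes net (trained dlnetwork with softmax output), X (formatted
    % dlarray, for example "SSCB"), and T (one-hot targets) already exist.
    epsilon = 0.05;  % placeholder perturbation size

    [~,gradX] = dlfeval(@inputGradients,net,X,T);
    XAdv = X + epsilon*sign(gradX);

    function [loss,gradX] = inputGradients(net,X,T)
        Y = forward(net,X);
        loss = crossentropy(Y,T);
        gradX = dlgradient(loss,X);
    end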
Apps
Deep Network Designer | Design and visualize deep learning networks |
Objects
trainingProgressMonitor | Monitor and plot training progress for deep learning custom training loops (since R2022b) |
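As a rough sketch, a custom training loop can report its loss through trainingProgressMonitor like this; the loss computation is omitted and numIterations is a placeholder:

    % Minimal sketch: track loss in a custom training loop.
    % The loss computation is omitted; numIterations is a placeholder.
    numIterations = 1000;
    monitor = trainingProgressMonitor(Metrics="Loss", ...
        Info="Iteration",XLabel="Iteration");

    iteration = 0;
    while iteration < numIterations && ~monitor.Stop
        iteration = iteration + 1;
        lossValue = rand;  % placeholder for the loss of this iteration

        recordMetrics(monitor,iteration,Loss=lossValue);
        updateInfo(monitor,Iteration=iteration);
        monitor.Progress = 100*iteration/numIterations;
    end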
Functions
trainingOptions | Options for training deep learning neural network |
trainnet | Train deep learning neural network (since R2023b) |
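For example, a minimal sketch of combining these two functions for image classification; imdsTrain (a labeled image datastore) and layers (a layer array) are assumed to already exist:

    % Minimal sketch: train a classification network with trainnet.
    % Assumes imdsTrain (labeled imageDatastore) and layers already exist;
    % the options values are placeholders.
    options = trainingOptions("adam",MaxEpochs=10,Verbose=false);
    net = trainnet(imdsTrain,layers,"crossentropy",options);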
Topics
- Set Up Parameters and Train Convolutional Neural Network
Learn how to set up training parameters for a convolutional neural network.
- Deep Learning Using Bayesian Optimization
This example shows how to apply Bayesian optimization to deep learning and find optimal network hyperparameters and training options for convolutional neural networks.
- Detect Issues During Deep Neural Network Training
This example shows how to automatically detect issues while training a deep neural network.
- Train Deep Learning Networks in Parallel
This example shows how to run multiple deep learning experiments on your local machine.
- Train Network Using Custom Training Loop
This example shows how to train a network that classifies handwritten digits with a custom learning rate schedule.
- Compare Activation Layers
This example shows how to compare the accuracy of training networks with ReLU, leaky ReLU, ELU, and swish activation layers.
- Deep Learning Tips and Tricks
Learn how to improve the accuracy of deep learning networks.
- Speed Up Deep Neural Network Training
Learn how to accelerate deep neural network training.
- Specify Custom Weight Initialization Function
This example shows how to create a custom He weight initialization function for convolution layers followed by leaky ReLU layers.
- Compare Layer Weight Initializers
This example shows how to train deep learning networks with different weight initializers.
- Create Custom Deep Learning Training Plot
This example shows how to create a custom training plot that updates at each iteration during training of deep learning neural networks using trainnet. (since R2023b)
- Custom Stopping Criteria for Deep Learning Training
This example shows how to stop training of deep learning neural networks based on custom stopping criteria using trainnet. (since R2023b) A minimal sketch of one such criterion follows this list.
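A minimal sketch of one possible stopping criterion, assuming the info structure passed to the OutputFcn output function includes a TrainingLoss field; the 0.1 threshold is a placeholder:

    % Minimal sketch: stop training when the training loss falls below a
    % placeholder threshold. Assumes the info structure provides a
    % TrainingLoss field (guarded against empty values).
    stopFcn = @(info) ~isempty(info.TrainingLoss) && info.TrainingLoss < 0.1;
    options = trainingOptions("adam",OutputFcn=stopFcn);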