What exactly does the iteration count mean when training neural networks?

Rah Yoonhyuk on 5 Feb 2020
I am using newff to train a neural network. My input data is an array of size 6x2000, i.e., 2000 samples with 6 input parameters each. My output is of size 81x2000, i.e., 2000 samples with 81 output parameters each. When I start training, the Neural Network Toolbox automatically sets the iteration count limit to 1000. Does the 1000 iteration count mean the network is trained on the same data 1000 times?
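A minimal sketch of the setup described above, using randomly generated placeholder data (the hidden layer size of 20 is an arbitrary assumption, not something stated in the question):

P = rand(6, 2000);               % 2000 samples with 6 input parameters each
T = rand(81, 2000);              % 2000 samples with 81 output parameters each
net = newff(P, T, 20);           % one hidden layer of 20 neurons (arbitrary choice)
net.trainParam.epochs = 1000;    % upper limit on passes over the training data (default is typically 1000)
[net, tr] = train(net, P, T);    % trains until the epoch limit or another stopping criterion is reached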

Answers (1)

Srivardhan Gadila on 13 Feb 2020
Edited: Srivardhan Gadila on 13 Feb 2020
The number of iterations is determined by the MiniBatchSize and the number of epochs specified in trainingOptions, together with the number of training samples.
An iteration is one step of the gradient descent algorithm towards minimizing the loss function, computed on a single mini-batch. An epoch is one full pass of the training algorithm over the entire training set.
Iterations per epoch = Number of training samples ÷ MiniBatchSize, i.e., the number of forward and backward passes performed within one epoch.
Total iterations = Iterations per epoch × Number of epochs
If the number of epochs is n, the network is trained on the same data n times.
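As a worked example of this arithmetic (the sample count matches the question, but the MiniBatchSize and number of epochs below are assumed values for illustration only):

numTrainingSamples = 2000;
miniBatchSize = 128;      % assumed value, e.g. trainingOptions('sgdm','MiniBatchSize',128)
numEpochs = 10;           % assumed value

iterationsPerEpoch = floor(numTrainingSamples / miniBatchSize);   % = 15 mini-batch steps per epoch
totalIterations    = iterationsPerEpoch * numEpochs;              % = 150 iterations in total

The floor accounts for a partial final mini-batch being discarded, which is the common behavior when the sample count is not a multiple of MiniBatchSize.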
