In a FeedForward NNet, what exactly is one iteration?

Sam Speake on 23 May 2018
Commented: Greg Heath on 25 May 2018
When you train a feedforward neural net with the default settings, the training GUI shows a line like "Epoch: 0 [ x iterations ] 1000". Does the x value represent the number of individual data samples that have been passed through (such as 1 image from an image data set), or does it represent a full pass over the entire data set?

Accepted Answer

Majid Farzaneh on 24 May 2018
Hello. Every neural network is trained by an optimization algorithm that searches for the optimum weights and biases, and optimization algorithms are usually iterative. Here, 1 epoch means one iteration of the optimization algorithm.
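As a minimal sketch of where these numbers come from, assuming the usual feedforwardnet/train workflow (the example data set and layer size below are just placeholders):
% Minimal sketch: train a feedforward net and inspect how many
% epochs (= iterations here) the optimizer actually ran.
[x, t] = simplefit_dataset;      % small example data set shipped with the toolbox
net = feedforwardnet(10);        % one hidden layer with 10 neurons
net.trainParam.epochs = 1000;    % the "1000" shown in the training GUI
[net, tr] = train(net, x, t);    % tr is the training record
tr.num_epochs                    % epochs/iterations actually performed
tr.perf(end)                     % final training MSE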
  3 comments
Majid Farzaneh on 24 May 2018
Yes, that's true. For every change to the weights, the network needs to recalculate the MSE, and to compute the MSE it must evaluate all of the training data with the new weights.
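As a rough illustration (assuming a network net and training data x, t as above), evaluating the performance for the current weights involves a forward pass over the whole training set:
y = net(x);                      % forward pass over the entire training set
perform(net, t, y)               % MSE for the current weights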
Greg Heath on 25 May 2018
Optimization algorithms TRY to optimize the goal. Many/most times they do not achieve the goal.
Nevertheless, they are often considered successful if they just get close enough.
For example, I often design neural networks to yield an output target t, given an input function x.
I take as a reference output
yref = mean(t')
the corresponding mean square error is
MSEref = mean(var(t',1))
My training goal is typically
MSEgoal = 0.01*MSEref
which preserves 99% of the target variance.
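Putting that recipe together, a sketch (assuming inputs x and targets t stored columnwise, as the toolbox expects; the network size is arbitrary) could look like:
yref    = mean(t');              % reference output: the mean of the targets
MSEref  = mean(var(t',1));       % MSE of that naive reference model
MSEgoal = 0.01*MSEref;           % keep 99% of the target variance
net = feedforwardnet(10);
net.trainParam.goal = MSEgoal;   % training stops early once the goal is reached
[net, tr] = train(net, x, t);
tr.stop                          % reason training stopped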
