Why does MATLAB give a different output value every time I train a neural network?
I was building a multilayer neural network. Input data: 3 inputs with 150 samples (3x150); target: 1x150.
I did not specify the weights and biases. Is that the reason it returns a different output value every time I train the network?
Accepted Answer
Greg Heath
2 Jul 2015
The default data division and weight initialization are both random.
To reproduce a design, you have to know the initial state of the RNG before the net is both configured with initial weights and the data is divided into training, validation and testing subsets.
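A minimal sketch of fixing the RNG state before a design (assumes the Neural Network Toolbox; `fitnet`, the hidden-layer size, and the random data are illustrative choices, not from the original post):

```matlab
% Illustrative data matching the question's dimensions (3x150 input, 1x150 target)
x = rand(3, 150);
t = rand(1, 150);

rng(0, 'twister');          % fix the RNG state BEFORE any design step
net = fitnet(10);           % 10 hidden nodes (arbitrary example)
[net, tr] = train(net, x, t);  % weight init and data division now reproducible
y1 = net(x);

rng(0, 'twister');          % reset to the same state and repeat
net = fitnet(10);
[net, tr] = train(net, x, t);
y2 = net(x);                % y2 should match y1
```

Without the `rng` calls, each run draws fresh random initial weights and a fresh random data division, so the trained outputs differ from run to run.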
When designing multiple nets in a double for loop (creation in the outer loop and training in the inner loop), you only have to initialize the RNG once: before the first loop. The RNG changes its state every time it is called. Therefore, for reproducibility, record the RNG state at the beginning of the inner loop.
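The double-loop pattern described above might be sketched as follows (variable names and loop bounds are illustrative assumptions):

```matlab
x = rand(3, 150);                 % illustrative data
t = rand(1, 150);
Ntrials = 5;                      % random-weight trials per candidate size

rng(4151941)                      % initialize the RNG ONCE, before the loops
for h = 2:2:10                    % outer loop: candidate hidden-layer sizes
    net = fitnet(h);              % net creation in the outer loop
    for i = 1:Ntrials             % inner loop: repeated random designs
        s{h, i} = rng;            % record the RNG state for design (h, i)
        [net, tr] = train(net, x, t);
    end
end

% To reproduce design (h, i) later, restore its recorded state first:
% rng(s{h, i}); net = fitnet(h); [net, tr] = train(net, x, t);
```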
Exactly when the RNG is called differs across the different generations of design functions. For special cases of the obsolete NEWFF family (e.g., NEWFIT, NEWPR and NEWFF), weights are initialized when the nets are created. For special cases of the current FEEDFORWARDNET family (e.g., FITNET, PATTERNNET and FEEDFORWARDNET), weights can be initialized explicitly by the CONFIGURE function. Otherwise, they will be automatically initialized by the function TRAIN.
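For the FEEDFORWARDNET family, explicit initialization via CONFIGURE might look like this (a sketch; the snapshot via `getwb` is an added illustration, not part of the original answer):

```matlab
x = rand(3, 150);             % illustrative data
t = rand(1, 150);

net = feedforwardnet(10);     % weights NOT yet initialized at creation
net = configure(net, x, t);   % sizes the net AND initializes weights now,
                              % consuming the current RNG state
wb0 = getwb(net);             % snapshot the initial weights and biases

[net, tr] = train(net, x, t); % train() will not re-initialize them
```

If `configure` is skipped, `train` performs the initialization itself, so the RNG is consumed at training time instead of creation time.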
When I find out exactly where the data is divided, I will post in both the NEWSGROUP and ANSWERS.
Hope this helps.
Thank you for formally accepting my answer
Greg
More Answers (1)
Walter Roberson
1 Jul 2015
The weights are initialized randomly unless you specifically initialize them.
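One way to set the weights yourself, rather than accepting the random defaults, is to assign the full weight/bias vector after configuring the net (a sketch; the constant starting vector is only for illustration):

```matlab
x = rand(3, 150);                         % illustrative data
t = rand(1, 150);

net = feedforwardnet(10);
net = configure(net, x, t);               % size the net for this data
wb = 0.1 * ones(size(getwb(net)));        % a fixed (non-random) starting vector
net = setwb(net, wb);                     % install explicit weights and biases
```

With `setwb` applied, the starting point is identical on every run; any remaining run-to-run variation then comes from the random data division.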