Bug in Neural Net calcs for log sigmoid?
2 views (last 30 days)
Nathan Zimmerman
on 10 Nov 2016
Commented: Nathan Zimmerman on 10 Nov 2016
Hey all, I wanted to try a basic XOR network with sigmoid activation functions. Net: one hidden layer of size 3, so with biases there are 13 weights total given 2 inputs. See this image for the code/results: http://i.imgur.com/PNinXLc.png
Training was successful and the net outputs what it should, but the weights/biases it reports do not make sense. For the test input (0,0), the output should be 0, and it is 0. However, for (0,0) only the biases should determine the output. The first layer's biases are:

net.b{1}
ans =
   -5.3213
   -5.6918
   -8.9866

So all 3 hidden-layer outputs should be ~0 once activated by the sigmoid, leaving the final bias as the determining factor:

net.b{2}
ans =
    9.5846

But ~9.5 activated by a sigmoid is 1, as opposed to 0.
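Since the input is [0;0], the net.IW{1} term vanishes and the hand calculation reduces to the biases alone. Here is a minimal sketch of that arithmetic in Python/NumPy (using the bias values printed above, and treating the LW{2}*y1 contribution as negligible since y1 is near zero; the exact LW{2} weights are not shown in the question):

```python
import numpy as np

def logsig(x):
    """MATLAB's logsig: 1 / (1 + exp(-x))."""
    return 1.0 / (1.0 + np.exp(-x))

b1 = np.array([-5.3213, -5.6918, -8.9866])  # net.b{1} from the question
b2 = 9.5846                                 # net.b{2} from the question

# With input [0;0], IW{1}*[0;0] = 0, so the hidden activations are logsig(b1)
y1 = logsig(b1)
print(y1)   # all three values are close to 0

# Ignoring the small LW{2}*y1 term, the output is approximately logsig(b2)
y2 = logsig(b2)
print(y2)   # close to 1 rather than 0
```

This just confirms the argument in the paragraph above: with these biases, a plain logsig forward pass on [0;0] should give an output near 1, not 0.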
So either the function yielding these biases is incorrect, or something about the evaluation isn't as specified. Any insight would be appreciated; my MATLAB version is R2015b.
I also manually calculated it via the following code:
y1 = logsig(net.IW{1}*[0;0] + net.b{1})  % hidden layer: input weights, then bias, then sigmoid
y2 = logsig(net.LW{2}*y1 + net.b{2})     % output layer: layer weights, then bias, then sigmoid
% y2 = 0.9999
% but net([0;0]) returns 2.2550e-04
Again, there is some inconsistency going on here that is either a bug or undocumented behavior. :/