Bug in Neural Net calcs for log sigmoid?

Nathan Zimmerman on 10 Nov 2016
Hey all, I wanted to try a basic XOR network with sigmoid activation functions. Net: 1 hidden layer of size 3. So with biases, there are 13 weights total since there are 2 inputs. See image for code/results: http://i.imgur.com/PNinXLc.png
Training was successful and the net outputs as it should, but the weights/biases it produced do not make sense. For the test input (0,0), the output should be 0, and it is. However, for (0,0), only the biases should determine the output. The first set of biases:

net.b{1}
ans =
   -5.3213
   -5.6918
   -8.9866

So all 3 hidden-layer outputs should be ~0 once activated by the sigmoid, leaving the final bias as the determining factor:

net.b{2}
ans =
    9.5846

But ~9.5 activated by a sigmoid is ~1 (logsig(9.5846) = 0.9999), not 0.
So either the function yielding the biases is incorrect or the evaluation isn't behaving as specified. Any insight would be appreciated; my MATLAB version is R2015b.
I also calculated it manually with the following code:

y1 = logsig(net.IW{1} * [0;0] + net.b{1})   % hidden layer
y2 = logsig(net.LW{2} * y1 + net.b{2})      % output layer
% y2 = 0.9999

net([0;0])   % returns 2.2550e-04
Again, there is some inconsistency going on here that is either a bug or undocumented behavior. :/

Accepted Answer

Steven Lord on 10 Nov 2016
I believe you're forgetting the preprocessing step.
1 comment
Nathan Zimmerman on 10 Nov 2016
Thank you, that is correct and solved my issue. The input is normalized to [-1, 1], so [0;0] --> [-1;-1]. If I use [-1;-1] with my earlier calculation, I get the same answer as net([0;0]).
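For anyone hitting the same thing, here is a minimal sketch of the full manual forward pass that should reproduce net(x). It assumes the standard network-object properties and loops over whatever input/output processing functions the net was configured with (by default, removeconstantrows followed by mapminmax, which is what maps each input row into [-1, 1]):

x = [0; 0];

% Apply the network's input processing functions.
% With default settings, mapminmax turns [0;0] into [-1;-1] here.
xp = x;
for k = 1:numel(net.inputs{1}.processFcns)
    xp = feval(net.inputs{1}.processFcns{k}, 'apply', xp, ...
               net.inputs{1}.processSettings{k});
end

% Forward pass through the two logsig layers, as in the original post.
y1 = logsig(net.IW{1} * xp + net.b{1});
y2 = logsig(net.LW{2} * y1 + net.b{2});

% Reverse any output processing to get back to the target scale.
% If no output processing is configured, this loop does nothing.
y = y2;
for k = numel(net.outputs{2}.processFcns):-1:1
    y = feval(net.outputs{2}.processFcns{k}, 'reverse', y, ...
              net.outputs{2}.processSettings{k});
end

y   % should match net([0;0])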


More Answers (0)
