Neural Network Output: Scaling the output range
Hi,
The output layer of my three-layer neural network uses a sigmoid activation, which outputs only in the range [0, 1]. However, if I want to train it on targets beyond [0, 1], say in the thousands, what should I do?
For example, if I want to train:
input ----> output
0 0 ------> 0
0 1 ------> 1000
1000 1 ----> 1
1 1 -------> 0
My program works for AND, OR, XOR, etc., since the inputs and outputs are all binary.
There was a suggestion to use:
Activation:
-----------
y = lambda*(abs(x)*1/(1+exp(-1*(x))))
Derivative of activation:
-------------------------
lambda*(abs(y)*y*(1-y))
This did not converge for the training pattern above. Are there any suggestions, please?
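One way to see why the suggested activation/derivative pair may fail to converge is to check the posted derivative against a numerical gradient. The sketch below is my own check, not from the thread; it writes `lambda` as `lam` and interprets `y` in the posted derivative as the activation's output, which is one plausible reading:

```python
import math

def act(x, lam=1.0):
    """The suggested activation: lam * |x| * sigmoid(x)."""
    return lam * abs(x) / (1.0 + math.exp(-x))

def suggested_deriv(x, lam=1.0):
    """The derivative as posted: lam * |y| * y * (1 - y), taking y = act(x)."""
    y = act(x, lam)
    return lam * abs(y) * y * (1.0 - y)

def numeric_deriv(x, lam=1.0, h=1e-6):
    """Central-difference approximation of the true act'(x)."""
    return (act(x + h, lam) - act(x - h, lam)) / (2.0 * h)

# At x = 2.0 the posted derivative is large and negative while the true
# slope is positive, so backpropagation receives badly wrong gradients,
# which is consistent with the reported non-convergence.
print(suggested_deriv(2.0), numeric_deriv(2.0))
```

If this check is right, the posted expression is not the derivative of the posted activation (differentiating lam*|x|*sigmoid(x) by the product rule yields extra terms), which by itself can prevent convergence.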
Accepted Answer
More Answers (1)
Greg Heath
29 Jan 2012
0 votes
If the target has rigid bounds, scale the data to either [0,1] or [-1,1] and use either LOGSIG or TANSIG, respectively.
Otherwise, standardize to zero-mean/unit variance and use PURELIN.
To recover the original data scale, just apply the reverse transformations.
Hope this helps.
Greg
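The scale-then-invert approach described above can be sketched in a few lines. This is an illustrative example, not toolbox code; the function names (`scale_targets`, `unscale_outputs`) are my own, and it uses the target values from the question:

```python
def scale_targets(targets, lo, hi):
    """Min-max scaling: map raw targets from [lo, hi] into [0, 1],
    so a LOGSIG-style sigmoid output layer can represent them."""
    return [(t - lo) / (hi - lo) for t in targets]

def unscale_outputs(outputs, lo, hi):
    """Reverse transformation: map network outputs in [0, 1]
    back to the original [lo, hi] scale."""
    return [y * (hi - lo) + lo for y in outputs]

# Targets from the question, which range up to 1000.
raw_targets = [0, 1000, 1, 0]
lo, hi = min(raw_targets), max(raw_targets)

scaled = scale_targets(raw_targets, lo, hi)    # train against these
restored = unscale_outputs(scaled, lo, hi)     # apply to network outputs
```

For [-1, 1] with TANSIG the mapping is `2*(t - lo)/(hi - lo) - 1`; for an unbounded target with PURELIN, standardize instead (subtract the mean, divide by the standard deviation) and invert that transform on the outputs.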
1 comment
Ashikur
29 Jan 2012