Neural network - Why are the outputs not within -1 and 1 when I apply tansig as the activation function in the output layer?

I got outputs greater than 1 (they range from roughly 0.something to 11.something) when I use tansig as the activation function in the output layer. My neural network has the architecture (4, 6, 5, 1).
1 comment
Vishnu on 16 June 2023
Hi JUN HANG,
Whatever the input to the "tansig" function, the output should be in the range [-1, 1], because the equation of "tansig" is:
tansig(x) = (2/(1+exp(-2*x))) - 1;
I suggest normalizing the input values and the weights of the network and trying again. If it still gives outputs beyond the expected range, you can attach your neural network here and I will look into it.
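For example, here is a minimal sketch (the data below is synthetic and only for illustration; replace it with your own 4-by-N inputs, and note that mapminmax comes with the Deep Learning Toolbox) showing that tansig itself never leaves [-1, 1] and how inputs can be normalized before training:

% tansig is always bounded: 2./(1+exp(-2*x)) - 1 lies between -1 and 1
x = linspace(-10, 10, 5);
disp(tansig(x));                   % every value stays between -1 and 1

% Hypothetical raw data, only for illustration
rng(0);
inputs = rand(4, 100) * 20;        % 4-by-N inputs, not yet in [-1, 1]

% Normalize the inputs to [-1, 1] before training, as suggested above
[inputsN, ps] = mapminmax(inputs); % ps stores the settings to reuse on new data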


Answers (1)

Krishna on 4 January 2024
Hello OOI JUN HANG,
From what I gather, you're having trouble achieving outputs in the interval [-1, 1] with the tansig function. The 'tansig' activation function is designed to yield results that always fall between -1 and 1, irrespective of the architecture it's applied to. The formula for tansig is
tansig(x) = 2/(1+exp(-2*x)) - 1 = (1 - exp(-2*x))/(1+exp(-2*x)) ---- (1)
Now, if you multiply both the numerator and the denominator by exp(2*x), you get
tansig(x) = (exp(2*x) - 1)/(exp(2*x) + 1) ---- (2)
When x tends to infinity, tansig(x) tends to 1, because exp(-2*x) becomes zero and we are left with 1/1 (see equation 1).
When x tends to -infinity, tansig(x) tends to -1, because exp(2*x) becomes zero and we are left with -1/1 (see equation 2).
This is why the range of tansig is [-1, 1]. For more information, please go through the tansig documentation.
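As a quick numerical check of these limits (plain MATLAB; the anonymous function below just re-implements equation (1)):

% Equation (1), written as an anonymous function
tansigFormula = @(x) 2./(1 + exp(-2*x)) - 1;

% Large negative x gives values close to -1, large positive x values close to +1,
% and the output never leaves [-1, 1]
disp(tansigFormula([-100 -10 0 10 100]))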
Hope this helps.
