How to use a custom transfer function in neural net training
I want to use a function similar to tansig. I can't seem to find a good example, and the tansig.apply method only allows me one line! I'm wrapped around the axle on this, and I suspect I'm missing something simple. Any ideas? I'm using R2012b.
Accepted Answer
More Answers (5)
Bob
on 27 Mar 2013
4 comments
Nn Sagita
on 29 Aug 2013
Bob, I modified the purelin transfer function and called it 'mtf'. I saved it in my working directory, trained the neural network, and got outputs. But I also got messages like this:
Exception in thread "AWT-EventQueue-0" java.lang.NullPointerException at com.mathworks.toolbox.nnet.v6.diagram.nnTransfer.paint(nnTransfer.java:35) at com.mathworks.toolbox.nnet.v6.image.nnOffsetImage.paint(nnOffsetImage.java:49) at ....
Could you help me? What should I do?
kelvina
on 15 Feb 2014
Thanks Bob, it helped me.
But we can also do this directly by copying the 'template transfer' file from C:\Program Files (x86)\MATLAB\R2010a\toolbox\nnet\nnet\nncustom,
replacing its function a = apply_transfer(n,fp) with your own function, and then saving this file in your working directory. It will work.
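As a concrete sketch of what that replacement might look like (the formula and the constant 0.25 are hypothetical choices for illustration, not part of the original template, and the surrounding boilerplate in the template file varies by release):

```matlab
function a = apply_transfer(n, fp)
%APPLY_TRANSFER  Forward pass of the custom transfer function.
% Replace only the body below; keep the signature the template expects.
% Hypothetical example: a fast, tansig-like squashing function.
a = n ./ (0.25 + abs(n));
end
```

After saving the renamed copy in your working directory, you would reference it by name when configuring the network, e.g. net.layers{1}.transferFcn = 'mytf';.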
Mayank Gupta
on 4 May 2016
Can you please explain in detail how to save a custom training function to the nntool directory? I am using the Firefly algorithm for optimization.
Mehdi Jokar
on 16 Jul 2018
Bob, thank you for your instructions. But is apply the only function that needs to be modified, or do we also need to modify the backprop and forwardprop functions in the + folder?
Mehdi
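For what it's worth, in the +package layout the gradient comes from a separate da_dn.m, so if your function's derivative differs from purelin's constant 1 you do need to supply that as well as apply.m. A hedged sketch for the fast-sigmoid formula used elsewhere in this thread (the (n,a,param) signature follows the R2012-era +tansig package and may differ in your release):

```matlab
function d = da_dn(n, a, param)
%DA_DN  Derivative of a = n./(0.25 + abs(n)) with respect to n.
% Like +tansig/da_dn (which returns 1 - a.*a), this reuses the cached
% output a instead of recomputing from n:
%   da/dn = 0.25 ./ (0.25 + abs(n)).^2 = (1 - abs(a)).^2 / 0.25
d = (1 - abs(a)).^2 / 0.25;
end
```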
Bob
on 10 Dec 2012
0 votes
Greg Heath
le 11 Déc 2012
Edited: DGM on 23 Feb 2023
I cannot understand why you think y2 is better than y1:
x = -6:0.1:6;
y1 = x./(0.25 + abs(x));
y2 = x.*(1 - 0.52*abs(x/2.6)); % (for -2.5 < x < 2.5)
figure
hold on
plot(x,y1)
plot(x,y2,'r')
legend('y1','y2')
mladen
on 26 Mar 2013
0 votes
Could anybody upload some examples of a modified tansig.m and +tansig folder? This would be very helpful for my project and for other people too. Thank you.
1 comment
Nn Sagita
on 29 Aug 2013
If you have any examples of how to modify a transfer function, please share them with me. Thank you.
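Pending a proper upload, here is a minimal sketch of the package piece (the name mytf and the formula are hypothetical; compare the full set of files against toolbox/nnet/nnet/nntransfer/+tansig in your own installation, since apply.m alone may not be sufficient):

```matlab
% Layout on the MATLAB path, mirroring tansig.m and its +tansig/ folder:
%   mytf.m         - top-level wrapper (copy tansig.m and rename)
%   +mytf/apply.m  - the forward computation shown below

function a = apply(n, param)
%APPLY  Elementwise forward pass (contents of +mytf/apply.m).
a = n ./ (0.25 + abs(n));   % hypothetical fast-sigmoid formula
end
```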
mladen
on 29 Mar 2013
Thank you Bob. Nice trick with feedforwardnet.m (good for permanent use). I've managed to do this, but some new questions arise:
- How do I use param in apply(n,param)? (more info: matlabcentral/answers/686)
- How can I use different transfer functions within the same layer?
- My apply function looks something like this:
function A = apply(n,param)
%....
A=a1.*a2;
end
Now I would like to reuse a1 and a2 to speed up the derivative computation in da_dn.m (this has already been done with tansig.m, but using the final value, A in my code). Is it possible?
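On that last point: if A = a1.*a2, the product rule gives the derivative directly in terms of a1 and a2, so caching them would indeed save work. Whether you can actually pass them into da_dn.m is release-dependent, since its stock signature only receives n, a, and param; a common workaround is recomputing the cheap factor inside da_dn. A sketch with hypothetical factors:

```matlab
% If A = a1 .* a2 with a1 = f1(n) and a2 = f2(n), the product rule gives
%   dA/dn = (da1/dn) .* a2 + a1 .* (da2/dn)
function d = da_dn(n, a, param)
% Hypothetical factors: a1 = n (ramp), a2 = exp(-n.^2) (Gaussian window)
a1 = n;
a2 = exp(-n.^2);
d  = 1 .* a2 + a1 .* (-2*n .* a2);   % product rule, term by term
end
```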