How to use different transfer functions within the same layer of a neural network?

mladen on 24 Apr 2013
For example, a hidden layer with 3 different neurons: 1. tansig, 2. logsig, 3. elliotsig. I understand that this influences the normalization range and that training might not result in good performance, but I need it to test some other aspects. Thank you.

Answers (1)

Greg Heath on 24 Apr 2013
If this is possible with the NNTBX at all, you would have to design a custom net.
The only way I can see to do it is to use 3 hidden layers that are each connected to the input and the output but not to each other.
See the custom network section in the documentation.
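A minimal sketch of that idea, assuming the standard custom-network workflow in the Neural Network Toolbox (network, configure, init, train); the example data, layer sizes, and training settings below are illustrative, not a tested recipe:

% Three parallel "hidden" layers, each with its own transfer function,
% all feeding a single linear output layer and not connected to each other.
x = rand(5, 100);                 % illustrative inputs:  5 features, 100 samples
t = rand(1, 100);                 % illustrative targets: 1 output,   100 samples

net = network;                    % empty custom network
net.numInputs = 1;
net.numLayers = 4;                % layers 1-3 = parallel hidden layers, layer 4 = output

net.inputConnect  = [1; 1; 1; 0];           % the input feeds layers 1-3 only
net.layerConnect  = [0 0 0 0
                     0 0 0 0
                     0 0 0 0
                     1 1 1 0];              % layer 4 receives from layers 1-3
net.outputConnect = [0 0 0 1];              % only layer 4 produces the network output
net.biasConnect   = [1; 1; 1; 1];

net.layers{1}.size = 1;  net.layers{1}.transferFcn = 'tansig';
net.layers{2}.size = 1;  net.layers{2}.transferFcn = 'logsig';
net.layers{3}.size = 1;  net.layers{3}.transferFcn = 'elliotsig';
net.layers{4}.size = 1;  net.layers{4}.transferFcn = 'purelin';

for i = 1:4
    net.layers{i}.initFcn = 'initnw';       % Nguyen-Widrow initialization per layer
end
net.initFcn    = 'initlay';
net.trainFcn   = 'trainlm';
net.performFcn = 'mse';
net.divideFcn  = 'dividerand';

net = configure(net, x, t);       % set input/output sizes from the data
net = init(net);
[net, tr] = train(net, x, t);
y = net(x);                       % simulate the trained network

Because the three single-neuron layers have no connections to each other, together they behave like one 3-neuron hidden layer whose neurons use different transfer functions.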
Hope this helps.
Thank you for formally accepting my answer
Greg
  1 comment
mladen on 30 Apr 2013
These were exactly my thoughts too. If and when I finish my work on this, I'll post it here.
