Feedforward net - how to use LeakyReLU or scaled exponential linear unit for the hidden layers?

In a multi-layer shallow network created with feedforwardnet, how can I use activation functions such as leaky ReLU or the scaled exponential linear unit (SELU) in the hidden layers? The only function supported by default seems to be tansig for the hidden layers.
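A minimal sketch of the layer-level workaround, assuming the R2018a shallow-network API: feedforwardnet has no built-in leaky ReLU or SELU, but each layer's transfer function is a settable property, and the plain-ReLU function 'poslin' does ship with the toolbox. The 'leakyrelu' name in the final comment is a hypothetical placeholder for a custom file, not a shipped function.

```matlab
% Sketch, assuming the R2018a shallow-network API.
net = feedforwardnet([20 10]);          % two hidden layers
net.layers{1}.transferFcn = 'poslin';   % ReLU instead of the default tansig
net.layers{2}.transferFcn = 'poslin';

% For leaky ReLU or SELU you would have to supply a custom transfer
% function: copy an existing one such as tansig.m (and, in releases that
% have it, its +tansig package folder) onto the path, rename it, and edit
% the forward output and its derivative to match your activation.
% Hypothetical placeholder name, not a shipped function:
% net.layers{1}.transferFcn = 'leakyrelu';
```

Note that poslin only covers the plain-ReLU case; the custom-function route is needed for the leaky and scaled variants.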
  2 comments
Ihsan Ullah on 3 Apr 2019
Did you get an answer to your question? If you have sorted this out, would you please post the code in the comment section?
Thank you


Answers (0)

Version

R2018a
