Use ReLU function for lstmLayer
I would like to change the StateActivationFunction of lstmLayer to the ReLU function, but only 'tanh' and 'softsign' are supported in the Deep Learning Toolbox.
Is there any way to change the activation function, or to make a custom lstmLayer with ReLU as the state activation?
Answers (1)
slevin Lee
on 21 Oct 2022
The documentation only lists the following for GateActivationFunction — the activation function applied to the gates:
- 'sigmoid' – Use the sigmoid function σ(x) = (1 + e^(−x))^(−1).
- 'hard-sigmoid' – Use the hard sigmoid function.
Likewise, StateActivationFunction only accepts 'tanh' and 'softsign' — there is no ReLU option.
╮(╯▽╰)╭
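Since the built-in lstmLayer does not expose ReLU as a state activation, one workaround is to write a custom layer that implements the LSTM recurrence yourself and substitutes relu for tanh. The sketch below is a minimal, hypothetical example (the class name reluLstmLayer and the weight initialization are my own choices, not part of the toolbox); it only defines predict, so automatic differentiation handles training, but it omits state output, initialization options, and GPU tuning that the built-in layer provides.

```matlab
classdef reluLstmLayer < nnet.layer.Layer & nnet.layer.Formattable
    % Hypothetical custom LSTM layer using ReLU as the state activation.
    % Expects a "CBT" (channel x batch x time) dlarray as input.

    properties
        NumHiddenUnits
    end

    properties (Learnable)
        InputWeights      % 4*H x C, stacked [input; forget; cell; output]
        RecurrentWeights  % 4*H x H
        Bias              % 4*H x 1
    end

    methods
        function layer = reluLstmLayer(numHiddenUnits, numFeatures, name)
            layer.Name = name;
            layer.NumHiddenUnits = numHiddenUnits;
            % Simple small random initialization (assumption, not the
            % toolbox default of glorot/orthogonal).
            layer.InputWeights     = 0.01*randn(4*numHiddenUnits, numFeatures);
            layer.RecurrentWeights = 0.01*randn(4*numHiddenUnits, numHiddenUnits);
            layer.Bias             = zeros(4*numHiddenUnits, 1);
        end

        function Z = predict(layer, X)
            H = layer.NumHiddenUnits;
            X = stripdims(X);                 % work with plain indexing
            [~, B, T] = size(X);
            h = zeros(H, B, 'like', X);       % hidden state
            c = zeros(H, B, 'like', X);       % cell state
            Z = zeros(H, B, T, 'like', X);
            for t = 1:T
                g  = layer.InputWeights*X(:,:,t) ...
                   + layer.RecurrentWeights*h + layer.Bias;
                ig = sigmoid(g(1:H, :));          % input gate
                fg = sigmoid(g(H+1:2*H, :));      % forget gate
                cc = relu(g(2*H+1:3*H, :));       % cell candidate: ReLU, not tanh
                og = sigmoid(g(3*H+1:4*H, :));    % output gate
                c  = fg.*c + ig.*cc;
                h  = og.*relu(c);                 % ReLU state activation, not tanh
                Z(:,:,t) = h;
            end
            Z = dlarray(Z, "CBT");
        end
    end
end
```

You could then use it in a layer array in place of lstmLayer, e.g. `[sequenceInputLayer(numFeatures), reluLstmLayer(128, numFeatures, "relu_lstm"), fullyConnectedLayer(numClasses), softmaxLayer]`. Note that an unbounded ReLU cell state can grow without limit over long sequences, which is one reason the toolbox restricts the state activation to saturating functions.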