Could anyone help me with how to include sine, cosine, and tanh activation functions when training a neural network?
In my code I have written the following layer array:
layers = [ ...
    sequenceInputLayer(inputSize)
    fullyConnectedLayer(numHiddenUnits1)
    reluLayer
    fullyConnectedLayer(numHiddenUnits2)
    reluLayer
    fullyConnectedLayer(numClasses)
    reluLayer
    regressionLayer];
Now I want to run the same network using sine, cosine, and tanh activations instead of ReLU.
Could anyone please help me with this?
Answers (1)
Akshat
27 August 2024
Hi Jaah,
I see you want to use different activation functions instead of ReLU.
In the case of tanh, you can use the built-in tanhLayer; see its documentation for details.
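For example, here is a minimal sketch of the layer array from your question with the ReLU layers swapped for tanhLayer (assuming inputSize, numHiddenUnits1, numHiddenUnits2, and numClasses are defined as in your code; I have also dropped the activation after the final fully connected layer, which is common for regression, but you can keep it if you prefer):
% Sketch: tanh activations in place of ReLU, same structure as the question
layers = [ ...
    sequenceInputLayer(inputSize)
    fullyConnectedLayer(numHiddenUnits1)
    tanhLayer
    fullyConnectedLayer(numHiddenUnits2)
    tanhLayer
    fullyConnectedLayer(numClasses)
    regressionLayer];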
Using "sine" and "cosine" as activation functions is not a viable choice, as "sine" and "cosine" are periodic functions and they have many local extrema. Thus, we lose the uniqueness of values. Due to this reason, it is not a popular choice to use these functions as the activation functions.
Hope this helps!
Akshat