gradient descent for custom function
I have four equations:
1) e = m - y
2) y = W_3 * h
3) h = z + W_2 * z + f
4) f = W_1 * x
I want to update W_1, W_2, and W_3 to minimize the cost function J = e^T e using gradient descent.
x is an input, y is the output, and m is the desired value for each sample in the dataset.
I would like to do W_1 = W_1 - eta * grad(J)_W_1
W_2 = W_2 - eta * grad(J)_W_2
W_3 = W_3 - eta * grad(J)_W_3
Going through the documentation, I found that you can train standard neural networks. But note that my model uses custom functions, so I guess it would be more a matter of using a built-in optimization function.
Any ideas?
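For reference, worked out with the chain rule these gradients are grad(J)_W_3 = -2 e h^T, grad(J)_W_2 = -2 W_3^T e z^T, and grad(J)_W_1 = -2 W_3^T e x^T, so a plain per-sample update loop would look like the sketch below (the step size eta, the iteration count, and the initial weights are placeholders, not from the thread):
% Plain gradient descent on J = e'*e for one sample (x, z, m).
% eta, numIters, and the initial W1, W2, W3 are placeholders.
for k = 1:numIters
    f = W1*x;
    h = z + W2*z + f;
    y = W3*h;
    e = m - y;
    gW3 = -2*e*h.';        % grad(J) w.r.t. W_3
    gh  = -2*W3.'*e;       % grad(J) w.r.t. h (chain rule through y = W_3*h)
    gW2 = gh*z.';          % grad(J) w.r.t. W_2
    gW1 = gh*x.';          % grad(J) w.r.t. W_1
    W3 = W3 - eta*gW3;
    W2 = W2 - eta*gW2;
    W1 = W1 - eta*gW1;
end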
2 Comments
Matt J
on 24 Apr 2024
x is an input, y is the output, and m is the desired value for each sample in the dataset.
It looks like z is also an input; it is not given by any of the other equations.
Answers (2)
Matt J
on 24 Apr 2024
Edited: Matt J on 24 Apr 2024
so I guess it would be more a matter of using a built-in optimization function.
No, not necessarily. Your equations can be implemented with fullyConnectedLayer and additionLayer objects.
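As a rough illustration of that layer-based approach (a sketch only; the sizes nx, nz, ny are placeholders, and the biases are pinned to zero so each fullyConnectedLayer acts as a pure matrix multiply):
% Sketch of the model as a layer graph; nx, nz, ny are placeholder sizes.
lg = layerGraph();
lg = addLayers(lg, featureInputLayer(nx, 'Name', 'x'));
lg = addLayers(lg, featureInputLayer(nz, 'Name', 'z'));
lg = addLayers(lg, fullyConnectedLayer(nz, 'Name', 'W1', ...   % f = W_1*x
        'Bias', zeros(nz,1), 'BiasLearnRateFactor', 0));
lg = addLayers(lg, fullyConnectedLayer(nz, 'Name', 'W2', ...   % W_2*z
        'Bias', zeros(nz,1), 'BiasLearnRateFactor', 0));
lg = addLayers(lg, additionLayer(3, 'Name', 'add'));           % h = z + W_2*z + f
lg = addLayers(lg, fullyConnectedLayer(ny, 'Name', 'W3', ...   % y = W_3*h
        'Bias', zeros(ny,1), 'BiasLearnRateFactor', 0));
lg = connectLayers(lg, 'x', 'W1');
lg = connectLayers(lg, 'z', 'W2');
lg = connectLayers(lg, 'z', 'add/in1');
lg = connectLayers(lg, 'W2', 'add/in2');
lg = connectLayers(lg, 'W1', 'add/in3');
lg = connectLayers(lg, 'add', 'W3');
net = dlnetwork(lg);
From there the weights can be trained with a custom dlfeval/dlgradient loop, or with a mean-squared-error loss, which minimizes the same e^T e up to a constant factor.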
3 Comments
Torsten
on 24 Apr 2024
Moved: Torsten on 24 Apr 2024
e = m - y = m - W_3*h = m - W_3*(z + W_2*z + W_1*x)
Now if you define
W1 = W_3 + W_3*W_2 and W2 = W_3*W_1
this becomes
e = m - W1*z - W2*x
so minimizing ||e||_2 is the linear least-squares problem
min: || [z.', x.'] * [W1.'; W2.'] - m.' ||_2
(stack the samples row-wise for the whole dataset) and you can use "lsqlin" to solve for W1 and W2.
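In the unconstrained case the same problem can be solved in one shot with backslash; lsqlin handles one right-hand-side column at a time and is mainly useful if you also need linear constraints. A sketch, assuming data matrices Z (N-by-nz), X (N-by-nx), and M (N-by-ny) with one sample per row:
% Linearized least-squares fit; Z, X, M are assumed data matrices.
A = [Z, X];                    % N-by-(nz+nx) design matrix
W = A \ M;                     % least-squares solution, one column per output
nz = size(Z, 2);
W1 = W(1:nz, :).';             % estimate of W_3 + W_3*W_2   (ny-by-nz)
W2 = W(nz+1:end, :).';         % estimate of W_3*W_1         (ny-by-nx)
Note that this recovers only the combined matrices W1 and W2; the original W_1, W_2, and W_3 are not uniquely determined by them.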
10 Comments