MATLAB Answers

Get input/output gradient of neural network

39 views (last 30 days)
Aaron Kandel on 11 Sep 2020
Edited: David Leather on 24 Nov 2020 at 23:07
MATLAB's built-in functions in the Neural Network Toolbox seem to provide a good set of options for getting the gradient of the network performance with respect to the network parameters. Is there a way to get the gradient of the network output with respect to the network input?


Answers (1)

Mahesh Taparia on 14 Sep 2020
Hi
In general, a neural network learns the weights that minimize the cost/loss function. The weights are updated iteratively using the derivative of the loss function with respect to the weights.
During training, the inputs are fixed data, so the derivative of the loss with respect to the input plays no role in the weight updates. If the network is fed two different inputs, you can approximate such a gradient numerically as (Loss2 - Loss1)/(X2 - X1), where Loss is the value of the network loss for input X. This quantity is not used while training the network.
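As a rough illustration of this finite-difference idea, applied to the network output rather than the loss, here is a minimal sketch. It assumes net is a trained shallow network object (e.g. from fitnet) that can be called as net(x); x0 is a hypothetical operating point.
% Numerical Jacobian dY/dX of a trained network at a point x0.
x0 = [0.5; -1.2];                      % hypothetical input point
h  = 1e-6;                             % finite-difference step
y0 = net(x0);
J  = zeros(numel(y0), numel(x0));      % Jacobian: outputs-by-inputs
for i = 1:numel(x0)
    e      = zeros(size(x0));
    e(i)   = h;
    J(:,i) = (net(x0 + e) - net(x0 - e)) / (2*h);   % central difference
end
A central difference is used because it is second-order accurate, which matters when h must be kept small.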
Hope it helps!

  5 Comments

Mahesh Taparia on 17 Sep 2020
Hi
A neural network is generally highly nonlinear, since it is a composition of several activation functions. As of now, there is no built-in function which directly calculates the gradient of the network output with respect to the network input. The approach is to derive the derivative equations of the network by hand. You can use the trained network, which contains the trained weights and the activation functions used, to evaluate them. For example, let F1 and F2 be the activation functions, W1 and W2 the trained weights, and B1 and B2 the trained biases of each layer; then
Y=F2(W2*F1(W1*X+B1)+B2);
In this case, the Jacobian Y' (the derivative of Y with respect to X) follows from the chain rule:
Y' = F2'(W2*F1(W1*X+B1)+B2) * W2 * F1'(W1*X+B1) * W1;
where F1' and F2' are applied elementwise, so in the matrix product they act as diagonal matrices.
So each layer's pre-activation is passed through the derivative of its activation function, and the result is multiplied by that layer's weight matrix. The weights and biases for each layer are stored in the trained network, and the activation derivatives can be written as separate functions for simplicity.
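As a minimal sketch of this chain rule, assuming a shallow network object with a tansig (tanh) hidden layer and a linear (purelin) output layer; the parameter extraction via net.IW, net.LW, and net.b and the input point X are assumptions:
% Extract trained parameters from a shallow network object (assumed).
W1 = net.IW{1,1};  B1 = net.b{1};      % input-to-hidden weights and bias
W2 = net.LW{2,1};  B2 = net.b{2};      % hidden-to-output weights and bias
F1  = @tanh;   dF1 = @(z) 1 - tanh(z).^2;    % tansig and its derivative
F2  = @(z) z;  dF2 = @(z) ones(size(z));     % purelin and its derivative
X  = randn(size(W1,2), 1);             % hypothetical single input column
Z1 = W1*X + B1;                        % hidden-layer pre-activation
Z2 = W2*F1(Z1) + B2;                   % output-layer pre-activation
Y  = F2(Z2);                           % network output
% Jacobian dY/dX: activation derivatives act as diagonal matrices
dYdX = diag(dF2(Z2)) * W2 * diag(dF1(Z1)) * W1;
Note that this sketch ignores any input/output preprocessing (such as mapminmax) that the toolbox may have attached to the network; those mappings contribute their own factors to the chain rule.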
Hope it helps!
Aaron Kandel on 21 Sep 2020
Thank you for your input and help! I wish MATLAB had full built-in functionality to provide analytic gradients, but hard-coding them should work for the project I'm working on.
David Leather on 24 Nov 2020 at 23:07
This seems like an oversight. When applying a trained neural network to other applications, it is essential to be able to evaluate the gradient of the network output, and not only of the loss function.
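One possible route, assuming Deep Learning Toolbox R2019b or later and a network represented as a dlnetwork object, is automatic differentiation with dlfeval/dlgradient. This is only a sketch; the input size and variable names are hypothetical:
% Gradient of the (summed) network output with respect to the input.
% Assumes 'net' is a dlnetwork object.
x = dlarray(randn(4, 1), 'CB');        % hypothetical 4-feature input, channel-by-batch
[y, dydx] = dlfeval(@outputGradient, net, x);

function [y, dydx] = outputGradient(net, x)
    y = forward(net, x);
    % dlgradient needs a scalar, so sum the outputs; this yields the
    % row-sum of the Jacobian (loop over outputs for the full Jacobian).
    dydx = dlgradient(sum(y, 'all'), x);
end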


Release

R2020a
