How can I constrain neural network weights?

Luke Wilhelm on 7 Dec 2012
I am using the Neural Network Toolbox to create a feedforward network. The input is one 4x1 vector; then there is one 4-neuron hidden layer, one 6-neuron hidden layer, and one 4-neuron output layer. I would like to constrain the final 4x6 matrix of layer weights so that the weight values cannot be negative. I realize this will probably hurt the network's accuracy, but for the purposes of my research I would like to see what the results are.
Is it possible to constrain the layer weights in this way? I have found how to set layer weights to a specified value and prevent them from learning using net.layerWeights{i,j}.learn=false;, but not how to allow the weights to change while preventing them from becoming negative.
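For reference, one workaround I have seen suggested (not a built-in feature) is projected training: run train one epoch at a time and clip the negative entries of the constrained weight matrix back to zero after each epoch. A minimal sketch, assuming the classic toolbox API (feedforwardnet, configure, train, net.LW) and random placeholder data in place of my real inputs and targets:

```matlab
% Projected-training sketch: alternate one-epoch training runs with a
% projection step that clips the 4x6 output-layer weight matrix
% (net.LW{3,2}, from the 6-neuron hidden layer to the 4-neuron output
% layer) onto the non-negative orthant.
x = rand(4, 100);                    % placeholder 4xN inputs
t = rand(4, 100);                    % placeholder 4xN targets

net = feedforwardnet([4 6]);         % two hidden layers: 4 and 6 neurons
net = configure(net, x, t);          % output layer sized to 4 from t
net.trainParam.epochs = 1;           % one epoch per call to train
net.trainParam.showWindow = false;   % suppress the training GUI

for k = 1:20
    net = train(net, x, t);
    net.LW{3,2} = max(net.LW{3,2}, 0);  % projection: no negative weights
end
```

This keeps the constraint satisfied at every checkpoint, at the cost of the projection perturbing the optimizer's trajectory each epoch.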
Thanks, Luke
1 comment
Greg Heath on 9 Dec 2012
One hidden layer is sufficient for a universal approximator.
If the hidden node activation functions are all odd, changing the sign of all weights connected to one activation function will not change the output.
Therefore, if there is only one output node, the task is easy.
Otherwise, it will not work in general.
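The sign symmetry Greg describes can be checked numerically. Using tanh (the same odd function tansig implements) and arbitrary made-up weights, flipping the sign of the weights into and out of a single hidden unit leaves that unit's contribution to the output unchanged:

```matlab
% Odd activation => sign symmetry: tanh(-z) = -tanh(z), so flipping
% all weights connected to a hidden unit cancels out.
x  = [0.3; -1.2];        % arbitrary 2x1 input
w1 = [0.5 -0.7];         % weights into one hidden unit
w2 = 1.4;                % weight out of that hidden unit

y1 = w2 * tanh(w1 * x);          % original signs
y2 = (-w2) * tanh((-w1) * x);    % all signs flipped
disp(abs(y1 - y2) < 1e-12)       % prints 1: outputs are identical
```

This is why, with a single output node, you can flip the signs unit by unit to make the output weights non-negative without changing the network function, but with several output nodes sharing the hidden units the flips conflict and it does not work in general.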


Answers (2)

R L on 24 Jul 2015
I would like to ask how you set a subset of the layer weights to a specified value while preventing their learning using net.layerWeights{i,j}.learn=false.
Did you ever solve your question about constraining the weights to have a specified sign while training? Thanks.

Sara Perez on 12 Sep 2019
You can set the layer property 'WeightLearnRateFactor' to zero, so the weights won't be modified or learned.
More info here:
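A small sketch of that suggestion, assuming the newer Deep Learning Toolbox layer-array API (this is a different API from the net.layerWeights one in the original question). Note that a learn-rate factor of 0 freezes the weights entirely; it does not enforce a sign constraint, so it only helps if the frozen layer is initialized with the non-negative values you want:

```matlab
% Layer array matching the 4-4-6-4 network from the question.
% The final fully connected layer (6 inputs -> 4 outputs, i.e. the
% 4x6 weight matrix) has its weight learn-rate factor set to 0,
% so training leaves those weights at their initial values.
layers = [
    featureInputLayer(4)
    fullyConnectedLayer(4)
    tanhLayer
    fullyConnectedLayer(6)
    tanhLayer
    fullyConnectedLayer(4, 'WeightLearnRateFactor', 0)  % frozen layer
    regressionLayer];
```

To keep the frozen layer non-negative, you would also set its 'Weights' property to a non-negative initial matrix before training.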
