
How to conditionally define a learnable property?

3 views (last 30 days)
John Smith
John Smith on 23 Mar 2023
Commented: Catalytic on 24 Mar 2023
Hello,
Is it possible, and if so how, to define a learnable property based on a condition checked in the layer constructor?
What I mean is that I currently define all learnable properties in the properties(Learnable) block of the layer definition. I would like some properties to exist only if a certain condition, checked in the constructor, is met, e.g.,
properties(Learnable)
end

function self = constructor(cond)
    if cond
        self.condprop = addprop(self,'condprop');
        self.condprop.Learnable = true;
    end
end
I tried using dynamic properties (dynamicprops), but this doesn't work because dynamicprops inherits from the handle class, which the other superclasses of the layer do not.
Thanks

Answers (3)

Matt J
Matt J on 23 Mar 2023
Edited: Matt J on 23 Mar 2023
If you really must have a layer with different properties based on a conditional flag, it would probably be better simply to replace the layer in the network with a different layer class, which can also be done conditionally:
loc = contains({Layers.Description}, something);
if cond
    Layers(loc) = newlayer;
end
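A minimal sketch of this idea, assuming hypothetical custom layer classes myLayerA and myLayerB and a flag cond (none of these names come from the original post, and the surrounding architecture is invented for illustration):

```matlab
% Sketch: choose the layer class conditionally before assembling the network.
% myLayerA, myLayerB, and cond are hypothetical placeholders.
layers = [
    featureInputLayer(10)
    myLayerA                 % default choice, with some identifying Description
    fullyConnectedLayer(1)
    ];
if cond
    loc = contains({layers.Description},'custom');  % find the layer to swap
    layers(loc) = myLayerB;                         % swap in the alternative class
end
net = dlnetwork(layers);
```

Since the swap happens before the dlnetwork is assembled, only the chosen class's learnable properties ever appear in the network.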

Matt J
Matt J on 23 Mar 2023
An indirect solution would be to have an additional property Wknown that lets you provide an overriding prior value for a particular learnable property W.
classdef myLayer < nnet.layer.Layer % ...
    properties
        Wknown = [];   % override for learnable property W
    end
    properties(Learnable)
        W
    end
    methods
        function layer = myLayer(wknown)
            if nargin > 0
                layer.Wknown = wknown;
            end
        end
    end
When no value for Wknown is provided in the constructor, your forward() and backward() methods treat the learnable parameter W in the normal way. When Wknown is provided, you write forward() to use Wknown instead of W when creating the output prediction, and you write backward() to return dLdW = 0.
    methods
        function [Z,state,memory] = forward(layer,X)
            cond = isempty(layer.Wknown);
            if cond
                W = layer.W;        % learn W normally
                Z = ...
            else
                W = layer.Wknown;   % use the fixed, user-supplied value
                Z = ...
            end
        end
        function [dLdX,dLdW,dLdSin] = backward(layer,X,Z,dLdZ,dLdSout,memory)
            cond = isempty(layer.Wknown);
            if cond
                W = layer.W;
                dLdW = ...
            else
                W = layer.Wknown;
                dLdW = 0;           % no gradient flows to the fixed value
            end
        end
    end
end
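For clarity, a brief usage sketch of the two modes (using the hypothetical myLayer class above):

```matlab
% Sketch: the same layer class used in both modes.
trainable = myLayer();     % Wknown stays empty, so W is learned as usual
frozen    = myLayer(0.5);  % Wknown = 0.5 bypasses W; backward() returns dLdW = 0
```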

Matt J
Matt J on 23 Mar 2023
Edited: Matt J on 23 Mar 2023
You could also set the learning rate factor of a particular parameter to zero, based on your conditional flag, which would prevent that parameter from ever being updated and thus treat it as a constant.
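A minimal sketch of this, assuming a hypothetical custom layer class myLayer with a Learnable property W and a flag cond:

```matlab
% Sketch: freeze the learnable parameter W via its learn-rate factor.
layer = myLayer();
if cond
    % With a zero factor, the solver never updates W, so it acts as a constant.
    layer = setLearnRateFactor(layer,'W',0);
end
```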
  2 comments
John Smith
John Smith on 23 Mar 2023
Thanks for the suggestions.
However, this and the previous methods still allocate the learnable property W. I was looking for a way to avoid allocating it when it is not needed, so that it does not appear in the Learnables table of the dlnetwork.
Matt J
Matt J on 23 Mar 2023
Edited: Matt J on 23 Mar 2023
If the goal is to save on memory allocation, you could just assign a scalar or NaN to it when it is not going to be used.
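A sketch of that idea (myLayer and useW are hypothetical placeholders):

```matlab
% Sketch: replace an unused weight array with a scalar placeholder.
layer = myLayer();
if ~useW
    layer.W = NaN;   % W still appears in Learnables, but costs almost no memory
end
```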


Products


Version

R2022b
