Update a parameter which is not learnable in a custom deep learning layer
I am working on a deep learning project in which I use a custom layer. In this layer, I have a parameter, α, that depends on the weights. I want it to be updated whenever the weights are updated, so, for me, α is not a learnable parameter. Here is my predict function:
function [Z] = predict(layer, X)
    % Z = predict(layer, X) forwards the input data X through the
    % layer and outputs the result Z.
    W = layer.Weights;
    % Initialize output (note: numel needs an argument; assuming the
    % intended size here is the number of elements in X)
    Z = zeros(layer.OutputSize, numel(X), "single");
    % Alpha coefficient calculation
    for j = 1:numel(layer.Graphe.neighbors(layer.TargetNode))
        % Weighted addition
Alpha is declared as a property of the layer here.
However, when my training ends, net.Layers(3,1).Alpha gives me the initial value of α, and the same thing happens in the backward function.
What can I do to update α?
Thank you in advance for your future help.
Katja Mogalle on 21 Jan 2022
The technical (and not very helpful) answer is that custom layers are not handle classes, and the predict function doesn't return the modified layer object, so the framework doesn't, and can't, receive the updated layer.
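This value-class behavior can be illustrated with a minimal plain-MATLAB sketch (the class and property names here are hypothetical, not part of the deep learning framework):

```matlab
% Hypothetical value class: assignments inside a method change only a
% local copy, which is discarded when the method returns.
classdef ValueLayer
    properties
        Alpha = 0;
    end
    methods
        function Z = predict(obj, X)
            obj.Alpha = sum(X);  % modifies only the local copy of obj
            Z = X;
        end
    end
end
```

Calling predict(layer, [1 2 3]) and then inspecting layer.Alpha still shows 0: because ValueLayer is not a handle class and predict does not return obj, the caller never sees the modified copy. Custom layers behave the same way.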
The question is, what do you plan to do with Alpha? From the sounds of it, you want to look at it after training? Or during training? Is it used anywhere in the training process?
One possible solution would be to declare a second output on the custom layer (look for the NumOutputs and OutputNames properties on the custom layer documentation page) which returns the alpha value. If you use a dlnetwork and a custom training loop, you can easily read the second (Alpha) output at any time during or after training, without actually using it in the training process.
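A minimal sketch of that idea (the layer class name, the constructor arguments, and both computations in predict are placeholder assumptions; the real alpha formula comes from the code in the question):

```matlab
% Hypothetical custom layer with a second, non-learnable output "alpha"
% that is recomputed from the current weights on every forward pass.
classdef alphaLayer < nnet.layer.Layer
    properties (Learnable)
        Weights
    end
    properties
        OutputSize
    end
    methods
        function layer = alphaLayer(outputSize, name)
            layer.Name        = name;
            layer.OutputSize  = outputSize;
            layer.NumOutputs  = 2;                 % Z and Alpha
            layer.OutputNames = ["out" "alpha"];
        end
        function [Z, Alpha] = predict(layer, X)
            W = layer.Weights;
            Alpha = sum(abs(W), "all");  % placeholder for the real alpha formula
            Z = W * X;                   % placeholder forward computation
        end
    end
end
```

In a custom training loop, the current alpha can then be requested by name from forward, e.g. [Z, alphaVal] = forward(net, dlX, Outputs=["myalpha/out" "myalpha/alpha"]); (where "myalpha" is the assumed layer name), so it never needs to feed into the loss.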
I hope this helps you move forward with your project.