LSTM Custom Regression output layer for time series

Hello everyone!
I'm implementing an LSTM with a custom regression output layer, as shown in https://www.mathworks.com/help/deeplearning/ug/define-custom-regression-output-layer.html.
Ideally, I would like to pass the network input as an extra argument to the forward and backward loss functions, but in the toolbox these functions are limited to layer, Y and T.
My network consists of the following layers (assembled into an array in the sketch below the list):
  • layer1 = sequenceInputLayer(numFeatures);
  • layer2 = lstmLayer(numHiddenUnits);
  • layer3 = fullyConnectedLayer(numResponses);
  • layer4 = myRegressionLayer;
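Assembled into an array, a sketch of the architecture using the variable names above:
    layers = [
        sequenceInputLayer(numFeatures)        % numFeatures = 2
        lstmLayer(numHiddenUnits)
        fullyConnectedLayer(numResponses)      % numResponses = 1
        myRegressionLayer];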
I've created myRegressionLayer(layer, T, Y, X) and I've tried to modify the functions in both nnet.cnn.layer.RegressionLayer and nnet.cnn.layer.Layer to expect an extra input variable (the network input, X), but I get an error from CNNException.m:
"Error in LSTM (line 79)
[net,temp] = trainNetwork(XTrain,YTrain,layers,options);
Caused by:
Insufficient number of outputs from right hand side of equal sign to satisfy assignment."
Can anyone help me with this? I need to pass XTrain to my regression layer alongside layer, Y and T.
I'm working with time series, with 2 input features and one output.
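For reference, this is roughly how the training data for such a sequence-to-sequence regression is laid out (a sketch; the sizes here are made up):
    numObservations = 100;                 % hypothetical number of training sequences
    sequenceLength  = 50;                  % hypothetical number of time steps per sequence

    XTrain = cell(numObservations, 1);     % each cell: numFeatures-by-sequenceLength
    YTrain = cell(numObservations, 1);     % each cell: numResponses-by-sequenceLength
    for i = 1:numObservations
        XTrain{i} = rand(2, sequenceLength);   % 2 input features
        YTrain{i} = rand(1, sequenceLength);   % 1 response
    end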
Thanks!!
Silvia

Accepted Answer

Raunak Gupta on 20 Nov 2019
Hi,
For adding multiple inputs or learnable parameters to a custom layer, you can follow the documentation on defining custom deep learning layers with multiple inputs and on defining custom layers with learnable parameters.
Note that for custom layers you may have to define the forward and backward functions, which can be non-intuitive to derive from the mathematical form, especially the derivative of the loss function in the case of the backward function.
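As a rough illustration of the multiple-inputs pattern that documentation describes (only a sketch with hypothetical names, not a drop-in solution for the loss-layer problem): a custom layer derived from nnet.layer.Layer can declare a second input in its constructor and then receives both tensors in predict.
    classdef exampleTwoInputLayer < nnet.layer.Layer
        % Sketch of a custom intermediate layer that accepts two inputs, e.g.
        % the previous layer's activations and the raw sequence input routed to it.

        methods
            function layer = exampleTwoInputLayer(name)
                layer.Name = name;
                layer.NumInputs = 2;   % exposes inputs 'in1' and 'in2'
                layer.Description = "Example layer with two inputs";
            end

            function Z = predict(layer, X1, X2)
                % X1: output of the layer connected to 'in1'
                % X2: data routed to 'in2' (e.g. the sequence input)
                % Combine the two inputs; here they are concatenated along the
                % channel dimension (assuming C-by-B-by-T sequence data).
                % Replace this with whatever operation you actually need.
                Z = cat(1, X1, X2);
            end
        end
    end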
Comments
Raunak Gupta on 2 Jan 2020
Hi,
For the first question: yes, you need to define a custom layer that can take multiple inputs, but by declaring the learnable parameters of the layer to be only the weights of that custom regression layer, you can retrieve the data without creating any other learnable parameters. Therefore the derivative should only contain the parameters of the fully connected layer.
For the second question: X is the data that is fed to the layer as input, not the original training data. You can route the training data into an additional input variable X2 via connectLayers.
I hope everything is clear now. You may want to take a closer look at the two documentation pages I mentioned in the answer.
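A minimal sketch of that routing, assuming a two-input custom layer like the one sketched under the answer (layer names are made up): the serial array connects the fully connected layer to the first input, and connectLayers routes the raw sequence input to the second.
    % Assumes exampleTwoInputLayer (or similar) declares NumInputs = 2.
    layers = [
        sequenceInputLayer(numFeatures, 'Name', 'in')
        lstmLayer(numHiddenUnits, 'Name', 'lstm')
        fullyConnectedLayer(numResponses, 'Name', 'fc')
        exampleTwoInputLayer('combine')];    % 'fc' is auto-connected to 'combine/in1'

    lgraph = layerGraph(layers);
    % Route the original sequence input into the second input of the custom layer.
    lgraph = connectLayers(lgraph, 'in', 'combine/in2');
    % Note: an output layer (e.g. regressionLayer) still has to follow 'combine'
    % before the graph can be trained with trainNetwork.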
Shubham Baisthakur on 15 Jun 2023
Hello Raunak and Silvia,
I am facing exactly the same issue and I was wondering if you will be able to help. Following is the definition of my custom regression layer:
classdef customLossLayerMultiInput < nnet.layer.RegressionLayer & nnet.layer.Acceleratable
    % Custom regression layer with a physics-informed loss (RMSE plus residue term)
    % and additional properties.

    properties
        node_properties
        numFeature
    end

    methods
        function layer = customLossLayerMultiInput(name, node_properties, numFeature)
            % Constructor
            % layer.NumInputs = numInputs;
            layer.Name = name;
            layer.Description = 'Physics-Informed loss function for LSTM training';
            layer.node_properties = node_properties;
            layer.numFeature = numFeature;
        end

        function loss = forwardLoss(layer, Y, T, varargin)
            % Calculate the forward loss

            % Reshape predictions and targets
            Y = reshape(Y, [], 1);
            T = reshape(T, [], 1);
            X1 = varargin{1};
            X2 = varargin{2};

            % Sequence input data
            sequence_input_data = reshape(X1, [], layer.numFeature);

            % Calculate mean residue
            mean_residue = PI_BEM_Residue(Y, T, sequence_input_data, layer.node_properties);

            % Calculate RMSE loss
            rmse_loss = rmse(Y, T);

            % Total loss
            loss = mean_residue + rmse_loss;
        end
    end
end
And my network architecture is:
layers = [
    sequenceInputLayer(numFeatures, 'Name', 'inputLayer')                        % Define the sequence input layer and name it
    lstmLayer(num_hidden_units, 'OutputMode', 'sequence', 'Name', 'lstmLayer')   % Define the LSTM layer and name it
    fullyConnectedLayer(1, 'Name', 'fullyConnectedLayer')                        % Define the fully connected layer and name it
    dropoutLayer(x.dropout_rate, 'Name', 'dropoutLayer')                         % Define the dropout layer and name it
    customLossLayerMultiInput(LayerName, node_properties, numFeatures)
    ];

% Create a layer graph
lgraph = layerGraph(layers);
lgraph = connectLayers(lgraph, "inputLayer", strcat(LayerName, '/in2'));
However, I am not able to give the custom loss layer two inputs. As I understand it now, a custom regression layer cannot have multiple inputs. But if I use the "nnet.layer.Layer" class to define a custom layer with multiple inputs, how do I access Y and T in that layer? And how do I set this layer to work as a regression layer?


