Is there a way I can add a linear layer to my recurrent network?

I am training a recurrent network that has an LSTM layer, and I would like to add a linear layer after the LSTM. I found documentation on how to create a linear network, but not a linear layer. Also, MATLAB does not allow me to add the linear network as a layer after the LSTM.
This is the basic architecture of the network:
net = feedforwardnet;
layers = [ ...
    sequenceInputLayer(inputSize)
    lstmLayer(numHiddenUnits)
    linearlayer
    fullyConnectedLayer(numResponses)
    regressionLayer
    ];
Whenever the linearlayer function is added, this error message appears:
Error using nnet.cnn.layer.SequenceInputLayer/vertcat
Cannot convert from 'network' to 'nnet.cnn.layer.SequenceInputLayer'.
Error in LSTM_Paper1_Copy (line 11)
sequenceInputLayer(inputSize)

Accepted Answer

J R on 30 Jun 2018
Edited: J R on 30 Jun 2018
I think the fullyConnectedLayer already acts as a linear layer (and technically is an actual linear layer), so you can use it in place of linearlayer.
"A fully connected layer multiplies the input by a weight matrix W and then adds a bias vector b."
In your case, if you put this kind of layer right before the other fullyConnectedLayer, it probably won't have any influence on the result, because two linear layers in a row can be replaced by a single linear layer.
The result of two linear layers is W2*(W1*input + b1) + b2 = (W2*W1)*input + (W2*b1 + b2), which is exactly what one linear layer with weight matrix W = W2*W1 and bias b = W2*b1 + b2 would produce.
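As a quick numerical check of that algebra (arbitrary example dimensions, not tied to your network):
x  = randn(5,1);
W1 = randn(3,5);  b1 = randn(3,1);   % first linear layer
W2 = randn(2,3);  b2 = randn(2,1);   % second linear layer

twoLayers = W2*(W1*x + b1) + b2;        % two stacked linear layers
oneLayer  = (W2*W1)*x + (W2*b1 + b2);   % single equivalent layer

max(abs(twoLayers - oneLayer))          % ~0 up to rounding error
So adding an extra linear layer before the existing fullyConnectedLayer only reparameterizes the same mapping.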
