How to include four hidden layers by removing the LSTM layers?

I have code that implements LSTM layers as below:
inputSize = 12;
numHiddenUnits1 = 48;
numHiddenUnits2 = 48;
numHiddenUnits3 = 48;
numHiddenUnits4 = 48;
numClasses = 12;
layers = [ ...
sequenceInputLayer(inputSize)
lstmLayer(numHiddenUnits1,'OutputMode','sequence')
lstmLayer(numHiddenUnits2,'OutputMode','sequence')
lstmLayer(numHiddenUnits3,'OutputMode','sequence')
lstmLayer(numHiddenUnits4,'OutputMode','sequence')
fullyConnectedLayer(numClasses)
reluLayer
regressionLayer]
Now I want to implement the four hidden layers without LSTM. Can anyone help me modify the code?

 Accepted Answer

I understand you would like to add 4 fully connected hidden layers without LSTM.
The same fullyConnectedLayer() function can be used with your choice of hyperparameters to achieve this. Please refer to the fullyConnectedLayer documentation for details on the above-mentioned function.
I hope this helps.

4 Comments

Yes, I want four hidden layers (48 hidden nodes each) without LSTM.
I tried the following lines:
inputSize = 12;
numHiddenUnits1 = 48;
numHiddenUnits2 = 48;
numHiddenUnits3 = 48;
numHiddenUnits4 = 48;
numClasses = 12;
layers = [ ...
sequenceInputLayer(inputSize)
fullyConnectedLayer(numHiddenUnits1,numHiddenUnits2,numHiddenUnits3,numHiddenUnits4,numClasses)
reluLayer
regressionLayer]
but I get an error. Could you please help me resolve it?
For 4 hidden layers, you can try the following:
inputSize = 12;
numHiddenUnits1 = 48;
numHiddenUnits2 = 48;
numHiddenUnits3 = 48;
numHiddenUnits4 = 48;
numClasses = 12;
layers = [ ...
sequenceInputLayer(inputSize)
fullyConnectedLayer(numHiddenUnits1)
reluLayer
fullyConnectedLayer(numHiddenUnits2)
reluLayer
fullyConnectedLayer(numHiddenUnits3)
reluLayer
fullyConnectedLayer(numHiddenUnits4)
reluLayer
fullyConnectedLayer(numClasses)
reluLayer
regressionLayer]
Please refer to this link for in-depth documentation on all the deep learning layers, along with examples of how to use them.
I hope this helps.
Could you please help me add sine, cosine, and tanh activations in place of the ReLU layers in the above code?
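One possible sketch: Deep Learning Toolbox ships a built-in tanhLayer, and (in R2021b and later) a functionLayer that wraps an arbitrary element-wise function as a layer, which could serve for sine and cosine activations. The sizes below follow the earlier code; the layer names passed to functionLayer are arbitrary labels chosen for this example.
```matlab
% Sketch: swapping the ReLU activations for tanh, sine, and cosine.
% tanhLayer is built in; sine/cosine use functionLayer (requires R2021b+).
inputSize = 12;
numHiddenUnits = 48;
numClasses = 12;
layers = [ ...
    sequenceInputLayer(inputSize)
    fullyConnectedLayer(numHiddenUnits)
    tanhLayer                             % tanh activation
    fullyConnectedLayer(numHiddenUnits)
    functionLayer(@sin,'Name','sine')     % sine activation via functionLayer
    fullyConnectedLayer(numHiddenUnits)
    functionLayer(@cos,'Name','cosine')   % cosine activation via functionLayer
    fullyConnectedLayer(numHiddenUnits)
    tanhLayer('Name','tanh2')
    fullyConnectedLayer(numClasses)
    regressionLayer];
```
Note that no activation follows the final fully connected layer here, so the regression output is unrestricted; whether that suits your data is for you to judge.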
