
Does the input size of sequenceInputLayer have to be set to 1?
I followed the example https://ww2.mathworks.cn/help/textanalytics/ug/classify-text-data-using-deep-learning.html.
The network is as follows:
inputSize = 1;
embeddingDimension = 50;
numHiddenUnits = 80;
numWords = enc.NumWords;
numClasses = numel(categories(YTrain));
layers = [ ...
    sequenceInputLayer(inputSize)
    wordEmbeddingLayer(embeddingDimension,numWords)
    lstmLayer(numHiddenUnits,'OutputMode','last')
    fullyConnectedLayer(numClasses)
    softmaxLayer
    classificationLayer]
The only change I made was setting inputSize = 2, and then the network could not be trained, failing with an error:

Does the input size of sequenceInputLayer have to be set to 1?
Answers (1)
Zhiyu WANG on 25 May 2022
Because the input sequence data is a 1-by-10 double, each time step takes 1 token out of the sequence. In other words, every time step carries a single feature (one word index), so the sequence input layer's input size must be 1.
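You can check this yourself. A minimal sketch (assuming documentsTrain and enc are the tokenized training documents and the wordEncoding created in the linked example):

% doc2sequence converts each document into a row vector of word indices.
% Each cell element is 1-by-sequenceLength, i.e. one feature per time step.
XTrain = doc2sequence(enc,documentsTrain);
size(XTrain{1})   % e.g. 1 10 for a 10-token document

Since every time step contains just one value (a word index), which wordEmbeddingLayer then maps to an embeddingDimension-by-1 vector, sequenceInputLayer must be created with an input size of 1. Setting inputSize = 2 declares two features per time step, which no longer matches the training data produced by doc2sequence.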
