the output size of the last layer doesn't match the number of classes

I want to build a network with seven inputs and one output (11 classes: 1, 2, ..., 11).
I know I should use a combined datastore or a transformed datastore for the input.
I have already made a mat file that combines all of the inputs and the label in a cell array.
I created a 3D image array for each input and a label array for training.
Each mat file holds a 1×8 cell array (7 input arrays plus 1 label) in the variable CombinedCell:
CombinedCell =
1×8 cell array
{10×9×640 double} {10×9×640 double} {10×9×640 double} {10×9×640 double} {10×9×640 double} {10×9×640 double} {10×9×640 double} {[1]}
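Below is a minimal sketch (not from the original post) of how one such mat file could be assembled; the names X1..X7, classIdx, and sample001.mat are hypothetical placeholders, and the label is stored as a categorical with the full class list.
% Sketch: build and save one sample's mat file (hypothetical variable names).
X1 = rand(10,9,640); X2 = rand(10,9,640); X3 = rand(10,9,640); X4 = rand(10,9,640);
X5 = rand(10,9,640); X6 = rand(10,9,640); X7 = rand(10,9,640);
classIdx = 1;   % this sample's class, 1..11
% Store the seven inputs followed by the label; declaring all 11 categories here
% lets trainNetwork see the complete class list from every file.
CombinedCell = {X1, X2, X3, X4, X5, X6, X7, categorical(classIdx, 1:11)};
save('sample001.mat', 'CombinedCell');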
Then I load all of the files with the datastore function, using the following code:
function [trainingDatastore] = read_datastore(folder)
    % File datastore that loads each .mat file with load().
    tempdatastore = datastore(folder, 'ReadFcn', @load, 'IncludeSubfolders', 1, ...
        'Type', 'file', 'FileExtensions', '.mat');
    % Each read returns a struct; keep only the CombinedCell variable.
    trainingDatastore = transform(tempdatastore, @rearrangeData);
    function out = rearrangeData(ds)
        out = ds.CombinedCell;
    end
end
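A minimal usage sketch (the folder name trainData is a hypothetical placeholder) to check what the datastore actually returns:
% Build the transformed datastore and inspect one read.
trainingDatastore = read_datastore('trainData');
sample = preview(trainingDatastore);   % should be a cell array: 7 inputs + 1 label
size(sample)
class(sample{end})                     % the response column should be categorical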
The code for my network structure:
nClasses = 11;
inputSize = [10 9 640 1];
lgraph = layerGraph();
% Multi_input is a cell array holding the seven input layer names (defined elsewhere).
for ind = 1:length(Multi_input)
    % One convolutional branch per input.
    Input = [
        image3dInputLayer(inputSize, "Name", Multi_input{ind})
        convolution3dLayer([1 1 16], 32, "Name", [Multi_input{ind} '_conv3d_1'], "Padding", "same")
        leakyReluLayer(0.01, "Name", [Multi_input{ind} '_leaky1'])
        maxPooling3dLayer([1 1 12], "Name", [Multi_input{ind} '_pool1'], "Padding", "same")
        averagePooling3dLayer([1 1 64], "Name", [Multi_input{ind} '_avrpool1'], "Stride", [1 1 32])
        convolution3dLayer([10 9 1], 16, "Name", [Multi_input{ind} '_spa1'], "Padding", "same")
        leakyReluLayer(0.01, "Name", [Multi_input{ind} '_leaky2'])
        maxPooling3dLayer([5 5 1], "Name", [Multi_input{ind} '_pool2'], "Padding", "same")];
    lgraph = addLayers(lgraph, Input);
    clear Input
end
% Shared trunk: merge the seven branches, then classify into the 11 classes.
bottom = [
    additionLayer(length(Multi_input), "Name", "addition")
    convolution3dLayer([1 1 5], 16, "Name", 'Combined_conv3d_1', "Stride", 1)
    fullyConnectedLayer(inputSize(1)*inputSize(2)*16, "Name", "fc1")
    fullyConnectedLayer(nClasses, "Name", "fc2")
    softmaxLayer("Name", "softmax")
    classificationLayer("Name", "classoutput")];
lgraph = addLayers(lgraph, bottom);
% Connect each branch's last pooling layer to the addition layer.
for ind = 1:length(Multi_input)
    lgraph = connectLayers(lgraph, [Multi_input{ind} '_pool2'], ['addition/in' num2str(ind)]);
end
[trainedNet, traininfo] = trainNetwork(trainingDatastore, lgraph, options);
MATLAB shows the error "The output size (11) of the last layer does not match the number of classes (1)".
How should I fix the problem?
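A short diagnostic sketch (not part of the original question): checking the layer graph and one read from the datastore usually narrows this kind of mismatch down.
% Confirm the layer graph is wired as intended (opens the Network Analyzer).
analyzeNetwork(lgraph);
% Confirm what one training read looks like; the last cell should be the response.
firstRead = read(trainingDatastore);
disp(firstRead)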
  2 Comments
Walter Roberson on 12 Jan 2021
You do not show us how you are moving from the datastore returning a cell array into the form needed by the layers?
Youngmin Na on 12 Jan 2021
I used this code to train the network: [trainedNet, traininfo]= trainNetwork(trainingDatastore,lgraph,options);

Answers (1)

Walter Roberson on 12 Jan 2021
A 1×8 cell array at that point encodes a single sample with a single data class. Your input must have at least one representative of each of the 11 classes.
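One way to check this (a sketch, not part of the original answer) is to read from the transformed datastore and confirm the response column really carries all 11 categories:
% Sketch: inspect what trainNetwork will see from the datastore.
sample = preview(trainingDatastore);   % one read: {input1, ..., input7, label}
class(sample{end})                     % should be 'categorical'
categories(sample{end})                % should list all 11 classes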
  1 Comment
Youngmin Na on 13 Jan 2021
I modified my data: a single read now returns an 11×8 cell array (one row per class). Even though the data now contains samples from all 11 classes, the same error message appears.
What should I do?
11×8 cell array
{10×9×640 double} {10×9×640 double} {10×9×640 double} {10×9×640 double} {10×9×640 double} {10×9×640 double} {10×9×640 double} {[1 ]}
{10×9×640 double} {10×9×640 double} {10×9×640 double} {10×9×640 double} {10×9×640 double} {10×9×640 double} {10×9×640 double} {[2 ]}
{10×9×640 double} {10×9×640 double} {10×9×640 double} {10×9×640 double} {10×9×640 double} {10×9×640 double} {10×9×640 double} {[3 ]}
{10×9×640 double} {10×9×640 double} {10×9×640 double} {10×9×640 double} {10×9×640 double} {10×9×640 double} {10×9×640 double} {[4 ]}
{10×9×640 double} {10×9×640 double} {10×9×640 double} {10×9×640 double} {10×9×640 double} {10×9×640 double} {10×9×640 double} {[5 ]}
{10×9×640 double} {10×9×640 double} {10×9×640 double} {10×9×640 double} {10×9×640 double} {10×9×640 double} {10×9×640 double} {[6 ]}
{10×9×640 double} {10×9×640 double} {10×9×640 double} {10×9×640 double} {10×9×640 double} {10×9×640 double} {10×9×640 double} {[7 ]}
{10×9×640 double} {10×9×640 double} {10×9×640 double} {10×9×640 double} {10×9×640 double} {10×9×640 double} {10×9×640 double} {[8 ]}
{10×9×640 double} {10×9×640 double} {10×9×640 double} {10×9×640 double} {10×9×640 double} {10×9×640 double} {10×9×640 double} {[9 ]}
{10×9×640 double} {10×9×640 double} {10×9×640 double} {10×9×640 double} {10×9×640 double} {10×9×640 double} {10×9×640 double} {[10]}
{10×9×640 double} {10×9×640 double} {10×9×640 double} {10×9×640 double} {10×9×640 double} {10×9×640 double} {10×9×640 double} {[11]}
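A sketch of a possible fix, under the assumption that the remaining problem is the numeric labels in the last column: converting each label to a categorical that declares all 11 classes inside rearrangeData means every read exposes the full class list, even if that read does not contain every class.
function out = rearrangeData(ds)
    out = ds.CombinedCell;
    % Assumption: the last column holds numeric class labels 1..11. Convert them
    % to categoricals with the full category list so trainNetwork counts 11 classes.
    out(:, end) = cellfun(@(y) categorical(y, 1:11), out(:, end), 'UniformOutput', false);
end
With that change, preview(trainingDatastore) should show a categorical in the last column, and the mismatch between the layer output size (11) and the number of classes (1) should disappear.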
