GAN Neural Network (Project and Reshape Layer) error.

I copied the code from https://uk.mathworks.com/help/deeplearning/ug/train-generative-adversarial-network.html#TrainGenerativeAdversarialNetworkGANExample-7 and tried to run it on MATLAB R2019a. It gives an error on projectAndReshapeLayer: undefined function or variable. Image is attached.
  2 Comments
Manal Allehyani on 21 Sep 2020
I have the same error, but my version is R2020. Did you solve it?
Alexander Hagg on 7 Nov 2020
projectAndReshapeLayer is a custom layer provided with one of the MATLAB examples. Open it using the following command and save it to your local project (or add it to your path):
edit(fullfile(matlabroot,'examples','nnet','main','projectAndReshapeLayer.m'))
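Alternatively (assuming the file exists at that same location in your installation), you could copy it into your working folder and add that folder to the path, for example:

% Copy the example layer file into the current folder and add it to the path.
copyfile(fullfile(matlabroot,'examples','nnet','main','projectAndReshapeLayer.m'), pwd)
addpath(pwd)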


Answers (1)

Yonglei GUI on 21 Feb 2021
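The class definition below is the contents of that file; save it as projectAndReshapeLayer.m somewhere on your MATLAB path.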
classdef projectAndReshapeLayer < nnet.layer.Layer

    properties
        % (Optional) Layer properties.
        OutputSize
    end

    properties (Learnable)
        % Layer learnable parameters.
        Weights
        Bias
    end

    methods
        function layer = projectAndReshapeLayer(outputSize, numChannels, name)
            % Create a projectAndReshapeLayer.

            % Set layer name.
            layer.Name = name;

            % Set layer description.
            layer.Description = "Project and reshape layer with output size " + join(string(outputSize));

            % Set layer type.
            layer.Type = "Project and Reshape";

            % Set output size.
            layer.OutputSize = outputSize;

            % Initialize fully connect weights and bias.
            fcSize = prod(outputSize);
            layer.Weights = initializeGlorot(fcSize, numChannels);
            layer.Bias = zeros(fcSize, 1, 'single');
        end

        function Z = predict(layer, X)
            % Forward input data through the layer at prediction time and
            % output the result.
            %
            % Inputs:
            %     layer - Layer to forward propagate through
            %     X     - Input data, specified as a 1-by-1-by-C-by-N
            %             dlarray, where N is the mini-batch size.
            % Outputs:
            %     Z     - Output of layer forward function returned as
            %             an sz(1)-by-sz(2)-by-sz(3)-by-N dlarray,
            %             where sz is the layer output size and N is
            %             the mini-batch size.

            % Fully connect.
            weights = layer.Weights;
            bias = layer.Bias;
            X = fullyconnect(X,weights,bias,'DataFormat','SSCB');

            % Reshape.
            outputSize = layer.OutputSize;
            Z = reshape(X, outputSize(1), outputSize(2), outputSize(3), []);
        end
    end
end

function weights = initializeGlorot(numOut, numIn)
% Initialize weights using uniform Glorot.
varWeights = sqrt( 6 / (numIn + numOut) );
weights = varWeights * (2 * rand([numOut, numIn], 'single') - 1);
end
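
As a quick sanity check that MATLAB can find the class, here is a minimal generator sketch in the spirit of the linked example (the sizes and layer names are illustrative assumptions, not taken from this thread):

% Illustrative sizes; adjust to match the example you are following.
numLatentInputs = 100;
projectionSize = [4 4 512];
filterSize = 5;
numFilters = 64;

% Constructing the layer array calls the projectAndReshapeLayer constructor,
% so this only runs if the class file is found on the path.
layersGenerator = [
    imageInputLayer([1 1 numLatentInputs],'Normalization','none','Name','in')
    projectAndReshapeLayer(projectionSize,numLatentInputs,'proj')
    transposedConv2dLayer(filterSize,4*numFilters,'Name','tconv1')
    reluLayer('Name','relu1')]

If this runs without the "Undefined function or variable" error, the class file is being picked up correctly.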
