Can I use parallel computing to train a DNN?

Contained below is my code for a neural network I have designed. The network trains fine on a single CPU, but it is very slow. Is it possible to use a parallel setup? I have made what I think are the required modifications, but I get an error (also contained below). What is meant by a recurrent network, and is there any way to overcome this error?
Error using trainNetwork (line 150)
Parallel training of recurrent networks is not supported. 'ExecutionEnvironment' value in trainingOptions function must be 'auto', 'gpu', or 'cpu'.
Error in Network2 (line 49)
network2 = trainNetwork(TrainingInputData,TrainingOutputData,layers,options)
Caused by:
Error using nnet.internal.cnn.assembler.setupExecutionEnvironment (line 17)
Parallel training of recurrent networks is not supported. 'ExecutionEnvironment' value in trainingOptions function must be 'auto', 'gpu', or 'cpu'.
clc
clear
parpool
[outputdata, inputdata,specifications] = dataprep();
TrainingOutputData = outputdata;
TrainingInputData = inputdata;
TestingOutputData = cell(1,1);
TestingInputData = cell(1,1);
TestingOutputData{1,1} = outputdata{4,1};
TestingInputData{1,1} = inputdata{4,1};
TrainingOutputData(4,:)=[];
TrainingInputData(4,:)=[];
maxEpochs = 10;
miniBatchSize = 1;
Neurons = 150000;
numResponses = size(TrainingOutputData{1},1);
featureDimension = size(TrainingInputData{1},1);
layers = [ ...
sequenceInputLayer(featureDimension)
fullyConnectedLayer(Neurons)
fullyConnectedLayer(numResponses)
regressionLayer];
options = trainingOptions('sgdm', ...
'MaxEpochs',maxEpochs, ...
'MiniBatchSize',miniBatchSize, ...
'InitialLearnRate',0.01, ...
'LearnRateSchedule','piecewise',...
'LearnRateDropPeriod',3,...
'LearnRateDropFactor',0.1,...
'GradientThreshold',1, ...
'Shuffle','never', ...
'Plots','training-progress',...
'ValidationData',{TestingInputData,TestingOutputData},...
'ValidationFrequency', 1,...
'Verbose', 1,...
'VerboseFrequency', 1,...
'ExecutionEnvironment', 'parallel');
network2 = trainNetwork(TrainingInputData,TrainingOutputData,layers,options)

Accepted Answer

Joss Knight on 15 Feb 2019
No, you can't use parallel training for a sequence network, sorry.
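A possible workaround (not part of the accepted answer): a network starting with sequenceInputLayer is treated as a sequence (recurrent-style) network, so 'parallel' is rejected, but the error message itself lists the values that are accepted. Training can still be accelerated on a single GPU by changing only the 'ExecutionEnvironment' option, assuming a supported GPU and Parallel Computing Toolbox are available. A minimal sketch of the modified options from the code above:

```matlab
% Same training setup as before, but with a single-GPU execution
% environment instead of 'parallel'. 'auto' uses a GPU if one is
% available and otherwise falls back to the CPU.
options = trainingOptions('sgdm', ...
    'MaxEpochs',maxEpochs, ...
    'MiniBatchSize',miniBatchSize, ...
    'InitialLearnRate',0.01, ...
    'LearnRateSchedule','piecewise', ...
    'LearnRateDropPeriod',3, ...
    'LearnRateDropFactor',0.1, ...
    'GradientThreshold',1, ...
    'Shuffle','never', ...
    'Plots','training-progress', ...
    'ValidationData',{TestingInputData,TestingOutputData}, ...
    'ValidationFrequency',1, ...
    'Verbose',1, ...
    'VerboseFrequency',1, ...
    'ExecutionEnvironment','gpu');   % 'auto' or 'cpu' also accepted
```

With this change the explicit `parpool` call in the script is no longer needed, since no parallel pool is used for single-GPU training.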
