Reproducibility of convolutional neural network training with GPU
Diego Alvarez Estevez on 20 Sep 2018
Commented: Mesho on 24 May 2024
Hello,
I am training a CNN on my local GPU (to speed up training) for a classification problem, and I would like to try different parameterizations. To avoid variability due to different data and/or weight initialization, I reset the random seeds each time before training:
% Initialize the random seeds so that the same data and architecture
% produce a predictable (repeatable) result
rng(0);
% parallel.gpu.rng(0, 'CombRecursive');
randStream = parallel.gpu.RandStream('CombRecursive', 'Seed', 0);
parallel.gpu.RandStream.setGlobalStream(randStream);
% Train the CNN network
net = trainNetwork(TR.data, TR.reference, layers, options);
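(An equivalent, shorter way to seed both generators, assuming gpurng is available in this release as a shorthand for creating and installing the global GPU stream, would be:)
% Sketch: seed the CPU and GPU random number generators
rng(0);                       % CPU random number generator
gpurng(0, 'CombRecursive');   % GPU random number generator (Parallel Computing Toolbox)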
The problem is that when using the GPU I get different results on each execution, even when initializing the GPU random seed to the same value. The strange thing is that if I use the CPU instead, I do get reproducible results. Am I doing something wrong with the GPU random seed initialization? Is there a known problem in this situation, or something I am missing?
Thanks in advance.
PS: I am using MATLAB R2017b
Accepted Answer
Joss Knight on 20 Sep 2018
Use of the GPU has non-deterministic behaviour. You cannot guarantee identical results when training your network, because the results depend on the whims of floating-point precision and on parallel computations of the form (a + b) + c ~= a + (b + c).
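As a quick illustration of the floating-point effect (a minimal sketch on the CPU; nothing here is GPU-specific):
% Floating-point addition is not associative, so summation order matters
a = 0.1; b = 0.2; c = 0.3;
(a + b) + c == a + (b + c)       % ans = 0 (false): the two orderings give different doubles
((a + b) + c) - (a + (b + c))    % ans = 1.1102e-16, tiny but nonzero
On the GPU, the order in which thousands of parallel partial sums are combined can change from run to run, so these tiny differences accumulate and the trained weights diverge.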
Most of our GPU algorithms are in fact deterministic, but a few are not, for instance backward convolution.
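If bit-for-bit repeatability matters more than speed, one workaround (a sketch; 'ExecutionEnvironment' is a trainingOptions name-value pair, and the solver and epoch count below are placeholders) is to force training onto the CPU, which the question already found to be reproducible:
% Sketch: train on the CPU for reproducible results (at the cost of speed)
rng(0);                                    % seed the CPU generator
options = trainingOptions('sgdm', ...
    'ExecutionEnvironment', 'cpu', ...     % avoid non-deterministic GPU kernels
    'MaxEpochs', 10);                      % placeholder training settings
net = trainNetwork(TR.data, TR.reference, layers, options);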