Using multiple GPUs with minibatchqueue
I want to use multiple GPUs in my program, but when I set the output environment of minibatchqueue to "gpu", it seems to use only one GPU.
There is no "multi-gpu" option for OutputEnvironment. How can I use all of the GPUs with minibatchqueue?
Joss Knight on 24 Oct 2021
https://uk.mathworks.com/help/deeplearning/ug/train-network-in-parallel-with-custom-training-loop.html shows how to use minibatchqueue together with parallel language constructs to partition your data for data-parallel training and inference.
minibatchqueue is a data iterator, not a tool that manages the whole training loop the way trainNetwork does.
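The pattern from the linked example can be sketched roughly as follows: open a pool with one worker per GPU, give each worker its own partition of the datastore and its own minibatchqueue, then aggregate gradients across workers. This is a hedged sketch, not a drop-in solution; it assumes an existing datastore imds and omits the model/gradient code, which the linked example covers in full.

```matlab
% Sketch: one parallel worker per GPU, each iterating its own minibatchqueue.
% Assumes a datastore "imds" already exists; the training step is elided.
numGPUs = gpuDeviceCount("available");
pool = parpool(numGPUs);            % one worker per available GPU

spmd
    % Each worker takes its own shard of the data ...
    workerDS = partition(imds, numlabs, labindex);

    % ... and builds its own queue. OutputEnvironment="gpu" places each
    % worker's mini-batches on that worker's GPU.
    mbq = minibatchqueue(workerDS, ...
        "MiniBatchSize", 64, ...
        "OutputEnvironment", "gpu");

    while hasdata(mbq)
        X = next(mbq);
        % Compute loss and gradients on this worker's GPU here, then
        % aggregate gradients across workers (e.g. with gplus) before
        % the update step, as shown in the linked example.
    end
end
```

The key point is that multi-GPU use comes from the spmd block (or a parfor loop), not from minibatchqueue itself: each worker owns one GPU and one queue over its data partition.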