Can I use a GeForce GTX 960M for deep learning?
Negar Noorizadeh
on 14 May 2019
Answered: Shivam Sardana
on 22 May 2019
Dear All,
When I check my NVIDIA control panel, it shows that my laptop has a GeForce GTX 960M GPU. I am not sure whether I can use it for deep learning.
If yes, do I only need to set the 'ExecutionEnvironment' training option to 'gpu', or is there any other setting I should change?
Regards
2 Comments
Walter Roberson
on 14 May 2019
Yes, its CUDA compute capability of 5.0 is supported. Memory will be a bit tight for it. Install the latest NVIDIA drivers.
Accepted Answer
Shivam Sardana
on 22 May 2019
This assumes CUDA and cuDNN are installed. To access and get information about the GPU, run the following command:
gpuDevice
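A minimal sketch of that check, assuming Parallel Computing Toolbox is installed (the displayed values are illustrative, not guaranteed output):

```matlab
% Query the currently selected GPU and inspect its properties
g = gpuDevice;               % returns a GPUDevice object for the current GPU
disp(g.Name)                 % e.g. 'GeForce GTX 960M'
disp(g.ComputeCapability)    % must be '3.0' or higher for deep learning
disp(g.AvailableMemory)      % free memory in bytes; watch this on a 960M
```

If this errors, the GPU, its driver, or Parallel Computing Toolbox is not set up correctly.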
GPU, multi-GPU, and parallel options require Parallel Computing Toolbox. To use a GPU for deep learning, you must also have a CUDA® enabled NVIDIA® GPU with compute capability 3.0 or higher.
To use the GPU for deep learning, set 'ExecutionEnvironment' in trainingOptions to 'gpu'. By default ('auto'), trainingOptions uses a GPU if one is available.
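A short sketch of such a training setup, assuming Deep Learning Toolbox; the solver, batch size, and the commented-out variable names (XTrain, YTrain, layers) are placeholders, not from the thread:

```matlab
% Force GPU execution during training
options = trainingOptions('sgdm', ...
    'ExecutionEnvironment', 'gpu', ...   % use the GPU explicitly
    'MiniBatchSize', 64, ...             % reduce this if the 960M runs out of memory
    'Plots', 'training-progress');

% net = trainNetwork(XTrain, YTrain, layers, options);  % placeholder data/layers
```

Lowering 'MiniBatchSize' is the usual first remedy when a small GPU like the 960M hits out-of-memory errors.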
Hope this helps.
More Answers (1)
Prathamesh Degwekar
on 21 May 2019
Hi,
In the current scenario, DCH drivers are the way to go, as described in the article from Intel here:
Regarding the other question, you can go with either of the two. Their differences can be found here.