Not using GPU for trainNetwork?
Hello,
I am looking to train a network using the trainNetwork command. I have set up the network, options, and data. I have installed the Parallel Computing Toolbox, and my GPU is an NVIDIA Quadro M1000M with compute capability 5.0 (which should be sufficient per https://www.mathworks.com/help/parallel-computing/gpu-support-by-release.html). The MATLAB Deep Learning Onramp suggested that the GPU would be used automatically as long as I had the toolbox installed and my GPU was compatible. However, trainNetwork() does not use the GPU, and calling gpuDeviceTable returns nothing. Does this suggest my GPU actually is not compatible, or is there some other way I can access it?
Thank you
More Answers (1)
yanqi liu
23 Mar 2022
0 votes
Yes, sir. You can run
>> gpuDevice
to check whether MATLAB detects your GPU. You can also set the execution environment explicitly in the training options with
'ExecutionEnvironment','gpu'
'ExecutionEnvironment','cpu'
to choose which device is used during training.
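A minimal sketch of the suggestion above. Note that XTrain, YTrain, and layers are placeholders for your own data and network, and the solver/epoch settings are only illustrative:

```matlab
% Check that MATLAB can see a supported GPU.
% This errors immediately if no compatible device is detected,
% which distinguishes a driver/compatibility problem from a
% trainNetwork configuration problem.
gpuDevice

% Request the GPU explicitly. With 'ExecutionEnvironment','gpu',
% trainNetwork fails with a clear error if the GPU cannot be used,
% instead of silently falling back to the CPU (as 'auto' would).
options = trainingOptions('sgdm', ...
    'ExecutionEnvironment','gpu', ...
    'MaxEpochs',10, ...
    'Plots','training-progress');

% XTrain, YTrain, and layers are placeholders for your own setup.
net = trainNetwork(XTrain, YTrain, layers, options);
```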