Matlab 2018b GPU Training

Eduardo Gaona Peña on 6 Nov 2018
Until now I had been training an LSTM network using the R2018a version of MATLAB and didn't have a problem using my GPU as the training device. However, since I needed to change the activation functions of my LSTM layers, I updated MATLAB, and now when I try to use my GPU it trains the network much slower than the CPU does, which doesn't make sense. For some reason MATLAB is not using my GPU's memory. Any ideas how to solve this?

Accepted Answer

Joss Knight on 7 Nov 2018
Edited: Joss Knight on 7 Nov 2018
Do you mean you switched to using hard-sigmoid or softsign activations? This is supported in 18b, but is a non-optimized version since it isn't supported by cuDNN, and is indeed much slower. I would recommend using the default activations for performance, if you can make it work.
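For reference, the activation functions in question are set through the `lstmLayer` name-value parameters introduced in R2018b. A minimal sketch (the layer sizes are made-up example values):

```matlab
% Default activations: tanh / sigmoid. This combination maps onto cuDNN
% and takes the fast GPU path.
fastLayers = [ ...
    sequenceInputLayer(12)                              % 12 input features (example value)
    lstmLayer(100, ...                                  % 100 hidden units (example value)
        'StateActivationFunction', 'tanh', ...          % default
        'GateActivationFunction',  'sigmoid')           % default
    fullyConnectedLayer(1)
    regressionLayer];

% Non-default activations: not supported by cuDNN, so training falls back
% to a non-optimized implementation and can be slower on the GPU than on
% the CPU.
slowLayers = [ ...
    sequenceInputLayer(12)
    lstmLayer(100, ...
        'StateActivationFunction', 'softsign', ...
        'GateActivationFunction',  'hard-sigmoid')
    fullyConnectedLayer(1)
    regressionLayer];
```

So the slowdown is a property of the chosen activations, not of the GPU setup itself.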
  1 comment
Eduardo Gaona Peña on 8 Nov 2018
Edited: Eduardo Gaona Peña on 8 Nov 2018
Ahhh, OK. I was wondering why my CPU was doing it faster than my GPU. Sadly, I have to use these activation functions because I want to implement the network in a drive, and given its limitations a hard-sigmoid is easier to calculate than an exponential. Luckily I only need about 1000 epochs. Thanks for the information anyway :D !
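For context on why the hard-sigmoid is cheaper on limited hardware: it replaces the exponential with a clamped linear function. A small sketch (the slope 0.2 and offset 0.5 are the commonly used definition, e.g. in cuDNN/Keras; check the definition your target hardware assumes):

```matlab
% Logistic sigmoid: needs an exponential per element.
sigmoid     = @(x) 1 ./ (1 + exp(-x));

% Hard sigmoid: one multiply, one add, and a clamp -- cheap in fixed-point
% hardware such as a motor drive. Slope/offset values are an assumption.
hardSigmoid = @(x) max(0, min(1, 0.2 .* x + 0.5));

x = linspace(-4, 4, 9);
disp([sigmoid(x); hardSigmoid(x)])   % the two curves are close near zero
```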


More Answers (0)
