Running GLMs on GPU
I was wondering whether it is possible to fit generalized linear models on a GPU to speed up the process, since this is the rate-limiting step in my code: fitting many models ends up taking about a week.
The training data for each model is on the order of [100x10]. Is this possible with a simple gpuArray? If not, how do I run GLMs on a GPU?
Thanks in advance. (P.S. I have no idea how GPU programming is done.)
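For what it's worth, here is a minimal sketch of one possible approach. I am not aware of fitglm or glmfit accepting gpuArray inputs, but the iteratively reweighted least squares (IRLS) algorithm underlying GLM fitting uses only basic matrix operations, which gpuArray does support. The example below fits a logistic regression; the data are random placeholders, and the variable names (X, y, b) are my own, not from the original question.

% Minimal IRLS sketch for a logistic-regression GLM on the GPU.
% The fit is written out in basic matrix operations that gpuArray supports.
X = gpuArray([ones(100,1) randn(100,9)]);  % design matrix (intercept + 9 predictors)
y = gpuArray(double(rand(100,1) > 0.5));   % binary response
b = zeros(10,1,'gpuArray');                % initial coefficients

for iter = 1:25
    eta = X*b;                     % linear predictor
    mu  = 1./(1 + exp(-eta));      % logistic link: mean response
    w   = max(mu.*(1-mu), eps);    % IRLS weights (clamped away from zero)
    z   = eta + (y - mu)./w;       % working response
    bNew = (X'*(X.*w)) \ (X'*(w.*z));  % weighted least-squares update
    if norm(bNew - b) < 1e-8       % stop when coefficients converge
        b = bNew;
        break
    end
    b = bNew;
end
coeffs = gather(b);                % copy the result back to the CPU

One caveat: a single 100-by-10 fit is far too small for a GPU to beat the CPU, since the data-transfer overhead alone will dominate. Any speedup would have to come from batching, for example stacking the design matrices of many models and using pagefun so that thousands of small solves run on the device at once. Running the independent fits in a parfor loop on the CPU may also be worth trying first.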