Memory leak occurring when repeatedly performing matrix left division on GPU
I'm working on a project that, at its core, relies on performing many matrix left division operations in a regression problem. To improve the speed of the program I have been attempting to run this matrix left division on the GPU. However, whenever I perform this operation my local (host) memory begins to fill rapidly until the system runs out of memory and MATLAB crashes. The memory of the GPU itself (an NVIDIA GeForce RTX 3060 Laptop GPU) is unaffected. The code I'm trying to run is effectively as follows (this code causes the memory leak when run on my system):
A = gpuArray(rand(10000,12));   % tall design matrix on the GPU
b = gpuArray(rand(10000,1));    % right-hand side on the GPU
for l = 1:10000000000000
    x = A\b;                    % least-squares solve via mldivide
end
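One workaround I have been considering, though I am not sure it actually addresses the leak, is to synchronize with the GPU on every iteration so that queued operations are flushed before the next solve. Roughly (with a shortened loop for illustration):
d = gpuDevice;                  % handle to the current GPU device
A = gpuArray(rand(10000,12));
b = gpuArray(rand(10000,1));
for l = 1:1000000
    x = A\b;                    % least-squares solve via mldivide
    wait(d);                    % block until all queued GPU work has finished
end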
It should also be noted that the GPU version of my code runs significantly slower than the CPU version.
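In case it is relevant, the comparison I have in mind is along these lines (my understanding is that timeit and gputimeit are the recommended functions for this kind of benchmark, since gputimeit waits for the device to finish before reporting a time):
Acpu = rand(10000,12);
bcpu = rand(10000,1);
Agpu = gpuArray(Acpu);
bgpu = gpuArray(bcpu);
tCPU = timeit(@() Acpu\bcpu)      % CPU solve
tGPU = gputimeit(@() Agpu\bgpu)   % GPU solve, including synchronization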
Any help with this issue would be appreciated, thank you.