Hi,
why does the following fail? I cannot find any caveat about GPU support of min/max in the documentation. Plus, the error message is really not helpful. Maybe it has been fixed in R2020?
Is there a way to obtain the linear index of the min/max without fetching the array from the GPU?
>> A = rand(3, 3, 3, 'gpuArray');
>> min(A, [], 'all')
ans =
0.0342
>> min(A, [], 'linear')
Error using gpuArray/min
Option must be 'all', 'linear', 'omitnan', or 'includenan'.
>> [minimum, index] = min(A, [], 'linear');
Error using gpuArray/min
Option must be 'all', 'linear', 'omitnan', or 'includenan'.
Thanks!

 Accepted Answer

Matt J
Matt J on 10 Oct 2020
Edited: Matt J on 10 Oct 2020

1 vote

I cannot find any caveat about GPU support of min/max in the documentation.
Check the "Extended Capabilities" section of doc min.
Maybe it has been solved in R2020?
Yes, it has.
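For reference, in recent releases the 'linear' flag is accepted for gpuArray inputs when combined with 'all' (this is my reading of doc min; I have not verified it on every release):

```matlab
A = rand(3, 3, 3, 'gpuArray');
% 'linear' must accompany 'all' (or a dimension argument); alone it errors
[minimum, index] = min(A, [], 'all', 'linear');
% index is a linear index into A, still resident on the GPU
```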
Is there a way to obtain the linear index of the min/max without fetching the array from the GPU?
[minimum, index] = min(A(:));
or to operate along a specific dimension, use the attached function minlidx,
[minimum, index] = minlidx(A,dim);
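The attached file is not reproduced in this thread, but a minimal sketch of such a helper (my own reconstruction, not Matt J's attachment) could look like this: it converts the per-slice indices returned by min along dim into linear indices into A, using the linear index of each slice's first element.

```matlab
function [m, lidx] = minlidx(A, dim)
% MINLIDX  Minimum along dimension DIM, returning linear indices into A.
% Sketch only -- not the file attached to the original answer.
    [m, idx] = min(A, [], dim);       % idx is the position along dim, per slice
    sz = size(A);
    stride = prod(sz(1:dim-1));       % linear step between neighbors along dim
    % Linear index of the first element of each 1-D slice along dim:
    B = reshape(1:numel(A), sz);
    base = min(B, [], dim);           % smallest linear index in each slice
    lidx = base + (idx - 1) * stride; % gpuArray result if A is a gpuArray
end
```

Usage: `[m, lidx] = minlidx(A, 2);` then `A(lidx)` reproduces `m`, and all arithmetic stays on the GPU.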

More Answers (0)


Version: R2019a
