How to force enable GPU usage in fitrgp

When I am using the Regression Learner app and select the 'Use Parallel' option for training, I can see my Nvidia GPU (compute capability 7.2) being used.
But when I generate a function from it and try to run it from a script, it won't. Can we set something in the script so it uses the GPU?
I tried gpuArrays and tall arrays, and both are not supported by fitrgp.
regressionGP = fitrgp(X, Y, ...
    'BasisFunction', 'constant', ...
    'KernelFunction', 'exponential', ...
    'Standardize', true, ...
    'OptimizeHyperparameters', 'auto', ...
    'HyperparameterOptimizationOptions', struct( ...
        'Verbose', 1, ...
        'UseParallel', true));

3 comments

埃博拉酱 on 7 Apr 2023
Please correct your spelling and capitalization errors first. Too many typos will greatly discourage others from answering your question.
M N on 8 Apr 2023
Edited: M N on 8 Apr 2023
Is there something specific you are not able to understand?
Walter Roberson on 8 Apr 2023
In MATLAB Answers, each user can communicate in whatever language they feel most comfortable communicating in. If a reader has difficulty understanding, then the reader can ask for clarification of particular parts... or the reader can move on to other questions.
There is no requirement that people post in English -- and if they do post in English then it is fine if they used a machine translation that might get words or capitalization or contractions wrong compared to "perfect" English. We are here for MathWorks products, not for complaining about typographic mistakes.


Accepted Answer

Ive J on 7 Apr 2023


fitrgp does not [yet] support GPU arrays. You can easily scroll down the doc page and check "Extended Capabilities" for each function. UseParallel, as the name suggests, will invoke parallel (CPU) computations.
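As a minimal sketch of what UseParallel actually does (assuming Parallel Computing Toolbox is installed), the hyperparameter search is distributed over CPU workers in a parallel pool; no gpuArray inputs are involved:

```matlab
% Sketch: parallel (CPU) hyperparameter optimization with fitrgp.
% Assumes Parallel Computing Toolbox is installed; fitrgp does not
% accept gpuArray inputs, so all work runs on CPU workers.
X = randn(500, 5);
Y = randn(500, 1);

if isempty(gcp('nocreate'))
    parpool;   % start a pool of CPU workers
end

mdl = fitrgp(X, Y, ...
    'KernelFunction', 'exponential', ...
    'Standardize', true, ...
    'OptimizeHyperparameters', 'auto', ...
    'HyperparameterOptimizationOptions', ...
        struct('UseParallel', true, 'Verbose', 0));
```

With the pool running, the Bayesian optimization iterations are evaluated concurrently on the workers, which is why Task Manager shows high CPU (not GPU) utilization.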

4 comments

M N on 8 Apr 2023
Edited: M N on 8 Apr 2023
Thanks Ive, but my confusion is why the app seems to use the GPU even though gpuArray and tall arrays are not supported by fitrgp. Is there something I am missing?
Also, when I generate a function from the Regression Learner app, the generated fitrgp code does not show a "Use Parallel" setting.
When I run from a MATLAB script, even when a parallel pool is invoked, it will only use the CPU and no GPU at all.
So I am thinking I am missing some setting in my script.
No, fitrgp does not even accept gpuArrays to begin with:
X = randn(1e4, 5);
y = randn(1e4, 1);
mdl = fitrgp(gpuArray(X), gpuArray(y));
Error using RegressionGP.prepareData
The value of X must not be a gpuArray.
(or, on a machine without a supported GPU:)
Error using gpuArray
Unable to find a supported GPU device. For more information on GPU support, see GPU Computing Requirements.
Also, you may try, but Regression Learner doesn't even let you select gpuArrays as predictors or response. So, I'm not sure what you mean exactly by "app seems to use GPU". Of course, there are other functions implemented in the Regression Learner app that benefit from GPU computations, such as fitglm or fitlm, but you should use them in your own functions/scripts:
X = randn(1e4, 5);
y = randn(1e4, 1);
mdl = fitlm(gpuArray(X), gpuArray(y));
M N on 9 Apr 2023
Thanks for the explanation. My bad; in pursuit of maximizing usage of my GPU/CPU, I was thinking of things beyond the documentation.
In the screenshot below, when I run the app, I see 7% GPU usage on my computer as soon as I hit the train button in the Regression Learner app.
That led me to think that something in the app, or the MATLAB instance itself, was trying to run on the GPU while running fitrgp. But I think it is the Regression Learner's UI that is using the GPU, not fitrgp; fitrgp seems to consume the CPU itself.
Is there a plan by MathWorks to add gpuArray support for fitrgp?
Ive J on 9 Apr 2023
For that, you need to contact TMW directly :-)


More Answers (0)

Products

Version

R2021b
