How to force enable GPU usage in fitrgp
8 views (last 30 days)
When I use the Regression Learner app and select the 'Use Parallel' option for training, I can see my NVIDIA GPU (compute capability 7.2) being used.
But when I generate a function from the app and run it from a script, the GPU is not used. Is there something I can set in the script to make it use the GPU?
I tried gpuArray and tall arrays, and neither is supported by fitrgp.
regressionGP = fitrgp(X, Y, ...
    'BasisFunction', 'constant', ...
    'KernelFunction', 'exponential', ...
    'Standardize', true, ...
    'OptimizeHyperparameters', 'auto', ...
    'HyperparameterOptimizationOptions', struct( ...
        'Verbose', 1, ...
        'UseParallel', true));
3 comments
Walter Roberson
on 8 Apr 2023
In MATLAB Answers, each user can communicate in whatever language they feel most comfortable communicating in. If a reader has difficulty understanding, then the reader can ask for clarification of particular parts... or the reader can move on to other questions.
There is no requirement that people post in English -- and if they do post in English, it is fine if they used a machine translation that might get words, capitalization, or contractions wrong compared to "perfect" English. We are here for MathWorks products, not for complaining about typographic mistakes.
Accepted Answer
Ive J
on 7 Apr 2023
fitrgp does not [yet] support GPU arrays. You can easily scroll down the doc page of any function and check the "Extended Capabilities" section to see what it supports. 'UseParallel', as the name suggests, invokes parallel computation across CPU workers, not GPU computation.
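Since fitrgp runs on the CPU, the available speedup comes from parallelizing the hyperparameter search over a pool of workers. A minimal sketch along those lines, assuming the Parallel Computing Toolbox is installed and that X and Y may have started life as gpuArray data (in which case they must be gathered back to the CPU first):

    % fitrgp does not accept gpuArray inputs; gather them to ordinary arrays.
    if isa(X, 'gpuArray'), X = gather(X); end
    if isa(Y, 'gpuArray'), Y = gather(Y); end

    % Start a pool of CPU workers if one is not already running
    % (requires the Parallel Computing Toolbox).
    if isempty(gcp('nocreate'))
        parpool;  % uses the default cluster profile
    end

    % 'UseParallel' distributes the hyperparameter optimization across the
    % pool's CPU workers; it does not move any computation onto the GPU.
    regressionGP = fitrgp(X, Y, ...
        'BasisFunction', 'constant', ...
        'KernelFunction', 'exponential', ...
        'Standardize', true, ...
        'OptimizeHyperparameters', 'auto', ...
        'HyperparameterOptimizationOptions', struct( ...
            'Verbose', 1, 'UseParallel', true));

This mirrors what the Regression Learner app does under 'Use Parallel': the training itself stays on the CPU, but the many candidate hyperparameter evaluations run concurrently.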
4 comments
More Answers (0)