How to use GPU to speed up computation?
% Define the paths to the three video files
videoPaths = {
'F:\LIVE Video Quality Challenge (VQC) Database\LIVE Video Quality Challenge (VQC) Database\Video\A002.mp4',
'F:\LIVE Video Quality Challenge (VQC) Database\LIVE Video Quality Challenge (VQC) Database\Video\A003.mp4',
'F:\LIVE Video Quality Challenge (VQC) Database\LIVE Video Quality Challenge (VQC) Database\Video\A004.mp4'
};
% Define the folder to save the Excel files
outputFolder = 'F:\FEUsingFriquee_csvFiles\';
% Loop over the selected video files
for videoIdx = 1:numel(videoPaths)
    try
        % Open the current video file
        videoObj = VideoReader(videoPaths{videoIdx});
        numFrames = videoObj.NumFrames;
        % Initialize variables to store features
        allFriqueeFeats = [];
        % Create a parallel pool with 6 workers
        if isempty(gcp('nocreate'))
            parpool(6);
        end
        parfor frameIdx = 1:numFrames
            % Read frame
            frame = read(videoObj, frameIdx);
            % Extract features for the current frame
            friqueeFeats = extractFRIQUEEFeatures(frame);
            % Skip frames whose feature vector is not a single row
            if size(friqueeFeats.friqueeALL, 1) ~= 1
                continue;
            end
            % Store features for this frame
            allFriqueeFeats = [allFriqueeFeats; friqueeFeats.friqueeALL];
        end
        % Organize features for output
        videoFeatures.allFriqueeFeats = allFriqueeFeats;
        % Define the filename for the current video
        [~, videoName, ~] = fileparts(videoPaths{videoIdx});
        filename = [videoName, '.xlsx'];
        % Full file path for the current video
        fullFilePath = fullfile(outputFolder, filename);
        % Write the feature matrix to Excel file
        writematrix(allFriqueeFeats, fullFilePath);
        % Delete the parallel pool
        delete(gcp);
    catch ME
        fprintf('Error processing video %d: %s\n', videoIdx, ME.message);
        continue; % Continue with the next video file
    end
end
I have to extract features for every frame of each video. Each video has 300 frames and each frame yields 560 features. Processing a single video takes 8 hours or more, and I have to extract features for 500 videos. I tried creating a parallel pool and using a parfor loop to utilize the GPU and reduce the computation time. However, Task Manager shows the GPU is not being utilized. I have an NVIDIA GeForce RTX 3050. How can I use the GPU to reduce the computation time?
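For context, MATLAB only offloads computation to the GPU when the data is explicitly a gpuArray; parfor by itself only distributes iterations across CPU workers. A minimal sketch of an explicit transfer, assuming (not verified here) that extractFRIQUEEFeatures can operate on gpuArray input:

% Sketch only: requires Parallel Computing Toolbox, and assumes the
% feature function is gpuArray-aware, which is not guaranteed here.
videoObj = VideoReader(videoPaths{1});
frame    = read(videoObj, 1);                % decode one frame on the CPU
gpuFrame = gpuArray(frame);                  % explicitly move the pixel data to the GPU
feats    = extractFRIQUEEFeatures(gpuFrame); % runs on the GPU only if the function supports gpuArray
featRow  = gather(feats.friqueeALL);         % bring the result back to host memory (no-op for CPU data)

If the feature extraction code never touches gpuArray data internally, the GPU will stay idle regardless of how the loop is parallelized.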
6 comments
Jonas
on 11 Mar 2024
Do I see that correctly: processing a single frame needs 180 s, i.e. 3 minutes?
Starting from this profile, you could have a look at the functions with a big dark band. If you need assistance, you could post those functions and explain what they should do, what their inputs/outputs are, etc.
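A minimal way to reproduce such a profile for a single frame (reusing videoPaths and extractFRIQUEEFeatures from the question) might look like this:

% Sketch: profile one representative call to see which sub-functions dominate the 180 s
videoObj = VideoReader(videoPaths{1});
frame    = read(videoObj, 1);
profile on
extractFRIQUEEFeatures(frame);   % one representative call
profile viewer                   % inspect the functions with the largest self-time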
Answers (1)
Catalytic
on 26 Feb 2024
Using the GPU in conjunction with parfor will not be productive unless you have a multi-GPU system and assign each parpool worker a different GPU. Otherwise, the workers will just compete for access to the same GPU bus.
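A sketch of the per-worker assignment described above, which only pays off on a machine with several GPUs (spmdIndex assumes R2022b or later; use labindex on older releases):

% Hypothetical multi-GPU setup: one worker per GPU, each selecting its own device
numGPUs = gpuDeviceCount;
parpool(numGPUs);
spmd
    gpuDevice(spmdIndex);   % worker k computes on GPU k
end
% On a single-GPU machine such as one with an RTX 3050, gpuDeviceCount is 1,
% so multiple parfor workers would simply queue on the same device.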