Reducing running time in image processing
I wrote code that finds circles in an image using imfindcircles and performs some further calculations on the detected circles. I plan to apply it to 250000 images. The code currently takes 0.8 seconds per image, and the processing of each image is completely independent of the others. I am aware of the parfor command, but I would rather not use it because my code is already complex enough and I do not want to make it more so. Is there any way to run the script in parallel to reduce the total time (rather than the 0.8 seconds per image)? It should be noted that some parts of the code take advantage of the GPU as well.
Accepted Answer
Walter Roberson
on 31 Aug 2013
parfor() and related commands such as spmd() are the main approach. Otherwise, especially if you are on Linux or OS X, run a script that spawns a number of separate MATLAB processes, each with slightly different parameters. Though if you are already keeping a GPU busy, it is not certain that running multiple such processes would be any faster.
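Since each image is processed independently, the parfor route needs very little restructuring: move the per-image work into the loop body (or a function) and let the workers split the iterations. A minimal sketch, assuming the images live in an `images` folder and the radius range `[10 50]` is a placeholder for your actual search range:

```matlab
% Sketch only: folder path, file pattern, and radius range are assumptions.
imageFiles = dir(fullfile('images', '*.png'));
results = cell(numel(imageFiles), 1);

parfor k = 1:numel(imageFiles)
    img = imread(fullfile('images', imageFiles(k).name));
    centers = imfindcircles(img, [10 50]);  % placeholder radius range
    results{k} = centers;                   % any further per-circle math goes here
end
```

Because every iteration only touches `imageFiles(k)` and `results{k}`, the loop is trivially sliceable and parfor should accept it without further changes; the rest of your script stays untouched.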
The usual method is to (A) optimize the algorithm; and (B) vectorize the code.
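As a small illustration of (B), per-circle quantities can often be computed for all detected circles at once instead of looping. For example, distances from each detected center to a (hypothetical) image center:

```matlab
% Example Nx2 matrix of circle centers, as returned by imfindcircles.
centers = [10 20; 30 40; 50 60];
imgCenter = [320 240];  % hypothetical image center

% One vectorized expression replaces a loop over rows.
d = sqrt(sum(bsxfun(@minus, centers, imgCenter).^2, 2));
```

Vectorizing the per-image calculations shrinks the 0.8 s cost of each image directly, which multiplies across all 250000 images regardless of how the work is parallelized.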