accelerate imread on large images

Hi,
Is there a way to accelerate imread on large (> 2 GB) images?
Thanks a lot,
mat

3 comments

KALYAN ACHARJYA
KALYAN ACHARJYA on 28 Apr 2018
Do you want to read multiple images?
Ameer Hamza
Ameer Hamza on 28 Apr 2018
Is 2 GB the size of a single image, or do several images together make up 2 GB?
mat
mat on 29 Apr 2018
Hello Ameer,
Just one image.
Thanks again,
mat


Answers (2)

Walter Roberson
Walter Roberson on 29 Apr 2018

0 votes

Not in general.
If the images are TIFF images, you can get more flexibility over how you read them by using the Tiff() class. This will not necessarily reduce the total time to read an image, but if it turns out you only need part of the image, it lets you read the file in pieces, provided it was stored the right way.
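A minimal sketch of strip-by-strip reading with the Tiff class (the file name is hypothetical, and this assumes a striped rather than tiled TIFF):

```matlab
% Read one strip at a time instead of the whole image at once
t = Tiff('huge_image.tif', 'r');
nStrips = t.numberOfStrips();
for s = 1:nStrips
    strip = t.readEncodedStrip(s);   % decodes only this strip
    % ... process strip here without holding the full image in memory ...
end
t.close();
```

This only pays off when you can process the image incrementally; reading every strip this way still decodes the entire file.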
If you are using blockproc(), there is an option to process a file instead of a matrix. blockproc() will take advantage of any facilities the image format offers to read only a section at a time. In practice this probably means special handling for TIFF and not much else, though in theory it might also be able to handle some JPEG images in sections.
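A sketch of the file-based form of blockproc() (Image Processing Toolbox; the file name and the per-block function are placeholders):

```matlab
% Pass a filename instead of a matrix so blockproc reads the
% TIFF one block at a time rather than loading it all up front
fun = @(blockStruct) mean2(blockStruct.data);   % example per-block operation
result = blockproc('huge_image.tif', [1024 1024], fun);
```

The block size trades off I/O efficiency against memory use; larger blocks mean fewer reads but a bigger working set.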

4 comments

mat
mat on 29 Apr 2018
Thanks Walter. I am reading TIFF files and am indeed aware of the Tiff() class option for strip reading. Unfortunately, I need the full content of the files, and strip-by-strip reading would only slow the process down. Thanks again. mat
Walter Roberson
Walter Roberson on 29 Apr 2018
Probably the best speedup would be to use an SSD.
If you have the Parallel Computing Toolbox, you could experiment with having different workers read different sections of the file and then merging them together. That would only really be useful when decompression is significantly slower than reading from disk, which I doubt to be the case in practice. In the more typical situation where controller bandwidth is the limiting factor, adding more readers just slows everything down through contention for the scarce resource (disk bandwidth).
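If you did want to experiment with that, a sketch might look like the following (Parallel Computing Toolbox; the file name is hypothetical, and this assumes a striped TIFF with chunky planar configuration so that the strips can simply be stacked):

```matlab
% Workers decode disjoint strips in parallel, then the pieces are
% concatenated. Only worthwhile if decompression, not disk
% bandwidth, is the bottleneck.
fname = 'huge_image.tif';
t = Tiff(fname, 'r');
nStrips = t.numberOfStrips();
t.close();

parts = cell(nStrips, 1);
parfor s = 1:nStrips
    w = Tiff(fname, 'r');            % each worker opens its own handle
    parts{s} = w.readEncodedStrip(s);
    w.close();
end
img = vertcat(parts{:});             % strips stack top to bottom
```

Per-strip handle opening adds overhead of its own, so for many small strips it may be better to assign each worker a contiguous range of strips instead.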
mat
mat on 30 Apr 2018
Hi Walter, unfortunately I don't have that toolbox, but I really appreciate all your comments! Thanks!
Walter Roberson
Walter Roberson on 30 Apr 2018
You can estimate how long the decoding takes by using timeit() on imread and subtracting the timeit() of fileread() on the same TIFF. fileread is not necessarily the fastest possible way to read, but it has relatively low overhead.
Just be sure to repeat the tests to account for the fact that the OS might be reading the file into cache, so the first access might be slow.
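A sketch of that measurement (the file name is hypothetical; timeit already runs the function several times, and the warm-up read addresses the OS cache effect mentioned above):

```matlab
% Separate raw-read time from decode time for a large TIFF
fname = 'huge_image.tif';
fileread(fname);                           % warm the OS file cache first

tRead   = timeit(@() fileread(fname));     % raw bytes only, no decoding
tTotal  = timeit(@() imread(fname));       % read + decode + format
tDecode = tTotal - tRead;                  % rough estimate of decode cost

fprintf('raw read: %.2f s, decode (approx): %.2f s\n', tRead, tDecode);
```

If tDecode dominates, parallel decoding could help; if tRead dominates, faster storage (e.g. an SSD) is the only real lever.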


mat
mat on 30 Apr 2018

0 votes

Timed both on a 4.8 GB RGB TIFF file: imread - 75 s, fileread - 0.3 s!! (fileread returns a much smaller vector, of course; there is a lot of decoding and formatting going on.)
If only I could efficiently reconstruct the image from this fileread character vector...

2 comments

Walter Roberson
Walter Roberson on 30 Apr 2018
Perhaps you could try something like https://blog.idrsolutions.com/2015/08/how-to-read-tiff-images-in-java/ to see how the performance goes?
mat
mat on 30 Apr 2018
Good point, I shall try this. Thanks.


Asked:

mat
on 26 Apr 2018

Commented:

mat
on 30 Apr 2018
