How to speed up loading of .mat files

I have around 200K .mat files which I need to analyze. It will take me a lot of time if I load each file just to access a particular field of interest. I would highly appreciate your advice.

Answers (3)

David
David on 16 Aug 2023
Edited: David on 16 Aug 2023
If the MAT-files are:
  • large
  • have a lot of variables, nested variables, or structs, most of which you don't need
  • saved as version 7.3 or later
Then it might be worth bypassing 'load' entirely and taking advantage of the fact that a .mat v7.3 file is just an HDF5 file.
Load the variables you want directly from the file, without bothering with the variables you don't need. It'll load insanely fast, regardless of size.
Say you have a .mat file at path/to/my/file.mat with variables 'var1', 'var2', 'var3.a.b.c.d', and you just want var2.
myVarName = 'var2';
myFile = fullfile('path','to','my','file.mat');
function argOut = quickLoad(myFile, myVarName)
% Get the location of the variable in the file using HDF5 syntax:
% / by itself is the root of the file, then variable names come after.
% Note: this also works very nicely for nested structures, where a.b.c.d.e
% would have the location /a/b/c/d/e.
h5loc = ['/' myVarName]; % Always /, not the Windows/Linux filesep
% Open the file using H5F
fid = H5F.open(myFile);
% Open the dataset using H5D
dsetid = H5D.open(fid, h5loc);
% Read the dataset
argOut = H5D.read(dsetid); % All done
% Clean up
H5D.close(dsetid);
H5F.close(fid);
end
This should include some input checking (e.g. testing that the variable exists, without using 'whos', which is very slow), but that can be another post.
Try it out... it'll make you happy.
myData = quickLoad(myFile, myVarName)

2 comments

David
David on 16 Aug 2023
Check whether the file is actually v7.3 (i.e. HDF5-based) using:
tf = H5F.is_hdf5(myFilePath);
Checking the existence of the variable or dataset can be covered in another thread.
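As a hedged sketch (not spelled out in the thread), the existence check mentioned above could use the low-level H5L.exists call before opening the dataset:

```matlab
% Sketch: check that a dataset exists before opening it. Uses the same
% '/var2' location convention as quickLoad above; 'var2' is a placeholder.
fid = H5F.open(myFile);
% H5L.exists checks one link at a time; for a top-level variable like
% 'var2' a single call is enough (nested paths need a check per level).
if H5L.exists(fid, '/var2', 'H5P_DEFAULT')
    dsetid = H5D.open(fid, '/var2');
    data = H5D.read(dsetid);
    H5D.close(dsetid);
else
    warning('Variable var2 not found in %s', myFile);
end
H5F.close(fid);
```

This avoids both 'whos' and a thrown error when a variable is missing from some of the 200K files.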
David
David on 16 Aug 2023
This will also work for nested structures, which can be handy:
myVarName = 'var3/a/b/c/d';
d = quickLoad(myFile, myVarName);

Sign in to comment.

Jan
Jan on 14 Mar 2017

0 votes

Store the files on an SSD.

7 comments

Faisal Ahmed
Faisal Ahmed on 14 Mar 2017
Seriously??? Do you think I have all these files on a floppy disk ;)
Jan
Jan on 14 Mar 2017
Edited: Jan on 14 Mar 2017
Sorry, Faisal. Yes, seriously. Reading MAT-files is implemented efficiently already. You cannot assume that there is a magic 'faster' flag for the load command. So all you can do is either use matfile to extract the wanted field only (oops? Walter had suggested this, but his answer has vanished now? Strange.) or accelerate the physical process of reading. So I really and seriously suggest using an SSD as a physically efficient drive. While I do not assume that you use floppy disks, I have seen users who access files from cheap USB sticks or network drives.
It matters whether the files are written as -v7.3 MAT-files or in earlier versions. But I assume the files exist already, such that changing the format is not an option anymore.
I do not assume that this answer satisfies you, but it is the best solution I know.
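A minimal sketch of the matfile approach mentioned above (the file path and the variable name 'var2' are placeholders):

```matlab
% Sketch: read a single variable from a MAT-file without loading the
% rest. Partial reads work best with -v7.3 files.
m = matfile(fullfile('path','to','my','file.mat'));
v = m.var2;                 % loads only this variable
% For large arrays you can also read a slice without loading it all:
firstRows = m.var2(1:10, :);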
Faisal Ahmed
Faisal Ahmed on 15 Mar 2017
You don't need to be sorry; I said that on a lighter note. Your suggestion has its own value. Thanks!
Jan
Jan on 15 Mar 2017
Then perhaps I should dare to suggest buying a faster computer? :-)
Parallel processing might help: while one thread analyzes the data, another accesses the disk.
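A sketch of that overlap using parfeval (requires the Parallel Computing Toolbox; analyzeOne is a placeholder for your own per-file analysis function):

```matlab
% Sketch: process many MAT-files concurrently so disk I/O and analysis
% overlap. The folder pattern and analyzeOne are assumptions.
files = dir(fullfile('path','to','my','*.mat'));
pool = gcp();                          % start/reuse a parallel pool
futures = cell(numel(files), 1);
for k = 1:numel(files)
    f = fullfile(files(k).folder, files(k).name);
    futures{k} = parfeval(pool, @analyzeOne, 1, f);
end
% Collect results as the workers finish
results = cellfun(@fetchOutputs, futures, 'UniformOutput', false);
```

With 200K files, the per-file Python/MATLAB overhead matters less than keeping the disk busy while CPUs analyze, which is exactly what the worker pool does.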
Andre Kühne
Andre Kühne on 15 Sep 2017
.mat file access has been a major bottleneck for me for quite some time now. Matlab on my system is currently reading a 25 GB .mat file with a speed of around 17 MB/s from an SSD capable of 500 MB/s. For files of that size, you have to use the -v7.3 switch which force-enables a very slow compression/decompression. I have only one (complex single) variable stored in the file, not a complicated structure. Saving the file is just as bad. Matlab 2017a allows deactivation of the compression, but even that helps only marginally and sometimes seems to slow loading even further. I have to agree with Faisal - at least in some scenarios .mat I/O is absolutely sluggish. Any advice on how to speed this up is appreciated.
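The R2017a option Andre mentions can be spelled out; a minimal sketch (the data and file name are placeholders):

```matlab
% Sketch: save without the deflate compression that -v7.3 enables by
% default. -nocompression requires R2017a or later.
x = complex(single(rand(1e4)), single(rand(1e4)));   % example data
save('big.mat', 'x', '-v7.3', '-nocompression');
```

Whether this helps depends on the data: for poorly compressible arrays (e.g. random single-precision values) it removes CPU work; for highly compressible data it trades CPU time for more bytes read from disk.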
Steven Lord
Steven Lord on 15 Sep 2017
You may be able to use the matfile function to access your data on disk.
@Andre: What does "Saving the file is just as bad" mean exactly? Storing a single array might be more efficient with fwrite in a binary format.
I have thought of publishing an alternative save command which uses 7zip (optimized for output size and reading time, but slow writing) or minilzo (fast, but no powerful compression) for the compression. Unfortunately the details are critical: nested struct arrays containing function handles and user-defined objects, brrr. I cannot decide if I should implement a feature for extracting parts of the file (some variables or slices of large arrays). It is easy to optimize such a tool for a specific purpose, but then it can never compete with the established, flexible, and massively tested MAT format. Therefore I still use binary files without compression.
x = rand(1, 2e8); % 1.6GB data
tic;
f = fopen('test.dat', 'w');
fwrite(f, x, 'double');
fclose(f);
toc
Elapsed time is 7.466022 seconds.
About 210 MB/s with an old hard disk.
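For completeness, reading the binary file back mirrors the write; a sketch, keeping in mind that a raw binary file stores no metadata, so the size and type must be known in advance:

```matlab
% Sketch: read the raw doubles written by the fwrite example above.
% The dimensions [1, 2e8] must match what was written.
f = fopen('test.dat', 'r');
x = fread(f, [1, 2e8], 'double');
fclose(f);
```

This is the trade-off Jan describes: maximal speed, but the bookkeeping (sizes, types, variable names) that the MAT format handles for you becomes your responsibility.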

Sign in to comment.
