textscan: Instantaneous out of memory error when accessing very large file (only with newest MATLAB versions)

I am working with a very large dataset (total 500GB) that is split up into more than a thousand individual .txt files (160 columns/characteristics per file, more than a million rows possible, contains a mixture of string and numeric variables), each covering a specific geo area. For files covering large areas, a single .txt file can be as large as 16GB. To cope with the large amount of data, I proceed as follows for each of the files:
  • access the respective .txt file (with "fopen")
  • within a while-loop import 250,000 rows using "textscan"
  • process data and export smaller dataset (append if not first loop iteration)
  • repeat steps above until end of the .txt file is reached (while ~feof)
The code that imports the data looks like this:
fileID = fopen(filename) ;   % open the current .txt file; "filename" holds its path
while ~feof(fileID)          % import blocks of 250,000 rows until the end of the file is reached
    % Import one block; "format_varlist" identifies the format and the columns to
    % import. The delimiter is "|".
    Data = textscan(fileID, strcat(char(format_varlist),'\r\n'), 250000, 'Delimiter','|', ...
        'HeaderLines', double(first_iteration==1), 'EndOfLine','\r\n', 'EmptyValue',-1) ;
    % ... process "Data" and append the reduced result (omitted here) ...
end
fclose(fileID) ;             % close the file once it has been read completely
Doing so allows me to effectively reduce the size of my dataset such that I can conveniently work with the full dataset later.
My problem is the following: with version 2019a, the code works perfectly well for ALL files, including the very large .txt ones. With version 2021a (and, if I recall correctly, 2020a did not work either), the code works perfectly well until it reaches a file that is too large. At that point, the code stops (instantaneously!) with an "out of memory" error:
Out of memory.
Related documentation
I suspect that the newer "textscan" function recognizes that the file to be accessed would be too large to load in fully (which it is), but does not take into account that I only want 250k lines at a time.
I looked at the "readtable" command, but as far as I know, it does not allow importing smaller chunks of data one at a time (only for spreadsheet files).
Is there a workaround/fix for my issue? As I work (and have worked) frequently with this type of code, I would otherwise be stuck with the 2019 version forever. Thank you very much in advance for your help.
  3 comments
Simon Stehle on 7 May 2021
Thanks a lot for the hints. I was hoping to avoid having to completely restructure my code, but I will definitely look into the datastore methods.
As for reducing the chunk size, I have tried several very small numbers. The error message still pops up immediately, which is why I am almost sure that the function "does not even try".
dpb on 7 May 2021
And, textscan is fully built in, so not even the preliminaries can be inspected to see what it might do in that regard.
If it indeed won't work at all, I'd say that qualifies as a bug and is in total violation of documented behavior.


Accepted Answer

Walter Roberson on 12 May 2021
Specify the encoding on your fopen() so that the I/O library does not have to read through the entire file to determine the encoding. The default now is to detect the encoding automatically, but that can require reading the entire file to disprove the hypothesis that the file might contain UTF-8.
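For example, a minimal sketch of the original loop with an explicit encoding passed to fopen() (the encoding name 'windows-1252' is only a placeholder; substitute whatever encoding your .txt files actually use):
% Open with an explicit encoding so the I/O library does not scan the whole
% file to detect it. 'windows-1252' is just an example value.
fileID = fopen(filename, 'r', 'n', 'windows-1252') ;
while ~feof(fileID)
    Data = textscan(fileID, strcat(char(format_varlist),'\r\n'), 250000, ...
        'Delimiter','|', 'HeaderLines', double(first_iteration==1), ...
        'EndOfLine','\r\n', 'EmptyValue',-1) ;
    % ... process "Data" as before ...
end
fclose(fileID) ;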
  6 comments
Walter Roberson on 12 May 2021
R2020a rather than R2019b.
File Encoding: Save MATLAB code files (.m) and other plain text files as UTF-8 encoded files by default
[...] When opening existing files, the Editor and other functions like type or fopen automatically determine the current encoding.
dpb on 12 May 2021
Edited: dpb on 13 May 2021
Huh. But nowhere in the documentation does it even hint that this feature can cause an out-of-memory error. That should definitely be highlighted alongside the description above, and fopen ought to report the specific cause and fix instead of just dumping the standard "out of memory" error message.


More Answers (2)

Shiva Kalyan Diwakaruni on 12 May 2021
Hi,
Refer to Memory Usage information located at the following URL:
Specifically to the sections:
1. Strategies for Efficient Use of Memory
2. Resolving "Out of Memory" Errors
Concepts:
1. Memory Allocation
2. Memory Management Functions
Some additional resources for resolving "Out of Memory" errors:
Hope it helps.

Steven Lord on 13 May 2021
I recommend you look at the functionality in MATLAB to process large files and big data. The approach you've described sounds like you could use a tall array backed by a tabularTextDatastore.
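As a rough illustration of that approach (not part of the original answer; "folderOfTxtFiles" and the option values are placeholders to adapt to your data), a tabularTextDatastore can be read in blocks much like the existing textscan loop:
% Build a datastore over all the "|"-delimited .txt files and read it in
% blocks of 250,000 rows, mirroring the original chunked import.
ds = tabularTextDatastore(folderOfTxtFiles, ...   % folder (or cell array of file names)
    'FileExtensions', '.txt', 'Delimiter', '|', ...
    'ReadSize', 250000) ;
while hasdata(ds)
    T = read(ds) ;   % next block of up to 250,000 rows, returned as a table
    % ... process T and append the reduced result, as in the original loop ...
end
% Alternatively, create a tall array backed by the datastore:
% tt = tall(ds) ;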
