MATLAB doesn't release memory when variables are cleared

I am working with very large data sets (~50 GB) in MATLAB R2015a. When I clear variables, MATLAB frequently does not release the memory. I have tried clear all, clear classes, clear java, and running the Java garbage collector. The only way I can get MATLAB to release the memory is to close and restart MATLAB. Is there a better way?

9 comments

How are you concluding that MATLAB is not releasing memory after clear?
I am curious. What sort of data is this? Do you actually need all 50 GB, or just part of it? Pre-processing might help you in the case of the latter. Can you actually hold 50 GB in memory? If yes, that's an impressive computer.
Where are your variables stored? E.g. if they are stored somewhere under a GUI, such as on the handles structure, then clear all will not clear them, as they are not in the workspace where clear all runs (assuming you type 'clear all' at the command line; doing this in the middle of program code is a very bad idea!)
I'm looking at the total physical memory usage in the Windows task manager.
And what sort of data are these and what are you doing with them?
Adam, the data are in the workspace, although I do use the data in a variety of GUIs. I have this problem even though I have closed all GUIs and figures that have used the data. I've had the problem even with data that has never been used in a GUI or a figure. I've tried clearing individual variables. Sometimes the memory is released when I clear the variables and sometimes it isn't. If the memory isn't released when I clear the individual variable, it isn't released when I clear all.
I'm working with large hyper-spectral images. I need to do a number of consecutive processing steps on the images. I keep having to save the data, shut down MATLAB, and reload because of out-of-memory errors. It's very frustrating. If I can't find a solution, I will have to move away from MATLAB.
Did you experiment with pack?
You can write a script to close MATLAB and continue with the next function in a new file, after saving variables to disk. Not ideal for sure, but it seems that, e.g., even shutting down parallel workers does not release a substantial chunk of RAM, and that fully closing is indeed the only option here.
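As a sketch of that workaround (step1.m, process_step1, and state.mat are hypothetical names): each processing step runs in its own MATLAB session, saves its state to a MAT-file, and exits, so the OS reclaims the whole address space before the next step starts:

```matlab
% step1.m -- hypothetical first processing step
load('state.mat');                      % pick up the previous step's results
result = process_step1(data);           % placeholder for your own processing
save('state.mat', 'result', '-v7.3');   % -v7.3 supports variables over 2 GB
exit;                                   % quit MATLAB so all memory is returned
```

A batch file can then chain the steps, e.g. `matlab -wait -r step1` followed by `matlab -wait -r step2` (on Windows, `-wait` makes each call block until that MATLAB session exits).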


Accepted Answer

dpb on 13 Dec 2016
It has more to do with Windows and how its memory-management routines decide when (or whether) memory an application has marked as unused is actually physically released. Also, even though there may be sufficient total free memory, it is free contiguous memory that is limiting when creating arrays; if there isn't a large enough contiguous block for the job, you're stuck.
There are guidelines to help, and also some newer techniques introduced more recently that you can try--
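On Windows, the contiguous-memory limit described above can be inspected with the built-in memory function (Windows only); MaxPossibleArrayBytes is the largest single array MATLAB could allocate right now, which can be much smaller than the total free memory:

```matlab
user = memory;   % Windows-only built-in; returns a struct of memory statistics
fprintf('Largest possible array: %.2f GB\n', user.MaxPossibleArrayBytes / 2^30);
fprintf('Memory for all arrays:  %.2f GB\n', user.MemAvailableAllArrays / 2^30);
fprintf('Memory used by MATLAB:  %.2f GB\n', user.MemUsedMATLAB / 2^30);
```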

More Answers (4)

You can force-clear a variable from memory by setting it to empty rather than calling clear on it:
x = [];
% vs.
clear x

2 comments

I have tried both setting the variables to [] and clearing them, and I have the same problem.
dpb on 15 Dec 2016 (edited 15 Dec 2016)
They're the same; all an application can do is mark the memory as "unused"; it's up to the OS to reclaim it; see the above.
In the olden days in FORTRAN with nothing but static memory allocation, the "trick" was to allocate a very large chunk of memory and then use it by subscripting within it judiciously.
Perhaps reusing existing memory from one of your processing steps to the next would be a similar possibility here: assign the output of each step to the same variable as previously used. MATLAB may still need to make copies if it can't tell that the memory can be overwritten safely, so it may not help; there again may not be sufficient contiguous memory for the temporary. But it's a tactic you could try.
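A sketch of that tactic (read_hyperspectral and process_step1/2 are placeholders for your own functions): assign each step's output over its input, so only one named copy of the data is ever alive:

```matlab
img = read_hyperspectral('scene.dat');  % placeholder for your own loader
img = process_step1(img);               % output overwrites the input variable
img = process_step2(img);               % so no second named copy is kept
clear img                               % mark the memory unused when done
```

MATLAB's in-place optimization is more likely to apply when a function's input and output arguments share the same name, e.g. `function x = process_step1(x)`.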
Alternatively, perhaps it's time for MEX files; or, if you can illustrate specifically what your processing steps are, perhaps with some additional background, others may have more efficient processing ideas.


Bonnie mentioned that clear all, clear classes, etc. didn't work, but what worked for me was using:
clearvars -global
This immediately reduced the memory devoted to MATLAB from 3.2 GB to 0.7 GB. In my case, one or two GUIs that had been closed were still occupying a lot of memory.
Previously, I've just done something like this, which seems to work:
% After deleting your large variable, save the base workspace to disk,
% clear it completely, then reload it:
evalin('base', 'save(''myVars'')');
evalin('base', 'clear');
evalin('base', 'load(''myVars'')');
Christian Schwermer on 16 Aug 2020 (edited 19 Aug 2020)
Hello,
I had a similar problem in my GUI, where I used a cell array as a FIFO buffer to acquire images. Memory usage increased with every session; only closing and restarting MATLAB released it:
bufferSize = 450;
frame_buffer = cell(1, bufferSize);
% ... acquire images into frame_buffer ...
flushdata(VideoInputObj)
delete(VideoInputObj)
frame_buffer(:) = {[]};
clear('frame_buffer')
imaqreset
When I preallocate the buffer for each cell, memory usage stays at a constant, acceptable level. Nevertheless, it wasn't possible to release the memory without restarting:
ROI = VideoInputObj.ROIPosition;
bufferSize = 450;
frame_buffer = cell(1, bufferSize);
frame_buffer(:) = {zeros(ROI(4), ROI(3), 'uint8')};
