Memory usage very high
I always have problems with MATLAB (R2019b) using too much memory (way more than the variables I have saved). Currently I'm running a function to extract data from a number of structures. I paused the function because the amount of RAM being used just doesn't make any sense. Task Manager says that MATLAB is using 4.7 GB of memory, even though I'm not running anything right now. The total size of all the variables in my workspace is ~0.055 GB and I have no figure windows open. The only two programs I have running on my computer are MATLAB and Task Manager. Is there any reason that MATLAB would be using so much memory, and is there a way for me to reduce it?
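(For reference, one way to total the workspace variables' memory at the command line, which is how a figure like ~0.055 GB can be checked:)
w = whos;
fprintf('Workspace total: %.3f GB\n', sum([w.bytes]) / 2^30);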
1 comment
Xin Niu
18 Apr 2024
It is not only the variables saved in your workspace that take memory. I had some code that took more than 300 GB of memory, and I finally found that it was due to a bug. The story: we used to save a 0-based timestamps variable. In an update, we switched to Unix time. A variable was created based on the max value of the timestamps, like var = 1:step:max(timestamps), so the variable became huge when we switched to Unix time.
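A minimal sketch of that failure mode (the step and timestamp values here are made up for illustration):
step = 0.001;
% With 0-based timestamps the vector is tiny:
timestamps = [0, 1.5, 3.2];          % seconds since recording start
v = 1:step:max(timestamps);          % ~3.2e3 elements, a few KB
% With Unix-epoch timestamps, max() is ~1.7e9, so the same line would
% try to build ~1.7e12 elements (terabytes of RAM) -- shown commented out:
% timestamps = [1713398400, 1713398401.5];  % seconds since 1970
% v = 1:step:max(timestamps);               % blows up memory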
Answers (3)
Jan
22 Nov 2019
Edited: Jan
29 Dec 2021
How do you observe the memory consumption? The Task Manager displays the memory reserved for MATLAB. If MATLAB allocates memory and releases it afterwards, it is not necessarily freed immediately. As long as no other application asks for the memory, it is efficient to keep it reserved.
Does it cause any trouble that the OS reserves 4.7 GB of RAM for MATLAB? Why do you say that this is "too much" memory?
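On Windows, the built-in memory command gives a finer-grained view than Task Manager of what the MATLAB process has actually used versus what the system has available:
% Windows only: report MATLAB's own view of its memory usage.
[userview, systemview] = memory;
fprintf('Used by MATLAB: %.2f GB\n', userview.MemUsedMATLAB / 2^30);
fprintf('Free physical memory: %.2f GB\n', systemview.PhysicalMemory.Available / 2^30);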
Although the current memory consumption is small, growing arrays can temporarily need much more memory. Example:
x = [];                % start empty; x grows on every iteration
for k = 1:1e6
    x(k) = k;          % each assignment may reallocate and copy x
end
Although the final array x occupies only 8 MB of RAM (plus about 100 bytes for the header), the intermediate need for RAM is much higher: sum(1:1e6)*8 bytes ≈ 4 terabytes of total allocations. Explanation: if x = [1], the next step x(2) = 2 duplicates the former array and appends a new element. Although the intermediately used memory is released, there is no guaranteed time limit for the freeing.
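For contrast, a sketch of the preallocated version, which performs a single allocation:
x = zeros(1, 1e6);     % one 8 MB allocation up front
for k = 1:1e6
    x(k) = k;          % writes in place; no reallocation or copying
end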
Can you post some code which reproduces the problem?
12 comments
Walter Roberson
30 Dec 2021
"Do you mean the clear commands make the code slower?"
If the variable is reused, then yes, "clear" makes the code slower. MATLAB does flow-control analysis to track potential type changes in variables. When "clear" is used, then at all points after that in the code, MATLAB has to mark the variable as being of unknown type and re-look-up its methods each time (because the flow-control analysis is static, and at run time MATLAB has to face the possibility that the variable might not be re-defined on some paths).
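A minimal illustration of the pattern under discussion (the cost itself is inside the JIT and not directly observable in code):
v = rand(1, 10);    % the analyzer can track v's type from here on
disp(sum(v));
clear v             % from this point on, v's type is unknown statically
v = rand(1, 10);    % v is re-created, but per the comment above, MATLAB
disp(sum(v));       % may have to re-resolve its methods at run time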
Bruno Luong
30 Dec 2021
"When "clear" is used, then in all points after that in the code, MATLAB has to mark the variable as being of unknown type and re-look-up methods for it each time "
It does not make sense to me. The analyzer should be able to know the type of the variable when it is created again after clear.
And maybe the analyzer won't be able to always track the type even without clear, for instance when a branch decision depends on a run-time value, or when the assignment is buried deep inside a class method or function.
Jose Sanchez
28 Jan 2020
I am having a similar issue while running on an HPC cluster.
My university cluster allows me to use up to 520 workers, where each HPC node (4 workers) has 8 GB of RAM. I checked that the RAM consumed inside my parfor loop was no higher than 500 MB. However, when I run on the cluster using 100 parallel processes, the cluster crashes with an "Out of Memory" error.
Then I did a test running locally on my PC (32 GB RAM), and I can see clearly that every worker is consuming over 2 GB of RAM, which is more than 5 times the amount of RAM consumed within each parfor.
In my opinion, MATLAB is clearly doing something that is not working as expected! I didn't notice this issue with the HPC MATLAB version 2017a, despite using our cluster very often.
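One common workaround is to size the pool so the measured per-worker footprint fits the node's RAM. A sketch with assumed numbers; heavyWork is a hypothetical placeholder for the actual task:
c = parcluster('local');
c.NumWorkers = 3;                  % assumption: ~2 GB/worker on an 8 GB node
pool = parpool(c, c.NumWorkers);
results = zeros(1, 100);
parfor i = 1:100
    results(i) = heavyWork(i);     % placeholder for the real per-iteration job
end
delete(pool);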
4 comments
Mohammad Sami
20 May 2020
R2020a introduced a new thread-based parallel pool. It has some limitations compared with the process-based pool. If your code is compatible with the thread-based pool, you can use it instead to reduce the memory overhead of process-based workers; see the sketch below.
You can read more details here
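A minimal sketch of switching to a thread-based pool (R2020a or later); thread workers share the client process's memory, so there is no per-worker process overhead:
pool = parpool('threads');     % thread-based pool instead of process-based
n = 100;
y = zeros(1, n);
parfor i = 1:n
    y(i) = i^2;                % any thread-pool-compatible code works here
end
delete(pool);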
tianyuan wang
30 Jul 2020
I have the same problem. If I call MATLAB on the control node of an HPC system (one control node, 17 computing nodes, distributed memory) to process a very large matrix, can the new version of MATLAB solve the problem of insufficient memory on a single node?
Christian Schwermer
16 Aug 2020
Hello,
MATLAB doesn't release memory if you didn't declare the variable as an output of the function.
Best regards
4 comments
Bruno Luong
19 Aug 2020
"I think every variable has to be preallocated exactly. Changing size while a function is executing, accumulates memory usage. I think when the size changes the complete array is copied and saved, without changing its name. Original array stays in memory, and therere is no possibility to delte, because its name doesn't exist anymore."
Hmmm no, sorry, but this is totally incorrect; MATLAB is not that stupid. It has cross-links within its internal structures to keep track of data sharing; it releases memory and updates the cross-links when a variable is cleared or its content is overwritten; there is also a garbage collector that destroys local variables when a function ends, etc. The user's variable name has nothing to do with this management.
The latest MATLAB version might not work exactly like that, but there is no memory leak as you state.
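A small sketch of the copy-on-write data sharing described above (array size chosen for illustration):
a = rand(1e4);      % 1e4-by-1e4 doubles, ~800 MB
b = a;              % no copy yet: b shares a's data (copy-on-write)
b(1) = 0;           % first write to b triggers the actual ~800 MB copy
clear a             % drops a's reference; the data is freed once unshared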