Why does an interactive MATLAB job require less memory than a non-interactive job on a cluster?

Fan Li on 12 Jan 2018
Commented: Fan Li on 14 Jan 2018
Hi everyone
I am running MATLAB on my school's cluster (Linux). The original data read into MATLAB is up to 4 GB, and my code also needs a 24 GB array for one calculation. For an interactive MATLAB job I requested 12 cores and 24 GB of memory with this command (qsh -pe smp 12 -l h_vmem=2G matlab=12), and the job runs successfully.
However, when I requested 12 cores with 50 GB for a non-interactive job, it failed partway through my code. I then increased the memory to 80 GB; the job ran further but still stopped. Even using the clear command to free the big arrays did not help!
Can anyone tell me what is wrong with the non-interactive job?
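
For context: on Grid Engine schedulers, h_vmem is typically enforced per slot, so the interactive request above works out to 12 x 2 GB = 24 GB in total. A minimal sketch (the array name and size are placeholders) for logging the workspace footprint just before the step that fails, so the batch log records the real requirement:

    bigArray = zeros(1e7, 1);      % placeholder for the large array (~80 MB here)
    info = whos('bigArray');       % bytes currently held by that variable
    fprintf('bigArray uses %.2f GB\n', info.bytes / 2^30);
    clear bigArray                 % releases it inside MATLAB, but the process
                                   % may not return memory to the OS, so the
                                   % scheduler-visible footprint can stay high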
2 comments
Kojiro Saito on 13 Jan 2018
Which function do you use for the non-interactive job: parfor, batch, or spmd? One point to note is that parfor has a transparency restriction, so please take a look at the documentation on Transparency in parfor.
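A hedged illustration of that restriction (hypothetical loop body; running it in parallel needs Parallel Computing Toolbox):

    s = zeros(1, 4);               % sliced output, preallocated
    parfor k = 1:4
        data = rand(100);          % plain assignment is allowed
        s(k) = sum(data(:));       % writing to a sliced variable is allowed
        % clear data               % uncommenting this raises a transparency
    end                            % violation: clear is not transparent in parfor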
Fan Li on 14 Jan 2018
Hi, I just used a regular for loop, and there is no batch or spmd in my code. Any other suggestions?
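
In a plain for loop the usual memory pitfall is different: growing an array inside the loop, or keeping more than one large temporary alive at a time. A short sketch with placeholder sizes:

    n = 5;
    result = zeros(n, 1);          % preallocate the small output once
    for k = 1:n
        bigTmp = rand(2e3);        % stand-in for a large per-iteration array (~32 MB)
        result(k) = mean(bigTmp(:));
        clear bigTmp               % legal in a plain for loop; frees it before the
    end                            % next iteration allocates a new one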


Answers (0)
