Why does an interactive MATLAB job require less memory than a non-interactive job on a cluster?

Hi everyone,
I am running MATLAB on my school's cluster (Linux). The original data read into MATLAB is up to 4 GB, and my code also needs a 24 GB array for the calculation. I requested 12 cores and 24 GB of memory for an interactive MATLAB job with this command: qsh -pe smp 12 -l h_vmem=2G matlab. That job runs successfully.
However, when I requested 12 cores with 50 GB for a non-interactive job, it failed partway through my code. I then increased the memory to 80 GB; the job ran further but still stopped. Even using the clear command to release the big arrays did not help.
Can anyone tell me what is wrong with the non-interactive job?
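One common source of confusion here (a hedged guess, since the exact scheduler configuration is not shown in the thread) is that on many Grid Engine clusters h_vmem is enforced per slot, not per job. So "-pe smp 12 -l h_vmem=2G" actually grants 12 × 2 GB = 24 GB in total, while a batch request written with a single large number may be interpreted differently by the queue. A minimal sketch of a batch script under that assumption, with illustrative values:

```shell
#!/bin/bash
# Sketch of an SGE batch script. Slot counts and memory values are
# illustrative assumptions, not taken from the thread.
# If h_vmem is a PER-SLOT limit on this cluster, the usable total is
# slots * h_vmem, so 12 slots at 7G each would give 84 GB overall.
#$ -pe smp 12
#$ -l h_vmem=7G
#$ -cwd

slots=12
per_slot_gb=7
total_gb=$((slots * per_slot_gb))
echo "requested total: ${total_gb} GB"

# The MATLAB invocation itself would look something like:
#   matlab -nodisplay -nosplash -r "my_script; exit"
```

If the cluster instead treats the batch request as a whole-job limit, the same numbers mean something very different, which is worth confirming with the cluster documentation or admins.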

2 Comments

Which function do you use for the non-interactive job: parfor, batch, or spmd? One point is that there is a transparency restriction in parfor, so please take a look at the documentation on Transparency in parfor.
Hi, I just used a regular for loop, and there is no batch or spmd in my code. Any other suggestions?
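Since a plain for loop rules out parfor transparency issues, one next step (a diagnostic sketch, not a confirmed fix) is to print the limits the scheduler actually applies inside the non-interactive job; interactive and batch queues can impose different ones, which would explain the different failure behavior:

```shell
#!/bin/bash
# Diagnostic sketch: place these lines at the top of the batch job script
# to see which limits the scheduler actually applied to this job.
echo "virtual memory limit (kB): $(ulimit -v)"   # "unlimited" or a number
echo "max resident set (kB):     $(ulimit -m)"
# Node-wide memory picture (Linux); harmless to skip if 'free' is absent.
free -g 2>/dev/null | head -2 || true
```

Comparing this output between the interactive session and the failing batch job would show whether the batch queue is the one imposing the tighter cap.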


Answers (0)


Asked: 12 Jan 2018
Commented: 14 Jan 2018
