How do I completely reset MATLAB's workspace?

How can I reset MATLAB's workspace (environment) as if I had restarted it, without actually restarting it (to avoid the overhead)? Perhaps specifically to trigger garbage collection?
This is in order to resolve the problem of MATLAB degrading and becoming slower over time, probably due to poor memory management.

3 Comments

Isn't this exactly what
clear
does?
clear all
will also remove functions and classes from the cache, which may further help to trigger garbage collection (for class objects), though it may not.
As I mention in the second link, running the following every iteration doesn't help:
clear classes; clear functions; dbclear all; clear all; pack;


Answers (1)

Jan on 19 Sep 2017
Edited: Jan on 19 Sep 2017

8 Comments

Zohar on 20 Sep 2017
Edited: Zohar on 20 Sep 2017
Currently, I'm running MATLAB code from the C++ API, and I restart MATLAB similarly. It just doesn't feel so clean, and MATLAB takes time to load. I'm not sure why there isn't a command just to reset it. Moreover, note that if I simply ran a script, as in my first link, then your solution wouldn't have worked: I would have needed to save the state of the script before restarting, and then automatically continue running it. A hassle.
save data.mat
system('matlab -r yourscript.m &')
exit
Now include load('data.mat') in your script, and everything is reset cleanly and restarted. So what makes you think that it "just doesn't feel so clean"? This works; it is not a hassle.
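Put together, the script side of this restart trick might look like the following minimal sketch (yourscript.m and data.mat are the names from the example above; the isfile guard is an addition so the very first run, with nothing saved yet, also works):
```matlab
% yourscript.m -- script name taken from the example above.
% On startup, restore any state saved before the previous restart.
if isfile('data.mat')   % isfile needs R2017b+; use exist('data.mat','file') on older releases
    load('data.mat')
end

% ... long-running work goes here ...

% When memory gets tight, save state and hand off to a fresh MATLAB:
save('data.mat')
system('matlab -r yourscript &')   % '-r' takes a command, so omit the .m extension
exit
```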
Dear @Jan,
I am following your recommendation, but this does not seem to work on Linux. I have R2017a on Ubuntu 16.04. Removing the '&' does launch a new window but keeps the first one open, as expected. Is there any way out that you know of?
Thanks!
system('nohup matlab -r yourscript.m 2>&1 &')
When I run
system('matlab &')
exit
in macOS, it just closes MATLAB. Any solution?
The nohup version I posted should prevent that problem.
I'm coming back for more.
I'm using parallel processing to analyze many images at once. If I run the code once, everything is fine, but as soon as I run it a second time, I get an 'Out of memory' error. This is annoying because I essentially have to exit and restart MATLAB every time. Moreover, the updated solution Mr. Robertson has provided doesn't seem to work for me. Please also note that I upgraded my laptop's RAM to 32 GB at 3200 MHz (up from 8 GB at 2667 MHz) expressly for the purpose of running parallel processing.
This means I need to reconnect to multiple workers (one for each physical core I want to use) every time I need to restart the code. Granted, it's still WAAAAAAAAAY faster than analyzing sequentially (which, for me, took more than an hour for five measly images, compared to about 7 minutes with 5 cores), but it still feels unpolished. Any solution would be greatly appreciated!
Thanks in advance! :)
LP
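One way to avoid reconnecting workers on every run, sketched below under the assumption that the Parallel Computing Toolbox is available and 5 workers are wanted (as in the comment above): reuse a single pool across runs and clear the workers' workspaces between runs, instead of tearing the pool down and rebuilding it.
```matlab
pool = gcp('nocreate');   % look up the current pool without creating a new one
if isempty(pool)
    pool = parpool(5);    % start 5 workers once; subsequent runs reuse them
end

% Between runs, clear leftover variables on every worker to curb memory growth.
f = parfevalOnAll(pool, @clear, 0, 'all');
wait(f);                  % block until all workers have finished clearing
```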
I suggest using the (newer) -batch option to matlab.
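For instance (hypothetical script name; -batch is available since R2019a and, unlike -r, runs noninteractively, prints output to stdout, and exits MATLAB with the script's status, so no explicit exit call is needed):
```shell
matlab -batch "yourscript"
```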


Asked: on 19 Sep 2017
Commented: on 2 Sep 2022
