Efficient way to work with ~2k variables of size ~400x400x3

Hello,
I am running an optimization algorithm that iterates over ~2k variables, each representing an image of size ~400x400x3. In each iteration I need to load these variables, update them, and move on to the next round, something like this:
for iter = 1:100
    for var_no = 1:1225
        % Load var1_i and var2_i from disk
        load(['var1_', num2str(var_no), '.mat']);
        load(['var2_', num2str(var_no), '.mat']);
        % Update var1_i and var2_i
        var1_i = update_var1(var1_i);
        var2_i = update_var2(var2_i);
        % Save them again
        save(['var1_', num2str(var_no), '.mat'], 'var1_i', '-v7.3');
        save(['var2_', num2str(var_no), '.mat'], 'var2_i', '-v7.3');
    end
end
As can be expected, my code is extremely time consuming, and because the variables are so large and numerous, I cannot keep them all in RAM. Can somebody please suggest an efficient way to carry this out? It has become a huge bottleneck in my run time. Thanks so much!
AA on 30 Jan 2018
Hello, sorry, I missed adding that changing the inner loop to parfor (as all the variables are independent) alleviates the problem to an extent, but it is still very time consuming due to the load/save operations.
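(For reference, a minimal sketch of that parfor variant — `update_var1`/`update_var2` are the placeholders from the question, and `parsave` is a hypothetical helper, needed because calling save directly inside a parfor body violates its transparency rules:)

for iter = 1:100
    parfor var_no = 1:1225
        % Assign load's output to a struct; load without an output
        % is not allowed inside parfor either
        s1 = load(['var1_', num2str(var_no), '.mat']);
        s2 = load(['var2_', num2str(var_no), '.mat']);
        var1_i = update_var1(s1.var1_i);
        var2_i = update_var2(s2.var2_i);
        parsave(['var1_', num2str(var_no), '.mat'], 'var1_i', var1_i);
        parsave(['var2_', num2str(var_no), '.mat'], 'var2_i', var2_i);
    end
end

function parsave(fname, varname, value)
    % Hypothetical helper: save one named variable to a -v7.3 MAT-file
    S.(varname) = value;
    save(fname, '-struct', 'S', '-v7.3');
end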


Accepted Answer

Guillaume on 1 Feb 2018
"I cannot keep them in RAM"
Why not? 400x400x3x2000 elements is under 8 GB of memory as doubles. Don't you have that much memory available on a computer capable of running the Parallel Computing Toolbox? If not, I'd really consider upgrading the memory.
Furthermore, since you're processing images, it's very possible that your images started as 8-bit integers, in which case keeping them all in memory would only require about 960 MB.
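(To make the in-RAM approach concrete, a rough sketch, shown for the var1 files only — the 400x400x3 size, the ~2k count, the file naming and `update_var1` all come from the question; `all_vars.mat` is just an illustrative name:)

n = 2000;
imgs = zeros(400, 400, 3, n, 'single');   % ~3.8 GB; 'uint8' would be ~0.96 GB
for k = 1:n
    s = load(['var1_', num2str(k), '.mat']);
    imgs(:,:,:,k) = s.var1_i;             % class conversion happens here
end
for it = 1:100
    parfor k = 1:n
        % imgs is sliced along dimension 4, so parfor accepts it
        imgs(:,:,:,k) = update_var1(imgs(:,:,:,k));
    end
end
save('all_vars.mat', 'imgs', '-v7.3');    % hit the disk once, at the end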
Matt J on 2 Feb 2018
What format did your images start as? If they were 8-bit RGB images (the most common format), then keeping 5000 images of size 700x700 (x3 colour channels) as uint8 would only require about 7.4 GB of data.
This assumes, though, that the iterative update process can tolerate uint8 precision.
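(The arithmetic behind those figures, for the record:)

% 5000 RGB images of 700x700 pixels:
700*700*3*5000 * 1   % uint8:  7.35e9 bytes, about 7.4 GB
700*700*3*5000 * 8   % double: 5.88e10 bytes, about 58.8 GB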
AA on 3 Feb 2018
Hello Matt and Guillaume,
Thanks for all your help. I am now able to store them all in memory and run them in parfor at the same time. I simply stored the entire stack as single-precision variables (since my algorithm does not need to generate very precise results), which cut the memory consumption in half. The performance is in an acceptable range now.
Thanks again!
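(The conversion AA describes amounts to something like this, assuming a 4-D stack such as the `imgs` array sketched in the answer above:)

imgs = single(imgs);   % double -> single halves the footprint, ~7.7 GB -> ~3.8 GB
whos imgs              % confirm the bytes actually used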


More Answers (1)

Matt J on 30 Jan 2018
You could gain some speed-up using matfile to access the data instead of load/save.
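(A sketch of what that could look like for the files in the question — `update_var1` is the question's placeholder; note that indexing into part of a matfile variable requires the -v7.3 format the question already saves in:)

for var_no = 1:1225
    m1 = matfile(['var1_', num2str(var_no), '.mat'], 'Writable', true);
    % Reading m1.var1_i pulls in the whole variable; the real win comes
    % from updating only a slice, e.g. m1.var1_i(1:100,:,:) = ...
    m1.var1_i = update_var1(m1.var1_i);
end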
AA on 1 Feb 2018
Dear Matt,
I tried that, but unfortunately matfile object access is not permitted inside parfor loops (my inner loop is a parfor, which gives me some performance gain over a plain for). Do you have another way out?
Thanks!
Matt J on 1 Feb 2018
Edited: Matt J on 1 Feb 2018
I don't see that limitation in the matfile documentation. Where did you read it? The following simple example worked fine for me:
a = 0;
save tst1 a
save tst2 a
save tst3 a
m{1} = matfile('tst1', 'Writable', true);
m{2} = matfile('tst2', 'Writable', true);
m{3} = matfile('tst3', 'Writable', true);
parfor i = 1:3
    m{i}.a = i;   % each worker writes through its own matfile object
end
load tst1; a
load tst2; a
load tst3; a

