saving variables in a single .mat file

Hello,
I have 360 .mat files, each containing the same variable 'in' with different data: a row vector of size 1x3800000. Together the 360 files are 9.84 GB.
Now I want to save them all in one .mat file as a matrix out(360x3800000).
How can I do it?
  2 Comments
Daniel Shub on 26 Mar 2012
What problems are you running into?
zozo on 26 Mar 2012
I need to extract some data from each of these 360 files (row vectors) and do further processing. So I do not want to load them one by one, extract data, load again, and so on. Having them all together in a cell/array makes it far easier.


Accepted Answer

ndiaye bara on 26 Mar 2012
Try this code:
m = zeros(3800000, 360);                        % preallocate one column per file
for k = 1:360
    S = load(sprintf('F_%d.mat', k), 'in');     % F_%d = name pattern of your .mat files
    m(:, k) = S.in(:);                          % store the k-th vector as column k
    disp(k);
end
save('File.mat', 'm', '-v7.3');                 % save the new .mat file (larger than 2 GB needs -v7.3)
  4 Comments
zozo on 27 Mar 2012
The same variable 'in' was saved each time, but with different file names: data(1).mat, data(2).mat, data(3).mat, ..., data(360).mat.
Please suggest the syntax for my case.
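For that naming scheme, a minimal sketch of the loop (assuming each data(k).mat holds the row vector 'in', the 360 x 3800000 result fits in memory, and out.mat is a hypothetical output name):
out = zeros(360, 3800000);                        % about 10.9 GB of doubles
for k = 1:360
    S = load(sprintf('data(%d).mat', k), 'in');   % load only the variable 'in'
    out(k, :) = S.in;                             % row k of the combined matrix
end
save('out.mat', 'out', '-v7.3');                  % MAT files larger than 2 GB need -v7.3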
Jan on 27 Mar 2012
Dear zozo, accepting an answer means that it solves your problem.
Daniel and I have warned you that you cannot load 11 GB of data (360*3800000*8 bytes per double) efficiently if you have only 4 GB of RAM. I assume you need 32 GB of RAM to work efficiently with such large data; 64 GB is safer.
The above EVAL approach is cruel.
You did not specify in which format you want to store the data: DOUBLEs or SINGLEs, an integer type, as a cell or a matrix. Anyhow, I'm convinced that it is the wrong approach due to the limited memory.
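As a quick check of that arithmetic (the byte counts follow directly from the element count and the chosen type):
nElements = 360 * 3800000;        % 1,368,000,000 elements
nElements * 8                     % double: 10,944,000,000 bytes, about 10.2 GiB
nElements * 4                     % single:  5,472,000,000 bytes, about  5.1 GiB
nElements * 2                     % int16:   2,736,000,000 bytes, about  2.5 GiB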


More Answers (2)

Jan on 26 Mar 2012
Do you have a 64-bit Matlab version? How much RAM do you have installed? Do you want to store the values in one 360 x 3'800'000 array, in a most likely more useful 3'800'000 x 360 array, or as separate vectors, e.g. in a {1 x 360} cell? The latter has the advantage that it does not need a contiguous free block of memory.
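A minimal sketch of that {1 x 360} cell variant, assuming the data(k).mat file names given in the comments above; each vector stays a separate array, so no single contiguous ~11 GB block is needed:
C = cell(1, 360);                                 % one cell per file
for k = 1:360
    S = load(sprintf('data(%d).mat', k), 'in');
    C{k} = S.in;                                  % keep each 1 x 3800000 vector separate
end
% the data of file k is then C{k}, e.g. C{17}(1:100)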
  4 Comments
Jan on 26 Mar 2012
4GB RAM is very lean for such a big chunk of data. If it is really necessary to keep all values in the RAM simultaneously, buy more RAM. Implementing workarounds to process the data in pieces will be more expensive.
Jan on 26 Mar 2012
@Siva: Does your comment concern the current topic? If so, please explain the connection. If not, please delete the comment and post it as a new question - with more details. Thanks.



Daniel Shub on 26 Mar 2012
In a comment to Jan you say you have 4 GB of RAM. Loading 9+ GB of data is going to bring your computer to a screeching halt.
Try and create an array of the required size and see what happens ...
x = randn(360, 3800000);
  2 Comments
zozo on 26 Mar 2012
Yes, I think I will load 7x(50x3800000)+10 at a time. But I am having a problem loading them into 1 file as suggested by @ndiaye.
Daniel Shub on 27 Mar 2012
Why? Nobody wants a 1+GB data file. Leave the files small and load them as needed. I doubt there is much benefit in doing a single huge load.
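A minimal sketch of that load-as-needed pattern, assuming the data(k).mat names from the comments and a hypothetical per-file processing step:
result = zeros(1, 360);                           % one summary value per file (example)
for k = 1:360
    S = load(sprintf('data(%d).mat', k), 'in');   % only the current file is in memory
    result(k) = mean(S.in);                       % replace with your actual processing
end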

