Out of memory error for large data

I am working with a 50000*2048 matrix. I want to run the simulation only once and save the data in an HDF5 (.h5) file, but I am running into memory problems and cannot save the data to, or load it from, the .h5 file. Is there any way to store data this large?
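One way to avoid holding the whole array in memory is to write it to the .h5 file in blocks of rows. A minimal sketch, under two assumptions: HDF5 has no native complex type, so real and imaginary parts are stored as separate datasets here (one common convention), and `run_block` is a hypothetical placeholder for whatever computes one block of your simulation.

```matlab
rows = 50000; cols = 2048; blk = 1000;   % write 1000 rows at a time
h5create('sim.h5', '/re', [rows cols], 'ChunkSize', [blk cols]);
h5create('sim.h5', '/im', [rows cols], 'ChunkSize', [blk cols]);
for r = 1:blk:rows
    A = run_block(r, blk);               % hypothetical: simulate rows r..r+blk-1
    h5write('sim.h5', '/re', real(A), [r 1], [blk cols]);
    h5write('sim.h5', '/im', imag(A), [r 1], [blk cols]);
end
```

With this layout only one 1000*2048 block is ever in memory during the write.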

9 Comments

Is it possible to do your processing in tiles, or do you need to operate on the entire matrix at once? For example, you could break it up into 50 1000*2048 matrices and load only the sections you need by specifying the start and count in h5read.
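For instance (assuming the data sits in a dataset named '/re' in a file 'sim.h5'; both names are placeholders), a single 1000-row tile can be read with the start and count arguments of h5read:

```matlab
% Read only rows 1..1000, all 2048 columns, of one dataset:
% start = [1 1], count = [1000 2048]
block = h5read('sim.h5', '/re', [1 1], [1000 2048]);
```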
I wish I could, but I need the entire matrix to compute abs()^2 of it and plot it against other variables.
That array is not even 1 gigabyte. It is not clear where the memory problem is coming from.
It is more than 1 GB.
I tried saving the data in a .mat file and could load it, but when the program proceeds to the remaining steps I get a memory error.
It doesn't seem like you'd need the entire matrix for abs()^2. The only way those operations make sense is if they're performed elementwise (since your matrix isn't square, the ^2 operation isn't well-defined as matrix multiplication). However, you might see if tall arrays solve your memory trouble satisfactorily; you'll just need to make sure all of the functions you use in MATLAB are compatible.
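Because abs(A).^2 is elementwise, it can also be computed one tile at a time without ever loading the full complex matrix. A rough sketch, assuming real and imaginary parts were saved as separate datasets '/re' and '/im' in 'sim.h5' (placeholder names):

```matlab
rows = 50000; cols = 2048; blk = 1000;
P = zeros(rows, cols);                   % result is real, so half the bytes
for r = 1:blk:rows
    re = h5read('sim.h5', '/re', [r 1], [blk cols]);
    im = h5read('sim.h5', '/im', [r 1], [blk cols]);
    P(r:r+blk-1, :) = re.^2 + im.^2;     % equals abs(A).^2 elementwise
end
```

The accumulated result P is a real double array, needing half the memory of the complex original, and can then be plotted directly.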
50 * 1000 * 2048 * 8
ans = 819200000
ans / 10^9
ans = 0.8192
Not even 1 gigabyte. Not unless your matrices are complex-valued or are not numeric.
yes it is complex
Okay, so double the storage for complex still only gets you about 1.6 gigabytes. That would not be a problem unless you have quite a small amount of memory (such as 4 gigabytes).
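As a quick check of that estimate (a complex double takes 16 bytes per element, 8 real plus 8 imaginary):

```matlab
bytes = 50000 * 2048 * 16;   % 16 bytes per complex double element
gb = bytes / 1e9             % about 1.64 GB
```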
I will try to solve it. Thank you.



Asked: 19 Apr 2021
Answered: 7 May 2021
