# out of memory error

Zahra Kamali on 20 Jun 2022
Commented: Sam Chak on 22 Jun 2022
Hello,
I wonder if you could help me with my code, please. I need to make a big matrix (e.g. T = zeros(1e8,27)), but I get an out-of-memory error and cannot go beyond 9.5e7 rows. However, I need to go even to 1e20. What would be the best way to solve this issue?
Thanks
Sam Chak on 22 Jun 2022
How long did your code take to run in R2022a?
Unsure if it works, but you can try something like this:

```matlab
T = sparse(9.5e4, node_num);  % allocate as sparse rather than full
[p, n] = size(T);
G = p/100;                    % work on 1/100 of the rows at a time
TT = sparse(G, n);
```
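A more explicit version of that idea is to stream over the rows in fixed-size blocks instead of allocating everything at once. This is only a sketch; the block size and the `process_block` call are hypothetical stand-ins for whatever computation fills and consumes each block:

```matlab
total_rows = 1e8;   % rows you ultimately need to cover
ncols      = 27;
block      = 1e6;   % rows held in memory at any one time

for first = 1:block:total_rows
    last = min(first + block - 1, total_rows);
    T = zeros(last - first + 1, ncols);  % only one block lives in RAM
    % ... fill T for rows first..last, then consume the result here,
    % e.g. (hypothetical) results(first:last) = process_block(T);
end
```

Only 1e6-by-27 doubles (about 216 MB) are ever resident at once, no matter how many total rows you sweep over.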

John D'Errico on 20 Jun 2022
Edited: John D'Errico on 20 Jun 2022
Um, 1e20 rows? Seriously? Do you have that much memory? Clearly not, since your computer is already choking on a 1e8-by-27 array. Even that array requires a large chunk of RAM. Remember that every double-precision element uses 8 bytes of memory.
```matlab
1e8*27*8/1e9
ans = 21.6000
```
So the simple array you tried to allocate needs roughly 21.6 gigabytes of RAM. While I have twice that much on my computer, even mine would probably fail to work with such an array, because almost anything you do with it will make copies. A simple rule of thumb: never make an array that is more than, say, 1/3 of the RAM you have.
But to make arrays that are significantly larger than that? BE SERIOUS. Do you have
```matlab
1e20*27*8/1e9
ans = 2.1600e+13
```
So 21 trillion gigabytes of RAM? Again, to be conservative, you probably want at least 3 times that. Let's see...
1024 Gigabytes = 1 Terabyte
1024 Terabytes = 1 Petabyte
1024 Petabytes = 1 Exabyte
1024 Exabytes = 1 Zettabyte
So you need roughly 21 Zettabytes of RAM. There are few computers in the world with that much memory installed. I'd be willing to bet I can count the number of such machines on the fingers of one foot.
The problem with computers is that people think they are infinitely large and infinitely fast. People get spoiled. They can solve a small problem, so, what the heck, they'll just pose a much larger one. And then they get upset when they run out of memory. A simple truism about computing:
Computing problems always expand to be just a bit larger than any computer you can find.
I'm sorry, but you can't do what you want, at least not the way you want to solve it. At least not until you buy that quantum computing machine that Starfleet Command sells. The problem there is the shipping: they need to send it from the future, so the shipping costs tend to be really high. Of course, you could contact the Borg, since they ship everything for free, but do you really want the Borg visiting your home or place of work?
Far better is to re-think your problem. Good mathematics and good numerical methods are often the solution to a problem that is too large to handle by brute force.
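One common re-think: if the matrix is mostly zeros (often the reason such huge dimensions seem necessary in the first place), sparse storage only pays for the nonzero entries. A minimal sketch, assuming only a handful of entries are ever nonzero:

```matlab
S = sparse(1e8, 27);   % 1e8-by-27, but no elements stored yet
S(5, 3)    = 1.5;      % each assignment allocates memory per nonzero
S(9e7, 27) = -2;
whos S                 % a few dozen bytes per stored entry, not 21.6 GB
```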

Walter Roberson on 20 Jun 2022
Edited: Walter Roberson on 21 Jun 2022
IBM has a 120 petabyte drive. That is about 1.2e17 bytes.
The desired 1e20-by-27 array of doubles would need about 2.16e22 bytes, roughly 180,000 times larger than that drive.
In some cases, you can use Tall Arrays, but you have to have backing drive space for that, and we can be very certain that you do not have access to that much disk space.
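For reference, the tall-array workflow looks roughly like this. It assumes the data already lives in files on disk (here a hypothetical set of CSV files with a column named `Var1`) rather than being created in RAM:

```matlab
ds = datastore('mydata*.csv');  % hypothetical files backing the array
T  = tall(ds);                  % tall table: rows stay on disk
m  = mean(T.Var1);              % deferred; nothing is read yet
gather(m)                       % triggers the actual pass over the data
```

Operations on `T` are evaluated lazily, so only gather forces MATLAB to stream the files through memory in chunks.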

Steven Lord on 21 Jun 2022
The maximum number of rows a matrix can have in MATLAB is the second output of the computer function, and you're unlikely to have enough memory to get anywhere near that limit unless you create your matrix as a sparse matrix (which is feasible here, since you're trying to create a tall, thin matrix).
```matlab
[~, maxsize] = computer
maxsize = 2.8147e+14
A = sparse(maxsize, 27)
A =
   All zero sparse: 281474976710655×27
B = sparse(maxsize + eps(maxsize), 27)
Error using sparse
Sparse matrix sizes must be nonnegative integer scalars less than MAXSIZE as defined by COMPUTER. Use HELP COMPUTER for more details.
```
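If you do go the sparse route, it is far cheaper to build the matrix from (row, column, value) triplets in a single call than to grow it one element at a time. A sketch with made-up indices and values:

```matlab
i = [1; 5e7; 9.5e7];           % row indices of the nonzeros
j = [1; 13;  27];              % column indices
v = [3.2; -1; 7];              % values
S = sparse(i, j, v, 1e8, 27);  % 1e8-by-27 with exactly three stored entries
```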