Building sparse matrix inside parfor
I'm building a large sparse matrix in smaller pieces. Unfortunately the pieces overlap a bit. At the moment I build each piece to match the final size, and after the loop I sum the pieces together. I have tried the following approaches:
1) Summing up sparse matrices inside the parfor. Bad idea: it produces a full matrix.
2) Build index vectors for each piece of the matrix and combine the index vectors inside the parfor, then use one sparse command after the loop to build the final matrix. This, unfortunately, is rather slow. The reason might be the repeated entries that the sparse command needs to sum up.
3) Build a sparse matrix for each piece and store them in a cell array inside the parfor, then sum up the sparse matrices inside a regular for loop. This is the best so far; fast and reliable. (See the pseudocode below.)
4) This is the problematic case: build sparse vectors out of each piece and store them in a cell array, then sum up the sparse vectors inside a regular for loop and reshape to a matrix. Unfortunately, for larger systems it crashes with
Error using parallel_function (line 598)
Error during serialization
Error stack:
remoteParallelFunction.m at 31
As a for loop it runs just fine.
Below is some pseudocode to shed light on what I'm doing:
First, option 3), which always works:
Aset = cell(1,Nsets);
parfor S = 1:Nsets
    % Do lots of stuff to get iind, jind, Aval
    Aset{S} = sparse( iind, jind, Aval, Ndof, Ndof );
end
A = Aset{1};
for S = 2:Nsets
    A = A + Aset{S};
end
Then option 4), which gives the error:
Aset = cell(1,Nsets);
parfor S = 1:Nsets
    % Do lots of stuff to get iind, jind, Aval
    matind = iind + Ndof*( jind-1 );
    Aset{S} = sparse( matind, ones(size(matind)), Aval, Ndof*Ndof, 1 );
end
A = Aset{1};
for S = 2:Nsets
    A = A + Aset{S};
end
A = reshape(A,Ndof,Ndof);
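For reference, option 2) would look roughly like this (a sketch only; the cell-array names Iset, Jset, Vset are made up for illustration). Note that sparse() sums the values of repeated (i,j) pairs, which is what handles the overlap, and also what makes this slow:

Iset = cell(1,Nsets); Jset = cell(1,Nsets); Vset = cell(1,Nsets);
parfor S = 1:Nsets
    % Do lots of stuff to get iind, jind, Aval
    Iset{S} = iind(:); Jset{S} = jind(:); Vset{S} = Aval(:);
end
% One sparse call; duplicate (i,j) pairs are summed automatically
A = sparse( vertcat(Iset{:}), vertcat(Jset{:}), vertcat(Vset{:}), Ndof, Ndof );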
Any ideas why option 4 crashes? How should I do this to gain speed?
The size of the final matrix, i.e. Ndof, is a few million. The number of matrix pieces, i.e. Nsets, is 10 to 30. For option 3 it takes roughly 30 seconds to sum the matrices when Ndof = 4000000.
Accepted Answer
More Answers (2)
Three years too late! There is a workaround that stops parfor from expanding the sparse matrix into a full one, by using a function handle. See the code below for an example.
m = 1e5;
n = 1e5;
A = sparse(m,n);
fcn = @plus;
parfor k = 1:100
    i = randi(m,10);
    j = randi(n,10);
    s = randn(10);
    A = fcn(A, sparse(i,j,s,m,n));
end
Hope it helps!
3 Comments
Isaac
on 1 Dec 2014
I just wanted to say that this was the fastest way for me to construct a huge sparse matrix in parallel. It took 6 seconds, compared to 37 seconds for creating the i,j,v vectors in parallel and then calling sparse(i,j,v). The main bottleneck seems to be that sparse() calls are always serial. It seems like MATLAB should make sparse() utilize multiple processors when they are available, as this clearly works with the method above (using @plus and a reduction variable).
SE
on 3 Aug 2018
Just logged in to tell you that you're a lifesaver! What strange behaviour... I suppose it is the symbolic addition that wants its inputs to be full rather than sparse? Thanks again!
Fintan Healy
on 12 Jan 2025
m = 1e5;
n = 1e5;
A = sparse(m,n);
parfor k = 1:100
    i = randi(m,10);
    j = randi(n,10);
    s = randn(10);
    A = A + sparse(i,j,s,m,n);
end
As of R2024b the "fcn" wrapper is no longer required, and this was the fastest option for me.
Mika
on 29 Dec 2011
3 Comments
Sean de Wolski
on 29 Dec 2011
I can't explain why (yet) but it must be related to memory. Changing Ndof to 1e4 makes your above code work.
Friedrich
on 29 Dec 2011
This seems like a bug to me. There isn't any reason why it shouldn't work. There is a limit of 2 GB on 64-bit and 600 MB on 32-bit on the amount of data that can be transferred from MATLAB to the workers and back, but you are far away from that limit. Since it works with small values, it should work with bigger ones too.
Mika
on 29 Dec 2011