How to create a 3D array by subtracting 2D matrices (like creating a 2D matrix by subtracting vectors)?

Hi,
I wish to extend the behavior of subtracting a row vector from a column vector to matrices. Let me provide an example: suppose A and B are two n x n matrices. I want to obtain an array C such that C(:,:,i) = A(:,i) - B(i,:), where i runs from 1 to n.
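For reference, the straightforward loop version I am trying to avoid would look something like this (a minimal sketch, assuming A and B are both n x n):

n = size(A,1);                    % A and B are assumed to be n-by-n
C = zeros(n,n,n);                 % preallocate the 3-D result
for i = 1:n
    % column i of A minus row i of B expands to an n-by-n slice
    C(:,:,i) = A(:,i) - B(i,:);   % implicit expansion (R2016b or later)
end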
Is there a way to do this without for-loops?
Thanks,
Mohit.

Accepted Answer

Stephen23 on 19 Jan 2022
C = permute(A,[1,3,2]) - permute(B,[3,2,1])
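Roughly, permute(A,[1,3,2]) turns A into an n-by-1-by-n array whose i-th page is the column A(:,i), and permute(B,[3,2,1]) turns B into a 1-by-n-by-n array whose i-th page is the row B(i,:); implicit expansion of the subtraction then fills out each n-by-n page. A quick sanity check against the loop version (a sketch, assuming square n x n inputs):

C = permute(A,[1,3,2]) - permute(B,[3,2,1]);   % vectorized result
n = size(A,1);
D = zeros(n,n,n);
for i = 1:n
    D(:,:,i) = A(:,i) - B(i,:);                % loop reference
end
isequal(C,D)                                   % returns logical 1 (true)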
  2 Comments
Stephen23 on 19 Jan 2022
format compact
A = randi(9,3,3)
A = 3×3
     5     9     5
     1     5     9
     5     9     6
B = randi(9,3,3)
B = 3×3
     6     8     6
     8     7     3
     5     9     5
C = permute(A,[1,3,2]) - permute(B,[3,2,1])
C =
C(:,:,1) =
    -1    -3    -1
    -5    -7    -5
    -1    -3    -1
C(:,:,2) =
     1     2     6
    -3    -2     2
     1     2     6
C(:,:,3) =
     0    -4     0
     4     0     4
     1    -3     1


