Compute an approximate common eigenvector basis between two matrices as a function of tolerance

SUMMARY:
Given two matrices A and B that don't commute, I am looking to find, or rather build, an approximate common eigenvector basis X between A and B such that A X_i = a_i X_i and B X_i = b_i X_i, where the sets (a_i) and (b_i) are the eigenvalues of A and B respectively. I have tried different things to achieve this (SVD algorithms on the commutator of A and B, solving matrix equations to build the common eigenvectors, pooled-variance methods), but none gives acceptable results. So I wonder if someone knows a method designed to approximate this common eigenvector basis, or could simply make suggestions or give pointers. Any help is welcome for this combination of probes.
ISSUE:
I am looking to find, or rather build, a common eigenvector matrix X between two matrices A and B such that:
A X = X a
with "a" the diagonal matrix of eigenvalues of A, and
B X = X b
with "b" the diagonal matrix of eigenvalues of B,
where A and B are square and diagonalizable matrices.
1) I took a look at a similar post but did not manage to conclude, i.e. to get valid results when building the desired final endomorphism F defined by F = P D P^-1.
2) I have also read the Wikipedia article and an interesting paper, but could not extract from them a method that is reasonably easy to implement.
3) On Mathematics Stack Exchange, someone advised using the Singular Value Decomposition (SVD) on the commutator [A,B], which in MATLAB is done as follows:
"If 𝑣 is a common eigenvector, then ‖(𝐴𝐵−𝐵𝐴)𝑣‖=0. The SVD approach gives you a unit vector 𝑣 that minimizes ‖(𝐴𝐵−𝐵𝐴)𝑣‖ (with the constraint that ‖𝑣‖=1)."
So I extract the approximate eigenvectors V from:
[U,S,V] = svd(A*B-B*A)
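To attach a tolerance to this, one possibility (a minimal sketch, assuming A and B are already loaded; the tolerance value below is just an example to be tuned) is to keep every right singular vector whose singular value falls below the tolerance:
C = A*B - B*A;                 % commutator
[~, S, V] = svd(C);            % singular values sorted in decreasing order
v = V(:, end);                 % unit vector minimizing norm(C*v)
res = norm(C*v);               % equals the smallest singular value S(end,end)
tol = 1e-8;                    % example tolerance, to be tuned to the data
k = sum(diag(S) < tol);        % number of directions below the tolerance
V_common = V(:, end-k+1:end);  % candidate common-eigenvector directions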
4) Is there a way to increase the accuracy, i.e. to minimize ‖(𝐴𝐵−𝐵𝐴)𝑣‖ as much as possible, for as small a tolerance as possible?
Are there alternative methods or routines to perform this minimization of the commutator applied to the vector 𝑣, that is, of ‖(𝐴𝐵−𝐵𝐴)𝑣‖?
I saw there is another function called rref which can accept a tolerance parameter, but (see the sketch after this list):
  1. What is the difference with the singular value decomposition svd?
  2. Which criterion could I apply for a pertinent choice of the tolerance value?
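For what it is worth, here is a small sketch contrasting the two on the same tolerance (my understanding: rref discards pivot candidates below the tolerance while reducing, whereas the SVD counts near-zero singular values, which is numerically more robust):
C = A*B - B*A;
tol = 1e-8;                               % same example tolerance for both methods
[R, pivots] = rref(C, tol);               % entries below tol are treated as zero pivots
nullity_rref = size(C,2) - numel(pivots)  % approximate null-space dimension
nullity_svd = sum(svd(C) < tol)           % rank deficiency according to the SVD
% rref also yields a null-space basis, but Gaussian elimination amplifies
% roundoff; the last columns of V from svd(C) are usually the safer choice.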
The two matrices for which to find the approximate common eigenvector matrix are available here:
Could anyone try to apply an appropriate MATLAB function to find a basis of common eigenvectors, or write a small MATLAB script for this? Even an approximate basis would be enough; everything depends on the tolerance I am ready to accept, but currently I don't know how to introduce this tolerance parameter into the SVD algorithm.
5) UPDATE: among the different methods I have tried, could anyone explain the pooled variance method, if it is easy to implement? As I remember, it consists of taking half of each diagonalized Fisher matrix, summing them, and then going back to the final covariance space, i.e. just applying:
Cov = P Fisher_diagonal_sum P^-1
?
With this method I get interesting results, but from a theoretical point of view it is impossible for me to justify the principle of this pooled variance matrix (that is, taking half of the diagonal Fisher matrix values and going back to covariance): why the half?
6) In pooled variance, how can I quantify the relative amount of information between the two Fisher informations? Maybe I could then justify this contribution of 50% for A and 50% for B.
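To fix notation, here is my reading of the recipe in MATLAB (a sketch only: P stands for a common eigenbasis obtained elsewhere, and the 1/2 is then nothing more than an equal-weight average of the two information matrices; unequal weights w and 1-w would change the 50/50 split):
Da = diag(diag(P \ A * P));    % A expressed in the basis P, kept diagonal
Db = diag(diag(P \ B * P));    % B expressed in the basis P, kept diagonal
D_pooled = 0.5*Da + 0.5*Db;    % 50/50 pooled diagonal information
F = P * D_pooled / P;          % back to the original space: P*D_pooled*inv(P)
Cov = inv(F);                  % pooled covariance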
Any suggestion, track, or clue is welcome.
  3 Comments
petit on 22 Jan 2021
Yes, thanks. I have just asked @Matt J how I could combine the two generalized eigenvectors and eigenvalues in order to get the closest approximation I can.
But as you can see, I have difficulties mixing the first generalized problem (eig(A,B)) and the second one (eig(B,A)).
Could you by any chance see which method to apply to combine these two generalized problems?
Best regards
petit on 25 Jan 2021
Hi @Matt J!
1) Could you please help me reformulate the issue I have with the generalized eigenvectors and eigenvalues?
I understand the principle of these generalized eigenvectors but have difficulties implementing it well in order to find a common eigenbasis as accurately as possible. You can see an attempt at the following link.
Besides this attempt with the generalized eigenvector/eigenvalue problem, I found the name of a method to apply: joint diagonalization algorithms.
Searching around, I found two codes:
  1. https://github.com/gabrieldernbach/approximate_joint_diagonalization
  2. https://github.com/pierreablin/qndiag
The first one is difficult to get running, especially the GPU version, and it is Python-only.
The second one is much easier to run, but there is no GPU version.
2) So, for the moment, as far as MATLAB is concerned, I could only use the second one. Working with Fisher matrices, the constraints I get are relatively bad for a majority of parameters (sigma too high), except for the two parameters from which I compute the Figure of Merit: I get an FoM equal to 1600, which was expected.
But the problem is that the constraints on the other parameters are very bad.
If someone could give me feedback about these joint diagonalization algorithms, I would be grateful, since I would like to tighten the constraints on the other parameters.
Here is my script:
import numpy as np
from qndiag import qndiag
# dimension
m = 7
# number of matrices
n = 2
# Load spectro and WL+GCph+XC Fisher matrices
FISH_GCsp = np.loadtxt('Fisher_GCsp_flat.txt')
FISH_XC = np.loadtxt('Fisher_XC_GCph_WL_flat.txt')
# Marginalize over the parameters not common to the two matrices:
# invert, keep the common block, and invert back
COV_GCsp_first = np.linalg.inv(FISH_GCsp)
COV_XC_first = np.linalg.inv(FISH_XC)
COV_GCsp = COV_GCsp_first[0:m, 0:m]
COV_XC = COV_XC_first[0:m, 0:m]
# Invert to get the marginalized Fisher matrices
FISH_sp = np.linalg.inv(COV_GCsp)
FISH_xc = np.linalg.inv(COV_XC)
# Stack the two Fisher matrices for joint diagonalization
C = np.zeros((n, m, m))
C[0] = FISH_sp
C[1] = FISH_xc
# Perform the joint diagonalization (very large max_iter, very small tol)
B, _ = qndiag(C, None, None, int(1e25), 1e-25)
# Compute the (approximately) diagonal matrices
M0 = B @ C[0] @ B.T
M1 = B @ C[1] @ B.T
# Sum the two (approximately) diagonal information matrices
FISH_final = M0 + M1
# Save the final Fisher matrix
np.savetxt('Fisher_final.txt', FISH_final)
Any help implementing the generalized problem is welcome. Ideally, the different methods would give relatively similar constraints.
Best regards


Answers (2)

Matt J on 22 Jan 2021
Edited: Matt J on 22 Jan 2021
rather build an approximate common eigenvector basis X between A and B such that A X_i = a_i X_i and B X_i = b_i X_i
If you were to find such an X_i, and if a_i is non-zero, then X_i and b_i/a_i would be a generalized eigenvector and eigenvalue. So perhaps you should be looking at eig(A,B) and/or eig(B,A)?
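For instance, since A x = a x and B x = b x imply B x = (b/a) A x, common-eigenvector candidates should show up among the generalized eigenvectors of the pencil (B, A); a minimal sketch of how to screen them:
[X, L] = eig(B, A);              % solves B*x = lambda*A*x
lambda = diag(L);                % candidate ratios b_i/a_i
for i = 1:size(X,2)
    x = X(:,i) / norm(X(:,i));
    rA = norm(A*x - (x'*A*x)*x); % residual of x as an eigenvector of A
    rB = norm(B*x - (x'*B*x)*x); % residual of x as an eigenvector of B
    fprintf('candidate %d: residuals %g, %g\n', i, rA, rB);
end
Small residuals for both matrices flag a vector close to a common eigenvector.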
  3 Comments
Bjorn Gustavsson on 22 Jan 2021
  1. Why do you think that your problem should have a solution?
  2. If your matrices have a couple of common eigenvectors, why can't you just look for that subset?
petit on 23 Jan 2021
  1. My problem does not have an analytical solution, but I think we can get an approximate solution, I mean from a numerical point of view.
  2. I don't know how to handle a single common eigenvector, since I need a full basis to be able to apply the MLE (Maximum Likelihood Estimator), i.e. by summing the eigenvalues of A and B on the diagonal (see the sketch below).
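A sketch of what I mean, assuming some common basis P were available (hypothetical, since building P is precisely the problem):
Da = diag(diag(P \ A * P));   % eigenvalues of A in the (hypothetical) basis P
Db = diag(diag(P \ B * P));   % eigenvalues of B in the same basis
F = P * (Da + Db) / P;        % summed information mapped back to parameter space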



petit on 23 Jan 2021
Edited: petit on 23 Jan 2021
By "approximate", I mean that the eigenvectors of the first matrix (the passing matrix) look like the eigenvectors of the second matrix. You will ask: up to which point? That is the problem when I say "looks like", "approximate", or even "similar". Only a tolerance factor will allow me to decide whether or not to consider the two matrices as "similar" or "approximately similar". But for the moment I cannot see how to introduce this tolerance factor into my code and into the building of this pseudo common eigenvector matrix; I say "pseudo" since there is no analytical solution.
What is the smallest tolerance we can reach? I don't know, but I guess at some point there will be a saturation in the similarity between the two bases, so I will have to make a choice or a comparison between both.
By the way, I tried to use the MATLAB function eig(A,B) in my case for the generalized eigenvector and eigenvalue problem:
The original form is the following (trying below to put it in LaTeX):
Generalized Eigenvalue Problem:
The generalized eigenvalue problem (Parlett, 1998; Golub & Van Loan, 2012) of two symmetric matrices A and B is defined as:
$$ A \phi_i = \lambda_i B \phi_i, \quad i = 1, \dots, n $$
and in matrix form, it is:
$$ A \Phi = B \Phi \Lambda $$
where the columns of \Phi are the eigenvectors and the diagonal elements of \Lambda are the eigenvalues. Note that
$$ \Phi^\top A \Phi = \Lambda $$
and
$$ \Phi^\top B \Phi = I $$
So, I tried to do :
[eigenv_final, eigen_final] = eig(inv(B)*A,eye(7))
with eigenv_final the matrix of eigenvectors
and eigen_final the diagonal matrix of eigenvalues.
After that, I try to check the relations:
diagonal_A = (eigenv_final^-1) * A * (eigenv_final)
and
diagonal_B = (eigenv_final^-1) * B * (eigenv_final)
But I don't get diagonal matrices for diagonal_A and diagonal_B.
Could anyone see what is wrong? Is the generalized eigenvalue and eigenvector problem badly set up?
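A check that perhaps should hold instead (a sketch based on my understanding: for a symmetric A and a symmetric positive definite B, eig(A,B) diagonalizes both matrices simultaneously by congruence X'*M*X, not by similarity inv(X)*M*X):
[X, D] = eig(A, B);   % generalized problem in the eig(A,B) form: A*X = B*X*D
congr_A = X' * A * X  % expected approximately diagonal (equal to D up to scaling)
congr_B = X' * B * X  % expected approximately diagonal (identity up to scaling)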
Best regards.
PS1: I have attached the two matrices used in this script:
clear
clc
N = 7;
% Load spectro and WL+GCph+XC Fisher matrices
FISH_GCsp = load('Fisher_GCsp_flat.txt');
FISH_XC = load('Fisher_XC_GCph_WL_flat.txt');
% Marginalize over the parameters not common to the two matrices
COV_GCsp_first = inv(FISH_GCsp);
COV_XC_first = inv(FISH_XC);
COV_GCsp = COV_GCsp_first(1:N,1:N);
COV_XC = COV_XC_first(1:N,1:N);
% Invert to get the marginalized Fisher matrices
FISH_sp = inv(COV_GCsp);
FISH_xc = inv(COV_XC);
% Diagonalize each matrix separately
[V1,D1] = eig(FISH_sp);
[V2,D2] = eig(FISH_xc);
% Generalized eigenvalue problem
[eigenv_final, eigen_final] = eig(inv(FISH_xc)*FISH_sp, eye(N));
% Test: BAD RESULTS: test_diag_final1 and test_diag_final2 are not diagonal
test_diag_final1 = inv(eigenv_final)*FISH_sp*eigenv_final
test_diag_final2 = inv(eigenv_final)*FISH_xc*eigenv_final
PS2: Sorry for the LaTeX rendering; I did not manage to get keywords like \boldsymbol to render properly. Any help fixing this is welcome.
