Find the common eigenvectors and eigenvalues between 2 matrices

Hello,
I am looking to find, or rather build, a matrix X of common eigenvectors of two matrices A and B such that:
AX=aX with "a" the diagonal matrix corresponding to the eigenvalues
BX=bX with "b" the diagonal matrix corresponding to the eigenvalues
I took a look at a similar post https://stackoverflow.com/questions/56584609/finding-the-common-eigenvectors-of-two-matrices but did not manage to reach a conclusion, i.e. to obtain valid results when I build the final endomorphism F defined by F = P D P^-1.
I have also read the relevant Wikipedia article and this interesting paper https://core.ac.uk/download/pdf/82814554.pdf but could not extract any method that is easy to implement.
In particular, I am interested in the
eig(A,B)
Matlab function.
I tried to use it like this:
% Search for common eigenvectors between FISH_sp and FISH_xc
[V,D] = eig(FISH_sp,FISH_xc);
% Diagonalize the matrix (B^-1 A) to compute Lambda since we have AX=Lambda B X
[eigenv, eigen_final] = eig(inv(FISH_xc)*FISH_sp);
% Compute the final endomorphism : F = P D P^-1
FISH_final = V*eye(7).*eigen_final*inv(V)
But the matrix `FISH_final` doesn't give good results: when I do further computations from FISH_final (it is actually a Fisher matrix), the results are not valid.
So surely I must have made an error in the code snippet above.
If someone could help me build these common eigenvectors and also find the associated eigenvalues, that would be great; I am a little lost among all the potential methods that exist to carry this out.

2 comments

Hi petit,
It appears that A and B are known matrices and you are looking for one or more X that are eigenvectors of both A and B. Be aware that
AX=aX BX=bX
is not at all the same as the eig(A,B) problem (shown here for a single eigenvalue lambda)
A*X = B*X*lambda
In the second case, for nonsingular B the problem reduces to the standard eigenvalue problem
(inv(B)*A)*X = X*lambda
with one set of eigenvalues, not two.
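As a quick numerical check of that reduction (sketched here in Python/NumPy rather than Matlab, with hypothetical random test matrices): the eigenpairs of inv(B)*A do satisfy the generalized relation A*X = B*X*Lambda, with a single set of eigenvalues.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
A = rng.standard_normal((n, n))
B = rng.standard_normal((n, n))  # generically nonsingular

# For nonsingular B, the generalized problem A*X = B*X*Lambda reduces to
# the standard eigenvalue problem (inv(B)*A)*X = X*Lambda
lam, X = np.linalg.eig(np.linalg.solve(B, A))

# One set of eigenvalues: each pair satisfies the generalized relation
print(np.allclose(A @ X, B @ X @ np.diag(lam)))  # True
```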
petit
petit on 4 Jan 2021
Edited: petit on 4 Jan 2021
Hi David,
So how could I find common eigenvectors? I didn't find much documentation on the web about this kind of problem.
I have difficulties building the common eigenvectors. Below is the code, with FISH_sp and FISH_xc the two starting matrices:
% Search for common eigenvectors between FISH_sp and FISH_xc
[V1,D1] = eig(FISH_sp);
[V2,D2] = eig(FISH_xc);
% Check epsilon (residual of each eigenpair)
for i=1:7
tol = sum(abs(FISH_sp*V1(:,i)-D1(i,i)*V1(:,i)))
tol = sum(abs(FISH_xc*V2(:,i)-D2(i,i)*V2(:,i)))
end
% Sorting
[d1,I1]=sort(diag(D1))
[d2,I2]=sort(diag(D2))
% Identify repeated elements
[~,ia,~]=unique(d1,'stable')
[~,ib,~]=unique(d2,'stable')
% Reorder the eigenvectors to match the sorted eigenvalues
W1 = V1(:,I1)
W2 = V2(:,I2)
% Loop for rref over each block of repeated eigenvalues
ia(end+1) = numel(d1)+1; % sentinel so that ia(i+1)-1 is valid for the last block
ib(end+1) = numel(d2)+1;
for i=1:numel(ia)-1
for j=1:numel(ib)-1
col1 = W1(:,ia(i):ia(i+1)-1);
col2 = W2(:,ib(j):ib(j+1)-1);
[R,p] = rref([col1 col2]); % row-reduce the concatenation to test linear dependency
end
end
%%%%% I don't know what to do after the loop to build the common eigenvectors %%%%%
As you can see, I don't know what the next step should be to build the matrix of common eigenvectors.
If someone could help me, that would be great.
Regards


Accepted Answer

David Goodmanson
David Goodmanson on 5 Jan 2021
Edited: David Goodmanson on 5 Jan 2021
Hi petit,
Hi petit,
Eigenvectors calculated by Matlab are normalized, but neither (a) the overall phase of each one nor (b) the order of the eigenvalues and the corresponding columns of eigenvectors is guaranteed to be anything in particular. But if AX = aX and BX = bX, then for
[vA lambdaA] = eig(A)
[vB lambdaB] = eig(B)
there will be a case where a column of vA and a column of vB differ only by a phase factor. So in the matrix product vA'*vB there will be an entry of absolute value 1, the phase factor.
n = 6;
% set up matrices A and B with two eigenvectors in common
v = 2*rand(n,n)-1 +i*(2*rand(n,n)-1); % eigenvectors
w = 2*rand(n,n)-1 +i*(2*rand(n,n)-1);
lamv = rand(n,1); % eigenvalues
lamw = rand(n,1);
% two common eigenvectors
w(:,2) = v(:,3);
w(:,4) = v(:,5);
A = (v*diag(lamv))/v;
B = (w*diag(lamw))/w;
% given A and B, find the common eigenvectors
[vA lamA] = eig(A);
[vB lamB] = eig(B);
vAvB = vA'*vB;
[j k] = find(abs(abs(vAvB)-1)<1e-12)
% show that the jth A eigenvector and kth B eigenvector are proportional
vA(:,j(1))./vB(:,k(1))
vA(:,j(2))./vB(:,k(2))
This method works as long as, for the eigenvectors in question, the eigenvalues are distinct. If there are repeated eigenvalues, then if A has eigenvectors x and y, B might have eigenvectors that are linear combinations of x and y. Then the job gets a lot harder.
You can also use the fact that
(A-B)X = (a-b)X
to look for cases where an eigenvalue of (A-B) equals (a-b), where a is one of the eigenvalues of A and b is one of the eigenvalues of B. However, this method is likely to be more prone to false positives than the first method.
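As a hedged sketch of this second method (in Python/NumPy for illustration; the construction of A and B mirrors the Matlab snippet above, and the 1e-8 tolerance is an arbitrary choice): build A and B with one eigenvector in common, then flag index pairs (i, j) for which lamA(i) - lamB(j) matches an eigenvalue of A - B.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5
# Two matrices with one eigenvector in common (same construction as above)
v = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
w = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
w[:, 2] = v[:, 3]                              # the shared eigenvector
lamv, lamw = rng.random(n), rng.random(n)
A = v @ np.diag(lamv) @ np.linalg.inv(v)
B = w @ np.diag(lamw) @ np.linalg.inv(w)

# If AX = aX and BX = bX for the same X, then (A-B)X = (a-b)X, so some
# eigenvalue of A-B must equal a difference lamA[i] - lamB[j].
lamAB = np.linalg.eigvals(A - B)
lamA, lamB = np.linalg.eigvals(A), np.linalg.eigvals(B)
diffs = lamA[:, None] - lamB[None, :]          # all n^2 differences
hits = np.abs(diffs[None, :, :] - lamAB[:, None, None]).min(axis=0) < 1e-8
print(np.argwhere(hits))  # candidate (i, j) pairs; may include false positives
```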

8 comments

petit's incorrectly posted "Answer" moved here:
Hello David,
I tried to apply your solution (the 2 matrices are attached):
N = 7;
Nsq2 = 2*N*N;
% Load spectro and WL+GCph+XC
FISH_GCsp = load('Fisher_GCsp_flat.txt');
FISH_XC = load('Fisher_XC_GCph_WL_flat.txt');
% Marginalizing over uncommon parameters between the two matrices
COV_GCsp_first = inv(FISH_GCsp);
COV_XC_first = inv(FISH_XC);
COV_GCsp = COV_GCsp_first(1:N,1:N);
COV_XC = COV_XC_first(1:N,1:N);
% Invert to get Fisher matrix
FISH_sp = inv(COV_GCsp);
FISH_xc = inv(COV_XC);
% Diagonalize
[V1,D1] = eig(FISH_sp);
[V2,D2] = eig(FISH_xc);
V1V2 = V1'*V2
[j k] = find(abs(abs(V1V2)-1)<1e-12)
% show that the jth A eigenvector and kth B eigenvector are proportional
V1(:,j(1))./V2(:,k(1))
V1(:,j(2))./V2(:,k(2))
But I get the following error :
V1V2 =
-0.6207 0.7283 0.0916 -0.1346 0.0084 -0.0050 0.2404
-0.1224 0.1870 -0.1415 -0.2043 -0.0057 0.0189 -0.9423
0.4838 0.4167 -0.7273 -0.1759 -0.0659 -0.0013 0.1676
-0.6016 -0.4263 -0.6582 0.1319 -0.0371 0.0103 0.0643
0.0607 0.2802 -0.0699 0.9359 0.1219 0.0341 -0.1448
0.0070 -0.0280 -0.0653 -0.1238 0.9877 0.0585 0.0253
0.0035 -0.0029 0.0152 -0.0231 -0.0617 0.9975 0.0221
j =
0x1 empty double column vector
k =
0x1 empty double column vector
Index exceeds the number of array elements (0).
Error in compute_solving_matlab_forum_dev (line 38)
V1(:,j(1))./V2(:,k(1))
It seems Matlab can't find any common basis of eigenvectors between the matrices FISH_GCsp and FISH_XC.
Could you please see at first sight what's wrong?
Regards
petit's "Answer" moved here:
By the way, I get indices j and k different from empty only if I take the tolerance in "find" to be greater than or equal to 1e-2.
>> [j k] = find(abs(abs(V1V2)-1)<1e-1)
j =
5
6
7
2
k =
4
5
6
7
>> [j k] = find(abs(abs(V1V2)-1)<1e-2)
j =
7
k =
6
>> [j k] = find(abs(abs(V1V2)-1)<1e-3)
j =
0x1 empty double column vector
k =
0x1 empty double column vector
Hi petit,
Given the process you do to obtain FISH_sp and FISH_xc, how do you know that there is an eigenvector in common?
If the data is experimental, you might have to increase your error tolerance by quite a bit as you are doing. Running your code with a tolerance of 1e-2, I get
j = 1
k = 1
which is different from your j = 7, k = 6 for reasons I don't understand. Then
V1V2
ans =
0.9975 -0.0221 0.0617 -0.0231 0.0152 -0.0029 0.0035
0.0585 -0.0253 -0.9877 -0.1238 -0.0653 -0.0280 0.0070
0.0341 0.1448 -0.1219 0.9359 -0.0699 0.2802 0.0607
-0.0103 0.0643 -0.0371 -0.1319 0.6582 0.4263 0.6016
-0.0013 -0.1676 0.0659 -0.1759 -0.7273 0.4167 0.4838
-0.0189 -0.9423 -0.0057 0.2043 0.1415 -0.1870 0.1224
-0.0050 -0.2404 -0.0084 -0.1346 0.0916 0.7283 -0.6207
V1V2(1,1) is the closest to equaling 1.
With a tolerance of 1e-2, a ratio demo of the type V1(:,j(1))./V2(:,k(1)) is not very useful, but
V1(:,j(1)) - V2(:,k(1))
ans =
-0.0186
-0.0044
-0.0533
-0.0165
-0.0132
0.0324
0.0196
showing that the difference of the two eigenvectors is consistent with the tolerance. I believe the method is correct but you will have to decide what works or not.
petit
petit on 5 Jan 2021
Edited: petit on 5 Jan 2021
Hi David !
I don't get the same values for V1V2 and I don't understand why:
V1V2 =
-0.6207 0.7283 0.0916 -0.1346 0.0084 -0.0050 0.2404
-0.1224 0.1870 -0.1415 -0.2043 -0.0057 0.0189 -0.9423
0.4838 0.4167 -0.7273 -0.1759 -0.0659 -0.0013 0.1676
-0.6016 -0.4263 -0.6582 0.1319 -0.0371 0.0103 0.0643
0.0607 0.2802 -0.0699 0.9359 0.1219 0.0341 -0.1448
0.0070 -0.0280 -0.0653 -0.1238 0.9877 0.0585 0.0253
0.0035 -0.0029 0.0152 -0.0231 -0.0617 0.9975 0.0221
j =
7
k =
6
ans =
2.4058
3.0961
1.2158
0.9829
0.0028
-15.0657
2.4378
Index exceeds the number of array elements (1).
Error in test_fish (line 23)
V1(:,j(2))./V2(:,k(2))
So that you can try to reproduce, below is the script which produces these results:
clear
N = 7;
Nsq2 = 2*N*N;
% Load spectro and WL+GCph+XC
FISH_GCsp = load('Fisher_GCsp_flat.txt');
FISH_XC = load('Fisher_XC_GCph_WL_flat.txt');
% Marginalizing over uncommon parameters between the two matrices
COV_GCsp_first = inv(FISH_GCsp);
COV_XC_first = inv(FISH_XC);
COV_GCsp = COV_GCsp_first(1:N,1:N);
COV_XC = COV_XC_first(1:N,1:N);
% Invert to get Fisher matrix
FISH_sp = inv(COV_GCsp);
FISH_xc = inv(COV_XC);
% Diagonalize
[V1,D1] = eig(FISH_sp);
[V2,D2] = eig(FISH_xc);
V1V2 = V1'*V2
[j k] = find(abs(abs(V1V2)-1)<1e-2)
% show that the jth A eigenvector and kth B eigenvector are proportional
V1(:,j(1))./V2(:,k(1))
V1(:,j(2))./V2(:,k(2))
Maybe you will see differences with your own code.
Concerning the common basis of eigenvectors, I am looking for a combination (vector or matrix) of V1 and V2 to build this new basis "P" in which, with eigenvalues other than the known D1 and D2 (noted D1' and D2'), we could have:
F = P (D1'+D2') P^-1
To compute the new Fisher matrix F, I need to know "P", "D1'" and "D2'".
but for the moment, the results are pretty bad when I compute the constraints on each parameter.
Hoping you will understand and can give me a suggestion/track/clue,
Regards
Hi petit,
all I can say is, I reimported your two files, recopied your code, ran it and still got j=1 k=1, who knows why. However, the more important question is how well the code works or not at achieving the goal. (As I mentioned, the method only addresses cases where all the eigenvalues of A are unique and all the eigenvalues of B are unique. A and B can have eigenvalues in common, though.)
You will have to decide at what tolerance level an eigenvector of A is judged to be equal to an eigenvector of B.
At this point I don't think that the ratio expression for demo purposes,
vA(:,j(1))./vB(:,k(1))
is very good, even though I suggested it myself. The difference,
vA(:,j(1)) - vB(:,k(1))
makes more sense, on one condition: if the A eigenvector and the B eigenvector differ by some constant factor, remove that factor first.
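A small sketch of that factor removal (in Python/NumPy for illustration, with a made-up phase of 0.7 rad): since both eigenvectors are normalized, the dot product eA'*eB recovers the constant factor, and multiplying it back in makes the difference meaningful.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 4
# Two unit eigenvectors that differ only by a phase factor exp(i*theta)
eB = rng.standard_normal(n) + 1j * rng.standard_normal(n)
eB /= np.linalg.norm(eB)
eA = np.exp(1j * 0.7) * eB

# The dot product eA'*eB has modulus 1 and recovers the phase (here exp(-0.7i))
c = np.vdot(eA, eB)
# Removing that factor first makes the difference meaningful
print(np.linalg.norm(eA * c - eB))  # ~0, whereas norm(eA - eB) is not small
```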
Hi David !
why do you compute :
V1V2 = V1'*V2
instead of simply :
V1V2 = V1*V2
I mean, without the transpose operator for V1 ?
For instance, someone suggested that I study the eigenspaces by doing:
clear
N = 7;
Nsq2 = 2*N*N;
% Load spectro and WL+GCph+XC
FISH_GCsp = load('Fisher_GCsp_flat.txt');
FISH_XC = load('Fisher_XC_GCph_WL_flat.txt');
% Marginalizing over uncommon parameters between the two matrices
COV_GCsp_first = inv(FISH_GCsp);
COV_XC_first = inv(FISH_XC);
COV_GCsp = COV_GCsp_first(1:N,1:N);
COV_XC = COV_XC_first(1:N,1:N);
% Invert to get Fisher matrix
FISH_sp = inv(COV_GCsp);
FISH_xc = inv(COV_XC);
% Diagonalize
[V1,D1] = eig(FISH_sp);
[V2,D2] = eig(FISH_xc);
V1V2 = V1'*V2
[j k] = find(abs(abs(V1V2)-1)<1e-2)
% show that the jth A eigenvector and kth B eigenvector are proportional
V1(:,j(1))./V2(:,k(1))
As you can see, I have put a tolerance equal to 1e-2. With this tolerance, I get only j=7 and k=6.
With a tolerance of 1e-1, I get :
j =
5
6
7
2
k =
4
5
6
7
If I apply null to the commutator of the matrices FISH_sp and FISH_xc, I get a 7x1 column vector:
>> null(FISH_sp*FISH_xc-FISH_xc*FISH_sp)
ans =
-0.0085
-0.0048
-0.2098
0.9776
-0.0089
-0.0026
0.0109
What should I conclude? How can I get 7 column vectors to build a change-of-basis matrix P? If I set the tolerance to 1.0, there are too many values in j and k.
For
[vA lamA] = eig(A)
[vB lamB] = eig(B)
then the columns of vA are the eigenvectors (evecs). Same for vB. You want to find whether any column of vA is the same as some column of vB. All the evecs are normalized to 1 but unfortunately, an evec can be multiplied by a phase factor exp(i*theta) (which includes the phase factor -1) and still be an evec. There is no guarantee what eig produces, so an evec of A and an evec of B can differ by a phase factor. So you have a couple of choices.
[1] compare each evec eA of A with each evec eB of B, (n^2 cases), determine the phase factor between them, take out the phase factor, look at the difference eA-eB and decide if it's small enough. But unless both eA and eB are real, the phase factor issue is harder than it looks.
[2]
Use the fact that if eA and eB are almost equal, then since they are normalized their dot product eA'*eB will be close to 1. Here eA' turns a column vector into a row vector, and a row vector times the column vector eB is the scalar dot product. So you need the transpose. Multiplying the matrix vA' by the matrix vB automatically finds all n^2 possible dot products of a column of vA with a column of vB, and you can search the resulting matrix for values near 1.
As far as tolerances go, you have to decide for yourself what is appropriate. In this case 1e-2 gives 1 equality and, as you point out, 1e-1 gives 4 equalities. If you use 1e-12 then nothing is equal to anything, and if you use 1e1 then everything equals everything else. Maybe your data is imprecise enough that there should not be any equalities. It's a judgment call.
I'm not sure what to make of null(FISH_sp*FISH_xc-FISH_xc*FISH_sp) having a nonempty null space.
null(FISH_sp*FISH_xc-FISH_xc*FISH_sp)
returns a basis of the subspace on which the two linear operators commute.
As an example, I(n) commutes with 2*I(n) (or with any matrix), so the NULL above returns n vectors,
yet the vectors returned need not be eigenvectors shared by the two operators.
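A sketch of that point (in Python/NumPy for illustration; Matlab's null is emulated with the SVD, and the 1e-12 cutoff is an assumption): with A = I, the commutator with any B vanishes identically, so its null space has full dimension, yet a generic null-basis vector is not an eigenvector of B.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 4
A = np.eye(n)                      # the identity commutes with every matrix
B = rng.standard_normal((n, n))

C = A @ B - B @ A                  # commutator: exactly zero here
# Emulate Matlab's null() via the SVD: rows of Vh for (near-)zero singular values
_, s, Vh = np.linalg.svd(C)
null_basis = Vh[s < 1e-12].T       # one column per null-space direction
print(null_basis.shape[1])         # 4: the null space is all of R^n
# ...yet a null-basis vector x is generally NOT an eigenvector of B:
x = null_basis[:, 0]
# rank 2 of [x, B@x] means B@x is not parallel to x, i.e. x is no eigenvector
print(np.linalg.matrix_rank(np.column_stack([x, B @ x])))
```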


More Answers (2)

Bruno Luong
Bruno Luong on 4 Jan 2021
Edited: Bruno Luong on 4 Jan 2021
K = null(A-B);
[W,D] = eig(K'*A*K);
X = K*W, % common eigenvectors
lambda = diag(D), % common eigenvalues

13 comments

If I follow your suggestion, I get:
>> V = null(FISH_sp-FISH_xc)
V =
7x0 empty double matrix
Is that normal?
Then there is NO common eigen-value/vector.
I think you didn't fully understand what I wanted.
I have a first eigenvector matrix V1 and a second one V2. I want to build a basis of common eigenvectors, for example of the form:
V = a V1 + b V2
where "a" and "b" are 2 matrices to determine.
I am not looking for identical vectors between V1 and V2. I hope you will understand.
Regards
I give you an example of what I understand: I generate two (n x n) matrices having m common eigenvectors/eigenvalues.
n=5;
m=2;                % number of common eigenpairs
p=n-m;
cD=rand(1,m);       % common eigenvalues
cV=rand(n,m);       % common eigenvectors
AD=[cD rand(1,p)];  % eigenvalues of A
AV=[cV rand(n,p)];  % eigenvectors of A
BD=[cD rand(1,p)];  % eigenvalues of B
BV=[cV rand(n,p)];  % eigenvectors of B
A=(AV*diag(AD))/AV
B=(BV*diag(BD))/BV
Then if you apply my method it'll find those two common eigen vectors/values.
K = null(A-B);
[W,D] = eig(K'*A*K);
X = K*W, % common eigenvectors
lambda = diag(D), % common eigenvalues
Meaning these three quantities are equal (in one specific example):
>> A*X
ans =
0.0117 0.0444
0.0264 0.0585
0.0562 0.0463
0.0543 0.0730
0.0041 0.0698
>> B*X
ans =
0.0117 0.0444
0.0264 0.0585
0.0562 0.0463
0.0543 0.0730
0.0041 0.0698
>> X*D
ans =
0.0117 0.0444
0.0264 0.0585
0.0562 0.0463
0.0543 0.0730
0.0041 0.0698
Hello Bruno,
I attached the 2 matrices for which I need to find a common basis of eigenvectors (7x7).
if I do :
N = 7;
% Load spectro and WL+GCph+XC
FISH_GCsp = load('Fisher_GCsp_flat.txt');
FISH_XC = load('Fisher_XC_GCph_WL_flat.txt');
% Marginalizing over uncommon parameters between the two matrices
COV_GCsp_first = inv(FISH_GCsp);
COV_XC_first = inv(FISH_XC);
COV_GCsp = COV_GCsp_first(1:N,1:N);
COV_XC = COV_XC_first(1:N,1:N);
% Invert to get Fisher matrix
FISH_sp = inv(COV_GCsp);
FISH_xc = inv(COV_XC);
K = null(FISH_sp-FISH_xc);
[W,D] = eig(K'*FISH_sp*K);
V = K*W % common eigenvectors
lambda = diag(D) % common eigenvalues
then I get as results :
V =
7x0 empty double matrix
lambda =
[]
This is not what I expected.
I have no comment on what you expect with your matrix data.
I simply replied to what you asked for in your original question: finding X such that
AX = BX = X*D
with D diagonal.
I claim that my method gives a full-rank basis of solutions of the above equation.
petit
petit on 4 Jan 2021
Edited: petit on 4 Jan 2021
This is where the confusion lies:
AX=BX=XD
I want to build X such that AX = D1 X and BX = D2 X,
where D1 = eig(A) and D2 = eig(B).
I hope you will understand.
The diagonal of D given by my solution contains the common eigenvalues of A and B.
Because you want
AX = BX
If X are eigenvectors, then the above equality reads
X*Da = X*Db
for some diagonal matrices Da and Db, therefore Da==Db. I just call it D in my code.
And if
AX = BX
as YOU ask then
(A-B)*X = 0
Therefore
X belongs to NULL(A-B)
I sincerely want to believe you, but at the first line of your code snippet, I get:
>> K = null(FISH_sp-FISH_xc)
K =
7x0 empty double matrix
that is, an empty double matrix.
Bruno Luong
Bruno Luong on 4 Jan 2021
Edited: Bruno Luong on 4 Jan 2021
I repeat myself:
"Then there is NO common eigen-value/vector."
But I don't want common eigenvalues, I just want common eigenvectors.
I am currently looking for a common eigenvector basis by using a global search for a minimum: you can take a look at: Solve matricial equations system
In this link, I try to find 2 matrices "a" and "b" that could form a common eigenvector basis of the form:
V = a V1 + b V2
I don't think this is impossible, and even an approximation would make me glad.
Regards
Quote: "I just want common eigenvectors", meaning
AX = BX
This is equivalent to
(A-B)*X = 0
Therefore
X belongs to the span of NULL(A-B)
If NULL returns an empty result, then there is NO common eigenvector.
This also implies that the eigenvalues are common; that is a consequence of YOUR request, not an extra requirement I added. If you don't understand this, you do not understand the mathematical logic.
Maybe you are redefining the meaning of the word "common"? If so, then indeed I don't understand what you want.
petit
petit on 4 Jan 2021
Edited: petit on 4 Jan 2021
As you will see, I am looking for the expression of a diagonal matrix D' as a function of D1 and D2, which come from eig(FISH_sp) and eig(FISH_xc).
This way, I could get a new Fisher matrix F = P D' P^-1, with P the "common" basis of eigenvectors and D' the diagonal matrix which depends on the eigenvalues from eig(FISH_sp) and eig(FISH_xc).
In particular, if I want to combine these 2 Fisher informations (also called "cross-correlation"), I can't build a "combined" Fisher matrix directly by summing the 2 diagonal matrices, since the linear combination of random variables is different between the 2 Fisher matrices after diagonalization.
That's why I want to find this "common" matrix P, depending on P1 and P2 (respectively the eigenvector matrices of FISH_sp and FISH_xc).
It may be tricky to understand, but there is some logic in my reasoning, even if I have difficulties expressing the diagonal matrix D' as a function of D1 and D2.


petit
petit on 23 Jan 2021
Edited: petit on 23 Jan 2021
Hi Bruno !
Finally, I have just found a unique common eigenvector coming from :
null(A*B-B*A)
and which is equal to :
ans =
-0.0085
-0.0048
-0.2098
0.9776
-0.0089
-0.0026
0.0109
How could I exploit this single common eigenvector to build an approximate common basis between A and B (everything is relative, which is why I would like to set a tolerance factor)?
I thought about a Gram-Schmidt process to build this common basis, but I am not sure this is correct.
Any suggestion/clue/help is welcome.
