DSPRelated.com Forums

Spectral radius of non-negative matrix

Started by Vikram June 16, 2008
Hello

I have the following question regarding the maximum eigenvalue of a
non-negative matrix.

Suppose you have a square nonnegative matrix

G = [0 transpose(v1) ; v2 F]

where v1 and v2 are nonnegative column vectors and F is a nonnegative
matrix with all of its diagonal entries equal to zero.

Since F is a principal submatrix of G, we have
Spectral radius(G) >= Spectral radius(F).

Say Spectral radius(F) < 1.

My question is:
Are there sufficient conditions in terms of the vectors v1 and v2 that
ensure Spectral radius(G) < 1? This is equivalent to asking for
sufficient conditions ensuring that I-G is invertible whenever I-F is
invertible.

Any references to already existing literature?

Thanks
Vikram
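For concreteness, the setup can be checked numerically. Below is a minimal numpy sketch (my own construction, not part of the original post): it builds a random nonnegative F with zero diagonal and Spectral radius(F) < 1, assembles G = [0 transpose(v1); v2 F], and checks the principal-submatrix inequality.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5

# Nonnegative F with zero diagonal, rescaled so that Spectral radius(F) < 1
F = rng.random((n, n))
np.fill_diagonal(F, 0.0)
F *= 0.9 / max(abs(np.linalg.eigvals(F)))

v1 = rng.random(n)   # nonnegative column vectors
v2 = rng.random(n)

# G = [0 transpose(v1) ; v2 F]
G = np.block([[np.zeros((1, 1)), v1[None, :]],
              [v2[:, None],      F]])

rho_F = max(abs(np.linalg.eigvals(F)))
rho_G = max(abs(np.linalg.eigvals(G)))

# F is a principal submatrix of the nonnegative matrix G,
# so Spectral radius(G) >= Spectral radius(F)
assert rho_G >= rho_F
```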
There is an exact answer to your question: the Schur complement lemma.
According to it,

I - G > 0  iff  1 - v1' * (I-F)^{-1} * v2 > 0,

where I is the identity matrix of the appropriate dimension.

Hope this helps.

Emre
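As a numerical sketch (my own, not from the thread): the quantity 1 - v1' (I-F)^{-1} v2 is the Schur complement of the (I-F) block in I-G, and it satisfies the block-determinant identity det(I-G) = det(I-F) * (1 - v1' (I-F)^{-1} v2). For this nonnegative block structure with Spectral radius(F) < 1, the sign of that Schur complement in fact decides whether Spectral radius(G) < 1 (one can argue this via the monotonicity of t - v1'(tI-F)^{-1}v2 for t > Spectral radius(F)); the code checks both claims on random instances.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 6

for trial in range(100):
    # random nonnegative F, zero diagonal, Spectral radius(F) < 1
    F = rng.random((n, n))
    np.fill_diagonal(F, 0.0)
    F *= rng.uniform(0.1, 0.95) / max(abs(np.linalg.eigvals(F)))

    v1 = 2.0 * rng.random(n)   # nonnegative vectors
    v2 = 2.0 * rng.random(n)
    G = np.block([[np.zeros((1, 1)), v1[None, :]],
                  [v2[:, None],      F]])

    # Schur complement of the (I-F) block in I-G
    s = 1.0 - v1 @ np.linalg.solve(np.eye(n) - F, v2)

    # block-determinant identity: det(I-G) = det(I-F) * s
    assert np.isclose(np.linalg.det(np.eye(n + 1) - G),
                      np.linalg.det(np.eye(n) - F) * s)

    # for this nonnegative structure, the sign of s decides rho(G) < 1
    rho_G = max(abs(np.linalg.eigvals(G)))
    assert (rho_G < 1) == (s > 0)
```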
Emre
Thanks for your reply. Let me see if I understood you correctly.

G=[0 v1' ; v2  F]
Given Spectral Radius(F) < 1, the condition ensuring that
Spectral Radius(G) < 1 (or, alternatively, that I-G is invertible) is that

1 - v1' * (I-F)^{-1} * v2 > 0

Before I start looking through the Schur complement literature: we do
not need G to be semidefinite, correct?

Thanks again!

Vikram


Vikram, your understanding seems correct to me with one exception (see below)... and no, G need not be semidefinite.
>Spectral Radius(G)<1, (or alternatively I-G is invertible) is that
Just a warning: although Spectral Radius(G) < 1 implies that (I-G) is
invertible, the two are not equivalent (for example, consider the case
G = -I). I believe the equivalent expression is I-G > 0, which is the
condition to be substituted into the Schur complement lemma [1, pg. 2].

Glad I could help.

Emre

[1] http://www.eecs.berkeley.edu/~wainwrig/ee227a/Scribe/lecture21_final.pdf
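The G = -I counterexample from the warning can be verified directly (a small numpy check of my own): the spectral radius of -I is exactly 1, yet I - G = 2I is clearly invertible, so invertibility of I-G does not imply Spectral Radius(G) < 1.

```python
import numpy as np

# Spectral Radius(G) < 1 implies I-G invertible, but not conversely:
G = -np.eye(3)
rho = max(abs(np.linalg.eigvals(G)))   # spectral radius of -I is exactly 1
detIG = np.linalg.det(np.eye(3) - G)   # det(2I) = 8, so I-G is invertible
assert np.isclose(rho, 1.0) and np.isclose(detIG, 8.0)
```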
Hi Emre

I still would like you to clarify one point.

You are stating I-G > 0 as a condition for Spectral Radius(G) < 1 and
applying the Schur complement lemma given in Wainwright's document.
Why does I-G have to be positive definite?

Or is my understanding incorrect?

Thank you
Vikram
Hi Vikram,

Good point. I realized that I-G > 0 alone is not really the sufficient
condition for Spectral Radius(G) = |G| < 1. Much like the scalar case,
|G| < 1 implies I-G > 0 and I+G > 0 simultaneously. Here is my
attempted proof:

Let lambda_i be the i-th eigenvalue of G, associated with eigenvector
xi. Then, for each i,

|G| < 1  =>  |lambda_i| < 1  =>  -1 < lambda_i < 1
=>
{ xi' * G * xi = lambda_i * xi' * xi < 1 * xi' * xi = xi' * I * xi
      =>  (I-G) > 0
and
  xi' * G * xi = lambda_i * xi' * xi > -1 * xi' * xi = xi' * (-I) * xi
      =>  (I+G) > 0. }

The "if" part of the proof requires slightly more work, namely
expanding an arbitrary vector x in terms of the xi's, which I will
leave to the reader as an exercise. :-)

Therefore, the condition |G| < 1 is equivalent to the simultaneous
holding of I-G > 0 and I+G > 0. And this is equivalent (through the
Schur complement lemma) to

1 - v1' * (I-F)^{-1} * v2 > 0
and
1 - v1' * (I+F)^{-1} * v2 > 0.

Emre
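For a symmetric G, where the eigenvalue argument above applies verbatim (real eigenvalues, orthonormal eigenvectors), the conclusion can be checked numerically. This is my own sketch, not part of the post:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 5

A = rng.random((n, n))
S = (A + A.T) / 2.0
G = 0.8 * S / max(abs(np.linalg.eigvalsh(S)))  # symmetric, rho(G) < 1
I = np.eye(n)

# all eigenvalues of G lie in (-1, 1), so I-G and I+G are both
# positive definite
assert np.all(np.linalg.eigvalsh(I - G) > 0)
assert np.all(np.linalg.eigvalsh(I + G) > 0)
```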
Emre:

Two things that I want to clarify in your proof:

First, aren't you assuming G to be Hermitian, since you are assuming
the existence of an orthonormal basis of eigenvectors and real
eigenvalues? The quadratic forms of G+I and G-I are considered only at
eigenvectors of G.
So you are assuming that these N eigenvectors form a basis for C^N,
right?

However, are I+G and I-G still positive definite if G = [0 v1'; v2 F]
is non-symmetric and hence does not satisfy these assumptions?

Thanks

Vikram






Vikram, you are right in that the above proof applies only when the
eigenvalues are real. The more general case requires a minor
modification, however:

|G| < 1  => |lambda_i| < 1
=> |xi' *G* xi| = | lambda_i * xi' * xi |
               = |lambda_i| * xi' * xi
               < xi' * I * xi
Since both sides are real, this implies 
        xi' *G* xi  < xi' * I * xi 
and
        xi' * I* xi < -xi' *G* xi .

Hence I-G>0  and  I+G>0 still hold for non-hermitian G as well. So the
above results regarding v1 and v2 still apply.

Emre
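As a numerical sketch of the eigenvector inequality used above (my own construction, not from the thread): for a non-symmetric G with spectral radius below 1 and generally complex eigenpairs, the quadratic form at each eigenvector satisfies xi' G xi = lambda_i xi' xi and hence |xi' G xi| < xi' xi. Note this checks the displayed step only at the eigenvectors of G.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 6

A = rng.random((n, n)) - 0.5                   # generally complex eigenvalues
G = 0.9 * A / max(abs(np.linalg.eigvals(A)))   # non-symmetric, rho(G) < 1

lam, X = np.linalg.eig(G)
for i in range(n):
    x = X[:, i]                   # xi' below denotes the conjugate transpose
    quad = x.conj() @ G @ x       # equals lambda_i * xi'xi at an eigenvector
    norm2 = (x.conj() @ x).real   # xi'xi is real and positive
    assert np.isclose(quad, lam[i] * norm2)
    assert abs(quad) < norm2      # |xi' G xi| = |lambda_i| xi'xi < xi' I xi
```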

>Since both sides are real, this implies
>        xi' *G* xi  < xi' * I * xi
>and
>        xi' * I* xi < -xi' *G* xi .

Correction: that should read

        xi' * G * xi < xi' * I * xi
and
        xi' * I * xi > -xi' * G * xi .

Emre
Hi Emre

I think you have given me the right way to solve the problem arising
in my research.

Two final queries:

a). In Wainwright's theorem, the off-diagonal blocks of the X matrix
are transposes of each other (both given by B). This symmetry
requirement is not necessary for the Schur complement lemma, right?
That matters for my case, where the off-diagonal blocks differ.

b). It looks like your proof shows that Spectral Radius(G) < 1 implies
I+G and I-G are positive definite; together with the "if" part, their
positive definiteness is thus a necessary and sufficient condition.

Thank you very much. It was a pleasure interacting with you.

Vikram


