Matrix Math Question
Started by ●July 28, 2009

I just discovered that not all matrices with all-positive eigenvalues are positive definite (dumb, yes, and I probably used to know it, yes, but nonetheless...).

I'm doing some testing, and I need to reliably generate otherwise-random matrices that are (a) positive definite and (b) have predetermined eigenvalues.

Anyone got any suggestions? I know how to do the part where I generate a vector or matrix of random numbers, I'm just not sure where to go from there...

Thanks.

--
www.wescottdesign.com

Reply by ●July 28, 2009
On Tue, 28 Jul 2009 12:41:20 -0500, Tim Wescott wrote:
> I just discovered that not all matrices with all-positive eigenvalues
> are positive definite (dumb, yes, and I probably used to know it, yes,
> but nonetheless...).
>
> I'm doing some testing, and I need to reliably generate otherwise-random
> matrices that are (a) positive definite and (b) have predetermined
> eigenvalues.
>
> Anyone got any suggestions? I know how to do the part where I generate
> a vector or matrix of random numbers, I'm just not sure where to go from
> there...
>
> Thanks.

(Telling me how to make an otherwise-random orthonormal matrix will suffice, I think.)

--
www.wescottdesign.com
Reply by ●July 28, 2009
On 28 Jul, 19:41, Tim Wescott <t...@seemywebsite.com> wrote:
> I just discovered that not all matrices with all-positive eigenvalues are
> positive definite (dumb, yes, and I probably used to know it, yes, but
> nonetheless...).
>
> I'm doing some testing, and I need to reliably generate otherwise-random
> matrices that are (a) positive definite and (b) have predetermined
> eigenvalues.
>
> Anyone got any suggestions? I know how to do the part where I generate a
> vector or matrix of random numbers, I'm just not sure where to go from
> there...

The useful property of a positive definite matrix X is the eigenvalue
decomposition:

X = V'DV

where V is an orthogonal matrix,

V'V = I

and D is diagonal, containing the eigenvalues.

You already know the eigenvalues, so D is known. The question, then, is
how to come up with a useful V of dimension N x N.

What I would do:

1) Generate a random vector v of size N x 1.
2) Normalize the vector v to length 1.
3) Use a Gram-Schmidt expansion around v to find a complete N x N
   orthogonal basis V.
4) Once you have that basis V, you are done.

In Matlab the function qr performs step 3 when fed a vector.

Rune
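[For later readers: the four steps above translate almost line-for-line into code. Here is a minimal sketch in Python/NumPy rather than Matlab — the function name `random_spd` is my own, and NumPy's `qr` with `mode='complete'` plays the role Matlab's `qr` plays in step 3, returning the full N x N orthogonal factor.]

```python
import numpy as np

def random_spd(eigs, rng=None):
    """Random symmetric positive-definite matrix with prescribed eigenvalues."""
    rng = rng if rng is not None else np.random.default_rng()
    n = len(eigs)
    v = rng.standard_normal((n, 1))          # step 1: random N x 1 vector
    Q, _ = np.linalg.qr(v, mode='complete')  # steps 2-3: full N x N orthonormal basis
    return Q.T @ np.diag(eigs) @ Q           # step 4: X = V'DV

eigs = [1.0, 2.0, 5.0]
X = random_spd(eigs)
print(np.allclose(np.linalg.eigvalsh(X), eigs))  # True (eigvalsh sorts ascending)
```

Since X is a similarity transform of diag(eigs) by an orthogonal matrix, its eigenvalues are exactly the prescribed ones, and positivity of the eigenvalues makes it positive definite.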
Reply by ●July 28, 2009
On Tue, 28 Jul 2009 11:37:10 -0700, Rune Allnor wrote:
> [...]
>
> What I would do:
>
> 1) Generate a random vector v of size N x 1.
> 2) Normalize the vector v to length 1.
> 3) Use a Gram-Schmidt expansion around v to find a complete N x N
>    orthogonal basis V.
> 4) Once you have that basis V, you are done.
>
> In Matlab the function qr performs step 3 when fed a vector.

Thanks Rune -- that's exactly what I needed. Scilab also uses 'qr', so I
even knew that part after reading your post!

--
www.wescottdesign.com
Reply by ●July 29, 2009
Rune Allnor <allnor@tele.ntnu.no> writes:
> [...]
>
> What I would do:
>
> 1) Generate a random vector v of size N x 1.
> 2) Normalize the vector v to length 1.
> 3) Use a Gram-Schmidt expansion around v to find a complete N x N
>    orthogonal basis V.
> 4) Once you have that basis V, you are done.
>
> In Matlab the function qr performs step 3 when fed a vector.

Rune,

My LA book [meyer] tells me that the QR factorization of an NxM matrix A
produces a matrix Q of dimension NxM and an upper-triangular matrix R of
dimension MxM, so that A = QR.

So by that definition, the QR factorization of the Nx1 vector A would be
an Nx1 vector Q and a 1x1 scalar R.

This also agrees with the notion of Gram-Schmidt orthogonalization for
the case of a set of one Nx1 vector {A}. The space spanned by that
vector is just range(A), and you'd expect the dimension of such a space
to be one. Since the columns of Q form a basis for range(A), you'd thus
expect Q to have just one column.

But, as you've intimated, that isn't what Matlab (or Octave) returns. It
returns an NxN matrix Q and an Nx1 vector R.

What exactly is going on, then?

--Randy

@BOOK{meyer,
  title     = "{Matrix Analysis and Applied Linear Algebra}",
  author    = "{Carl~D.~Meyer}",
  publisher = "Society for Industrial and Applied Mathematics",
  year      = "2000"}

--
Randy Yates                       % "Maybe one day I'll feel her cold embrace,
Digital Signal Labs               %  and kiss her interface,
mailto://yates@ieee.org           %  til then, I'll leave her alone."
http://www.digitalsignallabs.com  % 'Yours Truly, 2095', *Time*, ELO
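[For later readers: NumPy exposes both conventions explicitly, which makes the distinction Randy is asking about easy to see. A sketch — Matlab's plain `qr(A)` corresponds to the `mode='complete'` case below, while the textbook definition from [meyer] is the `mode='reduced'` case.]

```python
import numpy as np

a = np.arange(1.0, 5.0).reshape(4, 1)    # an N x 1 "matrix", N = 4

# Textbook ("thin"/reduced) QR: Q is N x 1, R is 1 x 1
Qt, Rt = np.linalg.qr(a, mode='reduced')

# Full QR, as Matlab/Octave return by default: Q is N x N, R is N x 1
Qf, Rf = np.linalg.qr(a, mode='complete')

print(Qt.shape, Rt.shape)  # (4, 1) (1, 1)
print(Qf.shape, Rf.shape)  # (4, 4) (4, 1)
```

Both factorizations satisfy A = QR; the full version simply pads Q with an orthonormal basis for the orthogonal complement of range(A), and pads R with zero rows to match.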
Reply by ●July 29, 2009
On 29 Jul, 12:18, Randy Yates <ya...@ieee.org> wrote:
> [...]
>
> My LA book [meyer] tells me that the QR factorization of an NxM matrix
> A produces a matrix Q of dimension NxM and an upper-triangular matrix R
> of dimension MxM, so that A = QR.
>
> So by that definition, the QR factorization of the Nx1 vector A would
> be an Nx1 vector Q and a 1x1 scalar R.
>
> This also agrees with the notion of Gram-Schmidt orthogonalization for
> the case of a set of one Nx1 vector {A}. The space spanned by that
> vector is just range(A), and you'd expect the dimension of such a space
> to be one. Since the columns of Q form a basis for range(A), you'd thus
> expect Q to have just one column.
>
> But, as you've intimated, that isn't what Matlab (or Octave) returns.
> It returns an NxN matrix Q and an Nx1 vector R. What exactly is going
> on, then?

For an N x M matrix, the 'formally correct' QRD of the N x M matrix X
returns an N x M Q such that

Q'Q = I_(M x M)

and R is M x M.

The Q is an orthonormal basis for the column space of X, and R is a
'mixing matrix' that contains a recipe for how the columns of X are
represented by the columns of Q.

But there is no need to stop with only M columns of Q. The missing N-M
columns span the orthogonal complement of the column space of X and
might be of some interest in their own right. Or, as in this case, one
might want an orthonormal NxN basis that is aligned with one or more
given vectors. In that case, one continues the Gram-Schmidt expansion of
the basis by adding random terms every time one 'runs out' of basis
vectors.

I suppose it's up to each and every reader to make up their minds on
whether this addition to the QR decomposition still is formally or
philosophically a QR decomposition; the reality is that the QRD is based
on the Gram-Schmidt expansion, which is a very useful tool in its own
right.

Rune
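[For later readers: the "keep going past M columns" idea can be sketched directly as modified Gram-Schmidt with random fill-in vectors. This is Python/NumPy rather than Matlab, and `complete_basis` plus the rank tolerance are my own choices — Matlab's actual `qr` uses Householder reflections internally, not this loop, but the resulting basis has the same properties.]

```python
import numpy as np

def complete_basis(X, rng=None):
    """Extend the M columns of N x M X to a full N x N orthonormal basis,
    appending random vectors once the given columns run out."""
    rng = rng if rng is not None else np.random.default_rng()
    n, _ = X.shape
    Q = np.zeros((n, 0))
    cols = list(X.T)                       # the given columns, in order
    while Q.shape[1] < n:
        v = cols.pop(0) if cols else rng.standard_normal(n)  # random fill-in
        v = v - Q @ (Q.T @ v)              # remove components along existing basis
        norm = np.linalg.norm(v)
        if norm > 1e-12:                   # skip linearly dependent vectors
            Q = np.column_stack([Q, v / norm])
    return Q

Q = complete_basis(np.array([[1.0], [0.0], [0.0]]))
print(np.allclose(Q.T @ Q, np.eye(3)))  # True
```

The first column of Q is the normalized input vector; the remaining columns are a random orthonormal basis for its orthogonal complement, which is exactly what the recipe upthread needs.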
Reply by ●July 31, 2009
Rune Allnor <allnor@tele.ntnu.no> writes:
> [...]

Hi Rune,

Thank you for your quick response. Sorry for the delay, but I have
another question or two.

> For an N x M matrix, the 'formally correct' QRD of the N x M matrix X
> returns an N x M Q such that
>
> Q'Q = I_(M x M)
>
> and R is M x M.
>
> The Q is an orthonormal basis for the column space of X, and R is a
> 'mixing matrix' that contains a recipe for how the columns of X are
> represented by the columns of Q.

Right.

> But there is no need to stop with only M columns of Q. The missing N-M
> columns span the orthogonal complement of the column space of X and
> might be of some interest in their own right. Or, as in this case, one
> might want an orthonormal NxN basis that is aligned with one or more
> given vectors.
>
> In that case, one continues the Gram-Schmidt expansion of the basis by
> adding random terms every time one 'runs out' of basis vectors.

So, given M basis vectors in B, M < N, Matlab computes a new basis
vector by somehow "randomly" choosing values from the underlying field?

Wouldn't this random selection affect the resulting full NxN "random"
matrix? I.e., by the choice of distribution used in the random
selection?

> I suppose it's up to each and every reader to make up their minds on
> whether this addition to the QR decomposition still is formally or
> philosophically a QR decomposition; the reality is that the QRD is
> based on the Gram-Schmidt expansion, which is a very useful tool in
> its own right.

I won't argue this point here - I see the utility.

--
Randy Yates                       % "And all that I can do
Digital Signal Labs               %  is say I'm sorry,
mailto://yates@ieee.org           %  that's the way it goes..."
http://www.digitalsignallabs.com  % 'Getting To The Point', *Balance of Power*, ELO
Reply by ●July 31, 2009
On 31 Jul, 18:58, Randy Yates <ya...@ieee.org> wrote:
> Rune Allnor <all...@tele.ntnu.no> writes:
> > In that case, one continues the Gram-Schmidt expansion of the basis
> > by adding random terms every time one 'runs out' of basis vectors.
>
> So, given M basis vectors in B, M < N, Matlab computes a new basis
> vector by somehow "randomly" choosing values from the underlying field?

Once you reach the point where the coefficients of the R matrix vanish,
one has a basis Q = span{X}. One way to proceed is to 'append' random
columns to X, and proceed with the Gram-Schmidt expansion, but with the
modification that one no longer stores the corresponding coefficients in
R. That way one ends up with a full-rank Q but a modified R matrix, Rm,
of the form

Rm = [R, 0]'.

> Wouldn't this random selection affect the resulting full NxN "random"
> matrix? I.e., by the choice of distribution used in the random
> selection?

Only to a certain extent. It is easiest to show a hands-on example: if
you start out with the vector X = [1,0,0]' then Q = [1,0,0]' and R = [1].

A basis for the orthogonal complement of range{Q} is then

P = [0, 0;
     1, 0;
     0, 1];

However, the column vectors of P can appear in any order, and one can
also flip the directions of the basis vectors:

P1 = [0, 0;
      0,-1;
     -1, 0];

The randomness affects such details, but not the big picture.

Rune
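[For later readers: the hands-on example is easy to check numerically. A NumPy sketch — note the sign of the first column of Q may flip depending on the underlying Householder implementation, hence the abs().]

```python
import numpy as np

x = np.array([[1.0], [0.0], [0.0]])
Q, R = np.linalg.qr(x, mode='complete')

# Column 0 spans range(x); columns 1..2 span its orthogonal complement,
# i.e. they play the role of P above, up to column order and sign flips.
print(np.allclose(np.abs(Q[:, 0]), [1.0, 0.0, 0.0]))  # True
print(np.allclose(Q[:, 1:].T @ x, 0.0))               # True
```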
Reply by ●July 31, 2009
Rune Allnor <allnor@tele.ntnu.no> writes:
> [...]
>
> Only to a certain extent. It is easiest to show a hands-on example: if
> you start out with the vector X = [1,0,0]' then Q = [1,0,0]' and
> R = [1].
>
> A basis for the orthogonal complement of range{Q} is then
>
> P = [0, 0;
>      1, 0;
>      0, 1];
>
> However, the column vectors of P can appear in any order, and one can
> also flip the directions of the basis vectors:
>
> P1 = [0, 0;
>       0,-1;
>      -1, 0];
>
> The randomness affects such details, but not the big picture.

I think I understand. In effect, the M vectors *define* the N-M vectors,
direction and order notwithstanding, given that they have to be
orthogonal and normal to the range space.

Thinking through this a little more also answers another question:
wouldn't the choice of M have something to do with the "randomness" of
the resulting space? The answer is no, for the same reason as above. The
order might be different, but I'm not considering order. Essentially the
first vector defines the whole matrix, and that's why the procedure you
described, in which you only have to define one random vector, suffices
to generate a perfectly random NxN matrix.

--
Randy Yates                       % "Though you ride on the wheels of tomorrow,
Digital Signal Labs               %  you still wander the fields of your
mailto://yates@ieee.org           %  sorrow."
http://www.digitalsignallabs.com  % '21st Century Man', *Time*, ELO
Reply by ●July 31, 2009
On 31 Jul, 21:15, Randy Yates <ya...@ieee.org> wrote:
> [...]
>
> I think I understand. In effect, the M vectors *define* the N-M
> vectors, direction and order notwithstanding, given that they have to
> be orthogonal and normal to the range space.

Correct.

> Thinking through this a little more also answers another question:
> wouldn't the choice of M have something to do with the "randomness" of
> the resulting space? The answer is no, for the same reason as above.
> The order might be different, but I'm not considering order.
> Essentially the first vector defines the whole matrix, and that's why
> the procedure you described, in which you only have to define one
> random vector, suffices to generate a perfectly random NxN matrix.

Correct.

Rune