Reply by Cedron October 25, 2016
[...snip...]
>>> A small correction, I think you meant "orthonormal" instead of
>>> "orthogonal".
>>>
>>> My answer was going to be a diagonal matrix. They are really easy to
>>> invert.
>>>
>>> Ced
>>> ---------------------------------------
>>> Posted through http://www.DSPRelated.com
>>>
>>
>>Sure thing, Ced. Could you, please, invert for me the following diagonal
>>matrix?
>>
>>1 0 0
>>0 1 0
>>0 0 0
>>
>>I would really appreciate your effort!
>>
>>Kindly thanks,
>>Gene
Alright Gene, you got me. Though the topic question implies you are dealing with an invertible matrix. I would say that diagonal matrices are the easiest to check for invertibility (no zeros on the diagonal), followed by matrices with identical rows or columns (always singular).
>lulz
>
>"Orthonormal" applies to vector sets, although an orthonormal set can
>be used to make an orthogonal matrix.
>
>Has nobody mentioned the identity matrix yet as an example of "easiest
>to invert"?
It's been more than thirty years since I took Linear Algebra, so I had to look it up. You are absolutely correct; thanks for the correction to my inaccurate correction. It does seem to be poor nomenclature, though: an orthogonal basis is not represented by an orthogonal matrix unless it is also orthonormal.

None of our "witty" answers fit the OP's specification of random coefficients.

Ced
---------------------------------------
Posted through http://www.DSPRelated.com
Reply by Christian Gollwitzer October 25, 2016
Am 25.10.16 um 01:45 schrieb Gordon Sande:
> On 2016-10-24 22:59:50 +0000, Christian Gollwitzer said:
>
>> Am 25.10.16 um 00:18 schrieb Marcel Mueller:
>>> On 24.10.16 11.00, pedro1492@lycos.com wrote:
>>>> Simultaneous equations with random coefficients go through Gaussian
>>>> elimination - awfully slow if you have 100+ rows.
>>>
>>> Sure? Numerical Recipes recommends LU decomposition.
>>
>> LU decomposition is the same as Gaussian elimination. It is of the
>> order O(N^3) flops for square matrices. The question is what exactly
>> "slow" means. A 100x100 system is solved in a millisecond using LU
>> decomposition on a desktop computer. "Slow" always depends on your
>> expectations, whereas for a 1000x1000 system you'll need a second. For
>> a dense system, you cannot do much better than that - however if you
>> have a sparse system, then from a certain point on an iterative method
>> like conjugate gradients will be faster.
>>
>> Christian
>
> The O(N^3) notation hides the important detail of the multiplying constant.
> LU has a lower coefficient than what is usually called Gaussian
> elimination.
Are you sure? The leading coefficient is 2/3 n^3 according to Wikipedia. What is the difference between Gaussian elimination and LU decomposition? I thought LU is just Gaussian elimination where you store the factors (which then form the L matrix) in case you want to solve for another right-hand side.

The OP also wrote about solving a system of equations, so I hope he is not computing the explicit inverse of the matrix and then multiplying it by the right-hand side.

The choice of library is also a big factor. There can be a factor of 1000 in speed between a naive C implementation of a linear algebra algorithm and a highly tuned library like OpenBLAS, Intel MKL or the like.

Christian
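[Editor's note: the point about reusing the stored factors can be sketched in a few lines of numpy. This is an illustrative toy, not a substitute for the tuned LAPACK routines discussed above: factor once at O(n^3), then each new right-hand side costs only O(n^2) substitution.]

```python
import numpy as np

def lu_solve_np(A, bs):
    """Factor A once with partially pivoted LU, then solve A x = b
    for each right-hand side b in bs by O(n^2) substitution."""
    n = len(A)
    LU = np.array(A, dtype=float)
    piv = np.arange(n)
    for k in range(n - 1):
        p = k + np.argmax(np.abs(LU[k:, k]))     # partial pivoting
        if p != k:
            LU[[k, p]] = LU[[p, k]]
            piv[[k, p]] = piv[[p, k]]
        LU[k+1:, k] /= LU[k, k]                  # multipliers become L
        LU[k+1:, k+1:] -= np.outer(LU[k+1:, k], LU[k, k+1:])
    xs = []
    for b in bs:
        y = np.asarray(b, dtype=float)[piv]      # apply permutation P
        for i in range(1, n):                    # forward:  L y = P b
            y[i] -= LU[i, :i] @ y[:i]
        for i in range(n - 1, -1, -1):           # backward: U x = y
            y[i] = (y[i] - LU[i, i+1:] @ y[i+1:]) / LU[i, i]
        xs.append(y)
    return xs

rng = np.random.default_rng(0)
A = rng.standard_normal((100, 100))
bs = [rng.standard_normal(100) for _ in range(3)]
for b, x in zip(bs, lu_solve_np(A, bs)):
    assert np.allclose(A @ x, b)
```

In practice `scipy.linalg.lu_factor`/`lu_solve` (LAPACK's `getrf`/`getrs`) implement exactly this split, with far better cache behavior.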
Reply by Gordon Sande October 24, 2016
On 2016-10-24 22:59:50 +0000, Christian Gollwitzer said:

> Am 25.10.16 um 00:18 schrieb Marcel Mueller:
>> On 24.10.16 11.00, pedro1492@lycos.com wrote:
>>> Simultaneous equations with random coefficients go through Gaussian
>>> elimination - awfully slow if you have 100+ rows.
>>
>> Sure? Numerical Recipes recommends LU decomposition.
>
> LU decomposition is the same as Gaussian elimination. It is of the order
> O(N^3) flops for square matrices. The question is what exactly "slow"
> means. A 100x100 system is solved in a millisecond using LU
> decomposition on a desktop computer. "Slow" always depends on your
> expectations, whereas for a 1000x1000 system you'll need a second. For
> a dense system, you cannot do much better than that - however if you
> have a sparse system, then from a certain point on an iterative method
> like conjugate gradients will be faster.
>
> Christian
The O(N^3) notation hides the important detail of the multiplying constant. LU has a lower coefficient than what is usually called Gaussian elimination.
Reply by October 24, 2016
On Tue, 25 Oct 2016 00:59:50 +0200, Christian Gollwitzer
<auriocus@gmx.de> wrote:

>Am 25.10.16 um 00:18 schrieb Marcel Mueller:
>> On 24.10.16 11.00, pedro1492@lycos.com wrote:
>>> Simultaneous equations with random coefficients go through Gaussian
>>> elimination - awfully slow if you have 100+ rows.
>>
>> Sure? Numerical Recipes recommends LU decomposition.
>
>LU decomposition is the same as Gaussian elimination. It is of the order
>O(N^3) flops for square matrices. The question is what exactly "slow"
>means. A 100x100 system is solved in a millisecond using LU
>decomposition on a desktop computer. "Slow" always depends on your
>expectations, whereas for a 1000x1000 system you'll need a second. For a
>dense system, you cannot do much better than that - however if you have
>a sparse system, then from a certain point on an iterative method like
>conjugate gradients will be faster.
>
> Christian
I thought everybody was using SVD these days?
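[Editor's note: solving via the SVD looks like this in numpy. It costs several times more flops than LU, which is why it is usually reserved for ill-conditioned or rank-deficient problems rather than used as the default solver.]

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((100, 100))
b = rng.standard_normal(100)

# A = U diag(s) V^T, so x = V diag(1/s) U^T b.  Rank deficiency shows
# up directly in s, and tiny singular values can be truncated to get a
# stable least-squares answer instead of a blow-up.
U, s, Vt = np.linalg.svd(A)
x = Vt.T @ ((U.T @ b) / s)

assert np.allclose(A @ x, b)
```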
Reply by Christian Gollwitzer October 24, 2016
Am 25.10.16 um 00:18 schrieb Marcel Mueller:
> On 24.10.16 11.00, pedro1492@lycos.com wrote:
>> Simultaneous equations with random coefficients go through Gaussian
>> elimination - awfully slow if you have 100+ rows.
>
> Sure? Numerical Recipes recommends LU decomposition.
LU decomposition is the same as Gaussian elimination. It is of the order O(N^3) flops for square matrices. The question is what exactly "slow" means: a 100x100 system is solved in a millisecond using LU decomposition on a desktop computer, whereas a 1000x1000 system takes about a second. "Slow" always depends on your expectations. For a dense system you cannot do much better than that - however, if you have a sparse system, then from a certain point on an iterative method like conjugate gradients will be faster.

Christian
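[Editor's note: the millisecond/second figures above are easy to check with a rough timing sketch; the actual numbers depend on the machine and on the BLAS library backing numpy.]

```python
import time
import numpy as np

rng = np.random.default_rng(0)
for n in (100, 1000):
    A = rng.standard_normal((n, n))
    b = rng.standard_normal(n)
    t0 = time.perf_counter()
    x = np.linalg.solve(A, b)   # LAPACK LU with partial pivoting
    dt = time.perf_counter() - t0
    assert np.allclose(A @ x, b)
    print(f"n = {n:4d}: {dt * 1e3:8.2f} ms")
```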
Reply by October 24, 2016
On Mon, 24 Oct 2016 23:41:47 +0300, Evgeny Filatov
<e.v.filatov@ieee.org> wrote:

>On 24.10.2016 21:35, Cedron wrote:
>>> Am 24.10.16 um 11:00 schrieb pedro1492@lycos.com:
>>>> Simultaneous equations with random coefficients go through Gaussian
>>>> elimination - awfully slow if you have 100+ rows.
>>>> I know Toeplitz, tridiagonal and pentadiagonal have special solutions
>>>> that are much faster. Any other matrix that has a more efficient
>>>> algorithm to solve?
>>>>
>>> Orthogonal matrix. It is just the transposition or complex conjugate.
>>> Those are also important, think of DFT for instance, or coordinate
>>> system transforms.
>>>
>>> Christian
>>
>> A small correction, I think you meant "orthonormal" instead of
>> "orthogonal".
>>
>> My answer was going to be a diagonal matrix. They are really easy to
>> invert.
>>
>> Ced
>> ---------------------------------------
>> Posted through http://www.DSPRelated.com
>>
>
>Sure thing, Ced. Could you, please, invert for me the following diagonal
>matrix?
>
>1 0 0
>0 1 0
>0 0 0
>
>I would really appreciate your effort!
>
>Kindly thanks,
>Gene
lulz

"Orthonormal" applies to vector sets, although an orthonormal set can be used to make an orthogonal matrix.

Has nobody mentioned the identity matrix yet as an example of "easiest to invert"?
Reply by Marcel Mueller October 24, 2016
On 24.10.16 11.00, pedro1492@lycos.com wrote:
> Simultaneous equations with random coefficients go through Gaussian
> elimination - awfully slow if you have 100+ rows.
Sure? Numerical Recipes recommends LU decomposition.
> I know Toeplitz, tridiagonal and pentadiagonal have special solutions
> that are much faster. Any other matrix that has a more efficient
> algorithm to solve?
Of course: the identity matrix, unitary matrices, orthogonal matrices ... ;-)

Marcel
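[Editor's note: for the tridiagonal case the OP mentions, the standard special solution is the Thomas algorithm, which runs in O(n) instead of O(n^3). A minimal sketch, without pivoting, so it assumes the matrix is diagonally dominant or symmetric positive definite:]

```python
import numpy as np

def thomas(a, b, c, d):
    """Solve a tridiagonal system in O(n): a is the sub-diagonal
    (a[0] unused), b the main diagonal, c the super-diagonal
    (c[-1] unused), d the right-hand side."""
    n = len(b)
    cp = np.zeros(n)
    dp = np.zeros(n)
    cp[0] = c[0] / b[0]
    dp[0] = d[0] / b[0]
    for i in range(1, n):                    # forward elimination
        m = b[i] - a[i] * cp[i - 1]
        if i < n - 1:
            cp[i] = c[i] / m
        dp[i] = (d[i] - a[i] * dp[i - 1]) / m
    x = np.zeros(n)
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):           # back substitution
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

# Check against a dense solve on a diagonally dominant test system.
n = 5
sub = np.ones(n); main = 4.0 * np.ones(n); sup = np.ones(n)
rhs = np.arange(1.0, n + 1)
full = np.diag(main) + np.diag(sub[1:], -1) + np.diag(sup[:-1], 1)
assert np.allclose(thomas(sub, main, sup, rhs), np.linalg.solve(full, rhs))
```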
Reply by Evgeny Filatov October 24, 2016
On 24.10.2016 21:35, Cedron wrote:
>> Am 24.10.16 um 11:00 schrieb pedro1492@lycos.com:
>>> Simultaneous equations with random coefficients go through Gaussian
>>> elimination - awfully slow if you have 100+ rows.
>>> I know Toeplitz, tridiagonal and pentadiagonal have special solutions
>>> that are much faster. Any other matrix that has a more efficient
>>> algorithm to solve?
>>>
>> Orthogonal matrix. It is just the transposition or complex conjugate.
>> Those are also important, think of DFT for instance, or coordinate
>> system transforms.
>>
>> Christian
>
> A small correction, I think you meant "orthonormal" instead of
> "orthogonal".
>
> My answer was going to be a diagonal matrix. They are really easy to
> invert.
>
> Ced
> ---------------------------------------
> Posted through http://www.DSPRelated.com
>
Sure thing, Ced. Could you, please, invert for me the following diagonal matrix?

1 0 0
0 1 0
0 0 0

I would really appreciate your effort!

Kindly thanks,
Gene
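[Editor's note: Gene's matrix has a zero on the diagonal, so no inverse exists. A diagonal matrix is invertible exactly when every diagonal entry is nonzero, and then the inverse is just the element-wise reciprocals, computable in O(n). A quick numpy check:]

```python
import numpy as np

D = np.diag([1.0, 1.0, 0.0])     # Gene's matrix: a zero on the diagonal
try:
    np.linalg.inv(D)
except np.linalg.LinAlgError:
    print("singular - no inverse exists")

# With all diagonal entries nonzero, inversion is element-wise:
D2 = np.diag([2.0, 4.0, 5.0])
assert np.allclose(np.linalg.inv(D2), np.diag([0.5, 0.25, 0.2]))
```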
Reply by Cedron October 24, 2016
>Am 24.10.16 um 11:00 schrieb pedro1492@lycos.com:
>> Simultaneous equations with random coefficients go through Gaussian
>> elimination - awfully slow if you have 100+ rows.
>> I know Toeplitz, tridiagonal and pentadiagonal have special solutions
>> that are much faster. Any other matrix that has a more efficient
>> algorithm to solve?
>>
>Orthogonal matrix. It is just the transposition or complex conjugate.
>Those are also important, think of DFT for instance, or coordinate
>system transforms.
>
> Christian
A small correction, I think you meant "orthonormal" instead of "orthogonal".

My answer was going to be a diagonal matrix. They are really easy to invert.

Ced
---------------------------------------
Posted through http://www.DSPRelated.com
Reply by Christian Gollwitzer October 24, 2016
Am 24.10.16 um 11:00 schrieb pedro1492@lycos.com:
> Simultaneous equations with random coefficients go through Gaussian
> elimination - awfully slow if you have 100+ rows.
> I know Toeplitz, tridiagonal and pentadiagonal have special solutions
> that are much faster. Any other matrix that has a more efficient
> algorithm to solve?
Orthogonal matrix. It is just the transposition or, in the complex case, the conjugate transpose. Those are also important; think of the DFT, for instance, or coordinate system transforms.

Christian
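[Editor's note: in numpy terms, for a real orthogonal matrix Q the inverse is simply Q^T (for a complex unitary matrix, the conjugate transpose), so "inversion" is free and solving Q x = b costs only a matrix-vector product:]

```python
import numpy as np

# Build a random orthogonal matrix from the QR factorization of a
# random one, then "invert" it by transposition alone.
rng = np.random.default_rng(0)
Q, _ = np.linalg.qr(rng.standard_normal((5, 5)))

assert np.allclose(Q.T @ Q, np.eye(5))     # Q^T really is the inverse
b = np.arange(5.0)
x = Q.T @ b                                # solves Q x = b in O(n^2)
assert np.allclose(Q @ x, b)
```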