Stated in the form of a homework problem, but it's something I'm working on for real:

Consider the similarity transform

   Q * A * Q' = B    (where Q' = transpose of Q)

where A is any real square matrix, and Q is unitary.

This is linear in A, so in general we know there must be a matrix R that satisfies

   R * A = B.

So -- is there a way to find R given Q? A general formula, perhaps? Even better, one that's expressed in matrix form? Does anyone have a suggestion for search terms that I might Google? Something more specific than "linear algebra", which is going to snow me under with stuff I already know?

In this case the matrix A is orthogonal, although not orthonormal, if that restriction helps my cause.

(I'd submit this to the applied math group, but when I checked it recently it had turned into a venue for flame wars -- any sensible questions would just go up in smoke.)

--
www.wescottdesign.com
Matrix Math, Similarity Transformations
Started by ●November 10, 2009
Reply by ●November 10, 2009
On Nov 10, 12:12 pm, Tim Wescott <t...@seemywebsite.com> wrote:
> Consider the similarity transform
>
>    Q * A * Q' = B    (where Q' = transpose of Q)
>
> where A is any real square matrix, and Q is unitary.
>
> This is linear in A, so in general we know there must be a matrix R
> that satisfies
>
>    R * A = B.
>
> So -- is there a way to find R given Q? A general formula, perhaps?
>
> In this case the matrix A is orthogonal, although not orthonormal, if
> that restriction helps my cause.

Tim, maybe I am missing something, but...

If A is orthogonal, it must be full rank and hence invertible. Thus

   R = B*inv(A)

satisfies your requirement. In terms of just Q and A, that would be:

   R = Q*A*conj(Q)*inv(A)

Are you trying to get a closed form without inverting A?
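Dilip's formula is easy to check numerically. A quick numpy sketch (the 3x3 size and the random choices of A and Q are arbitrary, just for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

# A real unitary (orthogonal) Q from a QR decomposition.
Q, _ = np.linalg.qr(rng.standard_normal((3, 3)))

# Any invertible real A will do for this check.
A = rng.standard_normal((3, 3))

B = Q @ A @ Q.T
R = B @ np.linalg.inv(A)          # R = B * inv(A), as proposed

print(np.allclose(R @ A, B))      # True -- but note R was built from this A

# The same R does not reproduce the transform for a different matrix:
A2 = rng.standard_normal((3, 3))
print(np.allclose(R @ A2, Q @ A2 @ Q.T))  # False in general
```

The catch, which the rest of the thread turns on, is that this R is a function of A as well as Q.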
Reply by ●November 10, 2009
On Nov 10, 8:01 pm, Dilip Warrier <dili...@yahoo.com> wrote:
> If A is orthogonal, it must be full rank and hence invertible.
>
> Thus, R = B*inv(A) satisfies your requirement.
>
> In terms of just Q and A, that would be:
>    R = Q*A*conj(Q)*inv(A)

If Q * A * Q' = B is "linear in A", I would take this to mean that B is a function of A, B(A) = QAQ', and that this function is supposed to be representable in the form B(A) = RA for some R depending only on Q.

I don't see why this should be possible (not saying it isn't, I just can't see why it should). Actually, I would put some effort into finding a counterexample first. That could bring some light in.
Reply by ●November 10, 2009
On Tue, 10 Nov 2009 10:01:30 -0800, Dilip Warrier wrote:
> If A is orthogonal, it must be full rank and hence invertible.
>
> Thus, R = B*inv(A) satisfies your requirement.
>
> In terms of just Q and A, that would be: R = Q*A*conj(Q)*inv(A)
>
> Are you trying to get a closed form without inverting A?

I want to know R (a) without having to compute B for a given A, and (b) in a general, symbolic form.

--
www.wescottdesign.com
Reply by ●November 10, 2009
On Nov 10, 8:25 pm, "Mr.Capsicum" <mr.capsi...@gmail.com> wrote:
> I don't see why this should be possible (not saying it isn't, I just
> can't see why it should). Actually I would put some effort to find a
> counter example first. That could bring some light in.

I poked around a bit, and it seems that if A = [a, b; c, d] and Q is in R^{2x2}, then every element of the product QAQ' depends on all of a, b, c and d. An element of R*A, by contrast, can only depend on a single column of A, so it's impossible to find an R that gives the same result while depending only on Q.

Is there any reason to think QAQ' can be written as R*A at all?
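The map A -> QAQ' really is linear -- but as an operator on the n^2 entries of A, not as left-multiplication by an n x n matrix. Writing vec() for column-stacking, the standard identity vec(X*Y*Z) = (Z' kron X)*vec(Y) gives vec(QAQ') = (Q kron Q)*vec(A), which a numpy sketch makes concrete (the 2x2 size matches the example above, but nothing depends on it):

```python
import numpy as np

rng = np.random.default_rng(1)
Q, _ = np.linalg.qr(rng.standard_normal((2, 2)))  # random 2x2 orthogonal Q
A = rng.standard_normal((2, 2))

B = Q @ A @ Q.T

# vec() stacks columns; vec(X Y Z) = (Z' kron X) vec(Y), so
# vec(Q A Q') = (Q kron Q) vec(A): a 4x4 operator on the 4 entries of A.
vecA = A.flatten(order="F")
B2 = (np.kron(Q, Q) @ vecA).reshape(2, 2, order="F")

print(np.allclose(B, B2))  # True
```

So the "R" that represents the transform lives in R^{4x4} acting on vec(A), not in R^{2x2} acting on A from the left.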
Reply by ●November 10, 2009
On Nov 10, 1:51 pm, Tim Wescott <t...@seemywebsite.com> wrote:
> I want to know R (a) without having to compute B for a given A, and
> (b) in a general, symbolic form.

OK, I think I understand now. I believe the flaw, as pointed out by the other poster, is in assuming that if a function f(A) is linear in the matrix A, then it can be represented as multiplication of A by another matrix R. That is, f(A) = R*A does not hold for all linear f.

As a counterexample, consider g(A) = transpose(A). This is linear in A, since g(alpha*A) = alpha*g(A) for any scalar alpha, and g(A + B) = g(A) + g(B). If there existed an R for g as assumed above, then using the special case of the identity matrix, g(I) = R*I = transpose(I) = I. Solving, you have R = I, i.e. g(A) = A. But A = transpose(A) is clearly false for any non-symmetric matrix.
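The transpose counterexample takes only a few lines of numpy to verify (the particular non-symmetric A is an arbitrary choice):

```python
import numpy as np

# g(A) = A' is linear, but the only candidate R (from g(I) = R*I) is R = I.
R = np.eye(2)

A = np.array([[0.0, 1.0],
              [0.0, 0.0]])      # any non-symmetric matrix works here

print(np.allclose(R @ A, A.T))  # False: R*A = A, which is not A'
```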
Reply by ●November 10, 2009
On Tue, 10 Nov 2009 11:01:02 -0800, Mr.Capsicum wrote:
> I poked around a bit and it seems that if A = [a,b;c,d] and Q in
> R^{2x2} then every element of product QAQ' depends on all of a, b, c
> and d, thus it's impossible to find R which would give the same result
> and would depend only on Q.
>
> Is there a reason to think that QAQ' is linear?

Apparently not in general!

I was thinking specifically in terms of matrix representations of quaternions, and the direction cosine matrix that you can make from them. In this case the expression _is_ linear, but there are some pretty severe restrictions on A, which appear to be necessary.

Sigh. It looks like I'm going to have to crank through this element by element. I was hoping to avoid that.

--
www.wescottdesign.com
Reply by ●November 10, 2009
> On Tue, 10 Nov 2009 11:01:02 -0800, Mr.Capsicum wrote:
>> Is there a reason to think that QAQ' is linear?
>
> Apparently not in general!
>
> I was thinking specifically in terms of matrix representations of
> quaternions, and the direction cosine matrix that you can make from
> them. In this case the expression _is_ linear, but there are some
> pretty severe restrictions on A, which appear to be necessary.

Could you give more details on your Q and A? It does seem some of your questions are phrased with the intent to have broader appeal, but that makes them more abstract than they need to be, and therefore more difficult to understand what you're really after. As pointed out in Dilip's counterexample, the fact that Q*A*Q' is linear in A doesn't mean you can represent it as R*A, with R a function only of Q, in the general case. Specific details seem crucial here.
Reply by ●November 10, 2009
On Tue, 10 Nov 2009 15:37:56 -0600, Michael Plante wrote:
> Could you give more details on your Q and A?  [snip]  As pointed out
> in Dilip's counterexample, the fact that Q*A*Q' is linear in A doesn't
> mean you can represent it as R*A, with R a function only of Q, in the
> general case. Specific details seem crucial here.

I was hoping that there _was_ a broader solution, but it appears that is not the case. In fact, it appears that the whole thing doesn't fly.

In quaternion math, you can rotate a real 3-D vector arbitrarily by using it as the imaginary part of a quaternion, then multiplying it fore and aft by a unit-length rotation quaternion and its conjugate. The result is guaranteed to be rotated in space, not rendered into 4-D, and not to have its length changed. Since quaternions can be expressed as matrices, this is where the QAQ' came from.

You can also multiply that 3-D vector by a direction cosine matrix, b = R*a.

Where I erred was in thinking that after the real 3-D vector is made into a quaternion and then a matrix, the matrix operation R*A (i.e., some analog of the DCM times the matrixified, quaternionified 3-D vector) would still hold. It doesn't, so at least I don't have to waste any more time figuring out how to find a nonexistent operation.

Now I'm grinding through this stuff element by element, making progress, and being grateful that there are only 3 spatial dimensions (in common usage, at least), which holds down the number of individual terms I need to juggle.

--
www.wescottdesign.com
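For what it's worth, the two rotation pictures described above -- q*v*conj(q) with v embedded as a pure quaternion, versus the DCM acting on v directly -- do agree, and a sketch is easy to check numerically. This assumes the Hamilton [w, x, y, z] convention; the 90-degree rotation about z is just an example:

```python
import numpy as np

def qmul(p, q):
    # Hamilton product of quaternions stored as [w, x, y, z].
    pw, px, py, pz = p
    qw, qx, qy, qz = q
    return np.array([
        pw*qw - px*qx - py*qy - pz*qz,
        pw*qx + px*qw + py*qz - pz*qy,
        pw*qy - px*qz + py*qw + pz*qx,
        pw*qz + px*qy - py*qx + pz*qw,
    ])

def dcm(q):
    # Direction cosine matrix of the rotation encoded by unit quaternion q.
    w, x, y, z = q
    return np.array([
        [1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y)],
        [2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x)],
        [2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y)],
    ])

# 90-degree rotation about the z axis.
theta = np.pi / 2
q = np.array([np.cos(theta / 2), 0.0, 0.0, np.sin(theta / 2)])
v = np.array([1.0, 0.0, 0.0])

# Embed v as a pure quaternion and rotate: q * v * conj(q).
qc = q * np.array([1, -1, -1, -1])          # conjugate of q
rotated = qmul(qmul(q, np.array([0.0, *v])), qc)[1:]

print(rotated)            # [0, 1, 0]: the x axis rotated onto the y axis
print(dcm(q) @ v)         # same result via the DCM
```

The fore-and-aft multiply is the QAQ'-shaped operation; the DCM line is the b = R*a one.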
Reply by ●November 10, 2009
In article <mL2dnb21wqbwcWTXnZ2dnUVZ_oVi4p2d@web-ster.com>, Tim Wescott <tim@seemywebsite.com> wrote:
> I was hoping that there _was_ a broader solution, but it appears that
> is not the case. In fact, it appears that the whole thing doesn't fly.

Hi Tim,

I'm very curious about the original problem. You seem to be trying to avoid computing something -- and that's a laudable goal... but you seem to be asking for help with attempted solutions. I'd really like to know what the starting point was. I may bring a different point of view to the problem.

(Of course, if you're planning to patent the solution, you can't very well tell us.)

vale,

rip

--
email address is r i p 1 AT c o m c a s t DOT n e t