DSPRelated.com
Forums

Generalized Cross Product

Started by Greg Berchin December 8, 2016
On Fri, 09 Dec 2016 06:56:08 -0600, Greg Berchin wrote:

> On Thu, 8 Dec 2016 16:28:58 -0800 (PST), navaide <navaide@gmail.com>
> wrote:
>
>>Here's an extremely simple but powerful way to generate an
>>"N"-dimensional vector orthogonal not only to two but to the whole
>>bunch.
>
> Yes, that looks like the wedge product. I saw several references to it,
> but they all came with warnings that the vector derived in this manner
> was in a different vector space than the vectors from which it was
> derived. I haven't had enough time to dig into it deeply enough to grok
> the implications.
>
> Thanks,
> Greg
"In a different vector space from" and "orthogonal to" are, I'm 99.44% sure, the same statement. If you're seeking a third vector that's orthogonal to the two you have, with no particular care about the N-3 dimensions that you're not addressing*, and if N is in the 100's, then I think this determinant- finding approach is going to be much more computationally intensive than other approaches. You haven't mentioned if your two starting vectors are orthogonal -- are they? Assuming they aren't: Start with V1 and V2. You want to find V3 such that V1 dot V3 = 0 and V2 dot V3 = 0. Find V2a = V2 - V1 * (V1 dot V2)/ norm(V1)^2. V2a is (if I'm getting my math right) orthogonal to V1. If it's zero, bomb out or choose another V2. Arbitrarily choose a V3a. Find V3 = V3a - V1 * (V1 dot V3a) / norm(V1)^2 - V2a * (V2a dot V3a) / norm(V2a)^2. If V3 is zero, bomb out or choose another V3a. Otherwise, you're done. * which just seems weird to me, and was why I was questioning your approach earlier. Sorry, but it just _feels_ like you're missing something -- or maybe N-3 somethings. -- www.wescottdesign.com
On Thu, 08 Dec 2016 14:55:53 -0800, ethan wrote:

>> I have a pair of N-dimensional vectors, N large (hundreds), and I need
>> to generate another N-dimensional vector that is orthogonal to the
>> first two. It doesn't have to be unique; it doesn't even have to be
>> perfectly orthogonal, as long as the normalized dot products are less
>> than, oh, about 0.005.
>
> Just look at the first three indices. If those two 3-vectors are
> nonparallel and nonzero, take their cross product. That gives you the
> first three entries of your vector, and you can fill the rest with
> zeros.
>
> If the cross product is zero, pick another set of three indices and try
> again until it works.
That could get you a really sucky vector, if the three you pick are
mostly parallel.

--
Tim Wescott
Wescott Design Services
http://www.wescottdesign.com

I'm looking for work -- see my website!
On Fri, 09 Dec 2016 11:37:06 -0600, Tim Wescott
<tim@seemywebsite.really> wrote:

> If you're seeking a third vector that's orthogonal to the two you have,
> with no particular care about the N-3 dimensions that you're not
> addressing*, and if N is in the 100's, then I think this
> determinant-finding approach is going to be much more computationally
> intensive than other approaches.
My hunch was the same about this.
> You haven't mentioned if your two starting vectors are orthogonal -- are
> they? Assuming they aren't:
They weren't designed to be orthogonal, but by coincidence their normalized dot product is <0.0004. That's close enough to orthogonal for my purposes.
> Find V2a = V2 - V1 * (V1 dot V2) / norm(V1)^2. V2a is (if I'm getting my
> math right) orthogonal to V1.
Yes, now that I have re-reviewed my linear algebra, this is a standard orthogonal projection technique. Even though my current V2 is very nearly orthogonal to V1, I will probably do this to make that last little adjustment.
> Arbitrarily choose a V3a.
>
> Find
> V3 = V3a - V1 * (V1 dot V3a) / norm(V1)^2 -
>      V2a * (V2a dot V3a) / norm(V2a)^2.
Right. My intent is to compute several of them, and then examine them for other characteristics that I have not listed here.
> * which just seems weird to me, and was why I was questioning your
> approach earlier. Sorry, but it just _feels_ like you're missing
> something -- or maybe N-3 somethings.
I have three communications channels, and I need orthogonal pilot symbols because there is significant crosstalk. Thanks again, Greg
On Fri, 09 Dec 2016 15:29:35 -0600, Greg Berchin
<gjberchin@chatter.net.invalid> wrote:

> [quote of Greg's previous message snipped]
Oh, now I think I see what you're doing.

FWIW, I usually do such things by exhaustive search or similar
processes, but with several hundred dimensions that might get difficult.
The benefit of an exhaustive (or sparsely "exhaustive") search is that
you can catalog/classify a number of mutually "orthogonal" vectors of
varying quality or sufficiency, especially if they also need other
characteristics like autocorrelation behavior, etc. I've not had to do
it with very long vectors like that before, though.

I had to dig a little to see that Gram-Schmidt orthonormalization and
orthogonalization are essentially the same thing. I had heard of it in
the context of "orthonormalization" before, but expected it was, at
worst, nearly the same thing.
> That could get you a really sucky vector, if the three you pick are
> mostly parallel.

Hence the "try again until it works" step.

To spell out a simple algorithm for picking three workable indices: look
for an index where either V1 or V2 is nonzero. That's your first index.
For the second index, you just need to make sure that the 2-vectors you
get by looking at these two indices are not parallel. For the third
index you can pick any one you want. Voila, nonparallel 3-vectors.

Given the stated requirements, I think it would be overkill to do any
computation involving every element of the vectors.

-Ethan
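A small NumPy sketch of Ethan's index-picking trick (the helper name,
default indices, and error handling are illustrative assumptions,
presuming real-valued vectors):

    import numpy as np

    def sparse_orthogonal(v1, v2, idx=(0, 1, 2)):
        # Cross the two 3-vectors formed by restricting v1 and v2 to
        # three indices, then pad with zeros. Because v3 is zero
        # everywhere else, the full-length dot products reduce to the
        # 3-element ones, which the cross product makes exactly zero.
        idx = list(idx)
        c = np.cross(v1[idx], v2[idx])
        if np.allclose(c, 0.0):   # restricted 3-vectors parallel or zero
            raise ValueError("pick another set of three indices")
        v3 = np.zeros_like(v1, dtype=float)
        v3[idx] = c
        return v3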
On 12/09/16 16:29, Greg Berchin wrote:
> [quote of Greg's previous message snipped]
FWIW, if you make a matrix A whose (two) rows are the two vectors you
start with, then you are asking for a method of finding an element of
the nullspace (a.k.a. the kernel) of A, i.e. the set of solutions x of
the equation Ax = 0.

Algorithms for finding a basis for the nullspace of an arbitrary matrix
are standard linear algebra (Wikipedia gives one using Gaussian
elimination on its page for kernels in linear algebra, and as other
posters have indicated you can also use other matrix factorization
algorithms to solve this). Once you have a basis for the nullspace, I'd
expect it would be easier to look for vectors satisfying the additional
conditions you alluded to.

Hope this is helpful!

Robert Beaudoin
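For illustration, a short sketch of one standard way to compute such a
basis, here via the SVD rather than Gaussian elimination (the function
name and tolerance are assumptions):

    import numpy as np

    def nullspace_basis(v1, v2, tol=1e-10):
        # The right singular vectors of A = [v1; v2] whose singular
        # values are (numerically) zero span the solutions of A x = 0.
        A = np.vstack([v1, v2])
        _, s, Vt = np.linalg.svd(A)
        rank = int(np.sum(s > tol))
        return Vt[rank:].T   # columns: N - rank orthonormal vectors

Any linear combination of the returned columns is then orthogonal, to
machine precision, to both starting vectors.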
On Sat, 10 Dec 2016 23:29:10 -0500, "Robert E. Beaudoin"
<rbeaudoin@comcast.net> wrote:

> FWIW, if you make a matrix A whose (two) rows are the two vectors you
> start with, then you are asking for a method of finding an element of
> the nullspace (a.k.a. the kernel) of A, i.e. the set of solutions x of
> the equation Ax = 0.
Yes; once somebody reminded me that this was a linear algebra problem, and not just specifically a cross-product problem, it all fell into place. Thanks to all, Greg
On Sunday, December 11, 2016 at 7:57:51 AM UTC-5, Greg Berchin wrote:
> On Sat, 10 Dec 2016 23:29:10 -0500, "Robert E. Beaudoin"
> <rbeaudoin@comcast.net> wrote:
>
> >FWIW, if you make a matrix A whose (two) rows are the two vectors you
> >start with, then you are asking for a method of finding an element of
> >the nullspace (a.k.a. the kernel) of A, i.e. the set of solutions x of
> >the equation Ax = 0.
>
> Yes; once somebody reminded me that this was a linear algebra problem,
> and not just specifically a cross-product problem, it all fell into
> place.
>
> Thanks to all,
> Greg
With your two vectors you can form a projection matrix -- call it P --
which will look very similar to the matrix formula for the least squares
solution. You can then easily find a projection matrix onto the
corresponding null space (or kernel) via (I - P), where I is the
identity matrix. You can then perform an SVD of (I - P); you will then
have a set of vectors which are all orthogonal to each other, so any
linear combination of them would also suit your requirements. You can
then choose a linear combination of them that meets any other
requirements you have.

Note: in this case the inner product would be zero (or at least to
machine and algorithm tolerance). Also, this approach may be a little
overkill for what you actually need.

A good book, that's also very readable, is Gilbert Strang's
"Introduction to Linear Algebra".

Hope that helps.

Cheers,
David
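A rough sketch of David's construction, assuming the two starting
vectors are linearly independent (the function name is illustrative):

    import numpy as np

    def nullspace_projector_basis(v1, v2):
        # P projects onto span{v1, v2}; it has the same form as the
        # least-squares "hat" matrix. I - P projects onto the nullspace,
        # and its SVD returns an orthonormal set spanning that space.
        A = np.vstack([v1, v2])                   # 2 x N
        P = A.T @ np.linalg.inv(A @ A.T) @ A      # N x N projector
        U, s, _ = np.linalg.svd(np.eye(len(v1)) - P)
        # I - P has singular values of exactly 0 or 1; keep the
        # directions with singular value 1 (there are N - 2 of them).
        return U[:, s > 0.5]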
A randomly chosen vector will be orthogonal to both of them in
high-dimensional space, to within a minute error. If you need an
orthogonal vector that depends on both vectors, then maybe you should
play around with random projections.
I'll post this again (sorry):
https://drive.google.com/open?id=0BwsgMLjV0BnhOGNxOTVITHY1U28

If your 2 vectors are A and B, then maybe O = RP(A) + RP(B), or do you
want to entangle them in some non-linear way?
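A quick numerical check of the first claim (sizes and seed are
arbitrary; the random-projection step "RP" is not spelled out here):

    import numpy as np

    # In high dimensions, a random vector's normalized dot product with
    # any fixed vector is typically on the order of 1/sqrt(N).
    rng = np.random.default_rng(0)
    N = 500
    v1 = rng.standard_normal(N)
    r = rng.standard_normal(N)
    print(np.dot(v1, r) / (np.linalg.norm(v1) * np.linalg.norm(r)))

For N in the hundreds that value is typically a few percent, so whether
a single random draw is orthogonal enough depends on the tolerance
required.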
On Sat, 6 May 2017 01:40:58 -0700 (PDT), bitterlemon40@yahoo.ie wrote:

> A randomly chosen vector will be orthogonal to both of them in
> high-dimensional space, to within a minute error.
I left out the constraints that all three need to have (nearly) the same magnitude and (nearly) the same peak-to-average ratio. It's a fixed-point implementation. Thank you, Greg