
OFDM

Started by manishp October 20, 2012
On 10/22/12 3:03 PM, glen herrmannsfeldt wrote:
> Randy Yates<yates@digitalsignallabs.com> wrote:
>> eric.jacobsen@ieee.org (Eric Jacobsen) writes:
>
> (snip on orthogonality)
>
>>> Stated more simply, if the dot product of two vectors is zero, then
>>> they are orthogonal to each other.
>
>> That is one definition. There are others, e.g., two "continuous-time"
>> signals f(t) and g(t) are said to be orthogonal over some interval T if
>
>> \int_{\tau}^{\tau + T} f(t) \cdot g(t) dt = 0,
>
>> where \tau is a given point in time (usually either 0 or -T/2).
>
> Well, as I understand the mathematics, first you extend the vector to
> infinity and call it a function. (It might still have finite bounds,
> but defined at an infinite number of points.)
but, to be completely anal about it, then you have to know that the function is Riemann integrable. some of them ain't, but i doubt anyone present will really need to deal with such nastiness.
> The same extension gets you from the Fourier series to the Fourier
> transform.
yup.
> I still remember NOT learning the difference between Fourier series
> and transform, as taught by my physics TA at 9:00 AM. That was the
> first time I had to extend the idea of a vector to infinity, and it
> was too much to learn so quickly.
for me, the light bulb went on in my first communications course using the book by AB Carlson.

the book demonstrated the idea nicely with a fixed width rectangular pulse periodically extended with the period going out to infinity. all doable if you can assume the improper integral is a limit of a proper integral and that everything is fine with Riemann. your series becomes a Riemann summation and then a Riemann integral.

--

r b-j                  rbj@audioimagination.com

"Imagination is more important than knowledge."
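A quick numerical sketch of that Carlson-style demonstration (my own toy code and parameter names, not taken from the book): for a rect pulse of width tau repeated with period T, the scaled Fourier series coefficients T*c_k coincide with samples of the lone pulse's transform X(f) = tau*sinc(f*tau) taken at f = k/T, so as T goes out to infinity the line spectrum fills in the continuous transform.

import numpy as np

def scaled_fs_coeffs(tau, T, kmax, nsamp=4096):
    """Fourier series coefficients c_k of a width-tau rect pulse repeated
    with period T, computed as a Riemann sum, then scaled by T."""
    t = np.linspace(-T/2, T/2, nsamp, endpoint=False)
    dt = T / nsamp
    x = (np.abs(t) < tau/2).astype(float)            # one period of the pulse train
    k = np.arange(-kmax, kmax + 1)
    # c_k = (1/T) * integral over one period of x(t) exp(-j 2 pi k t / T) dt
    ck = (x[None, :] * np.exp(-2j*np.pi*np.outer(k, t)/T)).sum(axis=1) * dt / T
    return k / T, T * ck                             # frequency grid k/T and T*c_k

tau = 1.0
X = lambda f: tau * np.sinc(f * tau)                 # Fourier transform of the lone pulse
for T in (2.0, 8.0, 32.0):
    f, Tck = scaled_fs_coeffs(tau, T, kmax=int(4*T))
    print(f"T = {T:5.1f}: grid spacing {1/T:.4f} Hz, "
          f"max |T*c_k - X(k/T)| = {np.max(np.abs(Tck - X(f))):.1e}")

The small discrepancy printed is just Riemann-sum error; the point is that the only thing that changes as T grows is how finely the grid k/T samples the same X(f).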
robert bristow-johnson <rbj@audioimagination.com> wrote:

>> (snip on orthogonality)
>>>> Stated more simply, if the dot product of two vectors is zero, then
>>>> they are orthogonal to each other.
>>> That is one definition. There are others, e.g., two "continuous-time"
>>> signals f(t) and g(t) are said to be orthogonal over some interval T if
>>> \int_{\tau}^{\tau + T} f(t) \cdot g(t) dt = 0,
>>> where \tau is a given point in time (usually either 0 or -T/2).
(then I wrote)
>> Well, as I understand the mathematics, first you extend the vector to
>> infinity and call it a function. (It might still have finite bounds,
>> but defined at an infinite number of points.)
> but, to be completely anal about it, then you have to know that the
> function is Riemann integrable. some of them ain't, but i doubt anyone
> present will really need to deal with such nastiness.
Well, either that or you need a weight function that I didn't put in before, but should really have. So, only the product of the two functions, times the weight function, has to be Riemann integrable, often due to the weight function alone.
>> The same extension gets you from the Fourier series to the Fourier
>> transform.
> yup.
>> I still remember NOT learning the difference between Fourier series
>> and transform, as taught by my physics TA at 9:00 AM. That was the
>> first time I had to extend the idea of a vector to infinity, and it
>> was too much to learn so quickly.
> for me, the light bulb went on in my first communications course using
> the book by AB Carlson.
> the book demonstrated the idea nicely with a fixed width rectangular
> pulse periodically extended with the period going out to infinity. all
> doable if you can assume the improper integral is a limit of a proper
> integral and that everything is fine with Riemann. your series becomes
> a Riemann summation and then a Riemann integral.
-- glen
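A concrete instance of the weight-function point above (my own example, not glen's): the Hermite polynomials H0(x) = 1 and H2(x) = 4x^2 - 2 are orthogonal over the whole real line only with respect to the weight w(x) = exp(-x^2); the bare product isn't integrable there at all, but the weighted product is, and its integral is zero.

import numpy as np

# Weighted inner product <H0, H2>_w = integral of H0(x) H2(x) exp(-x^2) dx.
# The weight is what makes the integral exist; by symmetry of the Gaussian
# moments it also makes it vanish.
x = np.linspace(-10.0, 10.0, 200_001)     # exp(-x^2) is ~0 well before the ends
dx = x[1] - x[0]
h0 = np.ones_like(x)
h2 = 4.0 * x**2 - 2.0
w = np.exp(-x**2)

print(f"<H0, H2>_w ~= {np.sum(h0 * h2 * w) * dx:.2e}")   # ~ 0
print(f"<H2, H2>_w ~= {np.sum(h2 * h2 * w) * dx:.4f}")   # = 8*sqrt(pi), about 14.18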
On 10/22/2012 10:44 AM, manishp wrote:
>> Frequency elements a and b are orthogonal if
>>
>>    +infinity
>>    INTEGRAL(f(a)*f(b))dt=0
>>    -infinity
>
> Hello Jerry,
>
> Since convolution includes integration already, does the above equation mean
> there is a double integration. First due to convolution and then over the
> resultant signal. Can you please clarify?
As far as I know, convolution is not involved with a definition of orthogonality.

Jerry

--
Engineering is the art of making what you want from things you can get.
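To make that concrete (a toy check of my own, not from the thread): the orthogonality test is a single pointwise multiply followed by a single integral -- there is no convolution, hence no second integration. For example, 2 Hz and 5 Hz sinusoids over a one-second interval:

import numpy as np

fs = 10_000                                   # samples per second for the Riemann sum
t = np.arange(0.0, 1.0, 1.0 / fs)             # the interval T = 1 s
f = np.sin(2 * np.pi * 2 * t)                 # 2 Hz
g = np.sin(2 * np.pi * 5 * t)                 # 5 Hz

print(f"<f, g> ~= {np.sum(f * g) / fs:.1e}")  # ~ 0: orthogonal over this interval
print(f"<f, f> ~= {np.sum(f * f) / fs:.3f}")  # = 0.5: a signal is never orthogonal to itself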
eric.jacobsen@ieee.org (Eric Jacobsen) writes:

> On Mon, 22 Oct 2012 13:33:08 -0400, Randy Yates
> <yates@digitalsignallabs.com> wrote:
>
>>eric.jacobsen@ieee.org (Eric Jacobsen) writes:
>>
>>> On Mon, 22 Oct 2012 09:44:34 -0500, "manishp" <58525@dsprelated>
>>> wrote:
>>>
>>>>>Frequency elements a and b are orthogonal if
>>>>>
>>>>>   +infinity
>>>>>   INTEGRAL(f(a)*f(b))dt=0
>>>>>   -infinity
>>>>
>>>>Hello Jerry,
>>>>
>>>>Since convolution includes integration already, does the above equation mean
>>>>there is a double integration. First due to convolution and then over the
>>>>resultant signal. Can you please clarify?
>>>>
>>>>Thanks, Manish ...
>>>
>>> Stated more simply, if the dot product of two vectors is zero, then
>>> they are orthogonal to each other.
>>
>>That is one definition. There are others, e.g., two "continuous-time"
>>signals f(t) and g(t) are said to be orthogonal over some interval T if
>>
>> \int_{\tau}^{\tau + T} f(t) \cdot g(t) dt = 0,
>>
>>where \tau is a given point in time (usually either 0 or -T/2).
>>--
>
> I don't know what script language that is or what it really says, but
> it looks to me like it's either a dot product or inner product. If
> so, it's essentially the same definition of orthogonality.
It's (La)TeX. Punch it in here to see:

  http://www.codecogs.com/latex/eqneditor.php

Is summing a countable number of values the same as integration? If not,
then this is a different definition. Same concept at some level, but at
the base operations level it's different.
--
Randy Yates
Digital Signal Labs
http://www.digitalsignallabs.com
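The countable-sum version being alluded to can be checked directly (my own two-line example, with N and the bin numbers chosen arbitrarily): sample the two signals and replace the integral over [tau, tau+T] with a finite dot product over one block of N samples.

import numpy as np

N = 64
n = np.arange(N)
e = lambda k: np.exp(2j * np.pi * k * n / N)     # sampled complex sinusoid on "bin" k

print(abs(np.vdot(e(3), e(7))))   # ~ 0  : distinct bins, orthogonal over the block
print(abs(np.vdot(e(3), e(3))))   # ~ 64 : a bin against itself gives the energy N

np.vdot conjugates its first argument, so this is the usual complex inner product; the finite sum plays the role of the integral.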
On Mon, 22 Oct 2012 17:35:25 -0400, Jerry Avins wrote:

> On 10/22/2012 10:44 AM, manishp wrote:
>>> Frequency elements a and b are orthogonal if
>>>
>>>    +infinity
>>>    INTEGRAL(f(a)*f(b))dt=0
>>>    -infinity
>>
>> Hello Jerry,
>>
>> Since convolution includes integration already, does the above equation
>> mean there is a double integration. First due to convolution and then
>> over the resultant signal. Can you please clarify?
>
> As far as I know, convolution is not involved with a definition of
> orthogonality.
>
> Jerry
Convolution is not involved with the definition of orthogonality. I
suppose that you could come up with _a_ definition of orthogonality that
included convolution, if you were in a perverse mood.

--
My liberal friends think I'm a conservative kook.
My conservative friends think I'm a liberal kook.
Why am I not happy that they have found common ground?

Tim Wescott, Communications, Control, Circuits & Software
http://www.wescottdesign.com
On Mon, 22 Oct 2012 09:44:34 -0500, manishp wrote:

>>Frequency elements a and b are orthogonal if
>>
>>   +infinity
>>   INTEGRAL(f(a)*f(b))dt=0
>>   -infinity
>
> Hello Jerry,
>
> Since convolution includes integration already, does the above equation
> mean there is a double integration. First due to convolution and then
> over the resultant signal. Can you please clarify?
How did convolution get into this? You were asking about orthogonality
in the frequency domain -- that has no direct relationship to convolution.

--
My liberal friends think I'm a conservative kook.
My conservative friends think I'm a liberal kook.
Why am I not happy that they have found common ground?

Tim Wescott, Communications, Control, Circuits & Software
http://www.wescottdesign.com
Tim Wescott <tim@seemywebsite.com> wrote:
> On Mon, 22 Oct 2012 09:44:34 -0500, manishp wrote:
>>>Frequency elements a and b are orthogonal if
>>>
>>>   +infinity
>>>   INTEGRAL(f(a)*f(b))dt=0
>>>   -infinity
>> Hello Jerry,
>> Since convolution includes integration already, does the above equation
>> mean there is a double integration. First due to convolution and then
>> over the resultant signal. Can you please clarify?
> How did convolution get into this? You were asking about orthogonality
> in the frequency domain -- that has no direct relationship to convolution.
Jerry used *, the symbol for multiply in most computer languages,
but also similar to the symbol some use for convolution.

-- glen
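For what it's worth, a two-line illustration of the two readings of the asterisk (my own, just to show how different the operations are):

import numpy as np

a = np.array([1.0, 2.0, 3.0])
b = np.array([4.0, 5.0, 6.0])

print(a * b)              # pointwise multiply, as in Jerry's formula: [ 4. 10. 18.]
print(np.convolve(a, b))  # convolution: [ 4. 13. 28. 27. 18.]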
>Tim Wescott <tim@seemywebsite.com> wrote:
>> On Mon, 22 Oct 2012 09:44:34 -0500, manishp wrote:
>
>>>>Frequency elements a and b are orthogonal if
>>>>
>>>>   +infinity
>>>>   INTEGRAL(f(a)*f(b))dt=0
>>>>   -infinity
>
>>> Hello Jerry,
>
>>> Since convolution includes integration already, does the above equation
>>> mean there is a double integration. First due to convolution and then
>>> over the resultant signal. Can you please clarify?
>
>> How did convolution get into this? You were asking about orthogonality
>> in the frequency domain -- that has no direct relationship to convolution.
>
>Jerry used *, the symbol for multiply in most computer languages,
>but also similar to the symbol some use for convolution.
>
>-- glen
Yes. I assumed the asterisk operator above to be the convolution operator.
My mistake indeed ...

But coming back to the main question: I do understand that the sub-carriers
are orthogonal, but the question is, in what way is this property useful?
That is, orthogonality of sub-carriers ...
Randy Yates <yates@digitalsignallabs.com> writes:

> eric.jacobsen@ieee.org (Eric Jacobsen) writes:
>
>> On Mon, 22 Oct 2012 13:33:08 -0400, Randy Yates
>> <yates@digitalsignallabs.com> wrote:
>>
>>>eric.jacobsen@ieee.org (Eric Jacobsen) writes:
>>>
>>>> On Mon, 22 Oct 2012 09:44:34 -0500, "manishp" <58525@dsprelated>
>>>> wrote:
>>>>
>>>>>>Frequency elements a and b are orthogonal if
>>>>>>
>>>>>>   +infinity
>>>>>>   INTEGRAL(f(a)*f(b))dt=0
>>>>>>   -infinity
>>>>>
>>>>>Hello Jerry,
>>>>>
>>>>>Since convolution includes integration already, does the above equation mean
>>>>>there is a double integration. First due to convolution and then over the
>>>>>resultant signal. Can you please clarify?
>>>>>
>>>>>Thanks, Manish ...
>>>>
>>>> Stated more simply, if the dot product of two vectors is zero, then
>>>> they are orthogonal to each other.
>>>
>>>That is one definition. There are others, e.g., two "continuous-time"
>>>signals f(t) and g(t) are said to be orthogonal over some interval T if
>>>
>>> \int_{\tau}^{\tau + T} f(t) \cdot g(t) dt = 0,
>>>
>>>where \tau is a given point in time (usually either 0 or -T/2).
>>>--
>>
>> I don't know what script language that is or what it really says, but
>> it looks to me like it's either a dot product or inner product. If
>> so, it's essentially the same definition of orthogonality.
>
> It's (La)TeX. Punch it in here to see:
>
>   http://www.codecogs.com/latex/eqneditor.php
>
> Is summing a countable number of values the same as integration? If not,
> then this is a different definition. Same concept at some level, but at
> the base operations level it's different.
From [herstein, p.193]:

  DEFINITION  The vector space V over F is said to be an _inner product
  space_ if there is DEFINED for any two vectors u, v \in V an element
  (u, v) in F such that [...].

So they're both inner products but each one has to be defined to satisfy
these properties depending on the space.

--Randy

@book{herstein,
  title     = "Topics in Algebra",
  author    = "I.N. Herstein",
  publisher = "Wiley",
  edition   = "second",
  year      = "1975"}

--
Randy Yates
Digital Signal Labs
http://www.digitalsignallabs.com
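For reference, the usual inner-product axioms (paraphrased from memory in the same TeX notation, not Herstein's exact wording) are, for all u, v, w \in V and scalars a, b \in F:

  (u, v) = \overline{(v, u)}                          (conjugate symmetry)
  (a u + b w, v) = a (u, v) + b (w, v)                (linearity in the first argument)
  (u, u) \ge 0, with equality only when u = 0         (positive definiteness)

The continuous-time integral earlier in the thread is just one particular choice of (f, g) on a space of signals.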
On 10/23/12 4:31 AM, manishp wrote:
>
> Yes. I assumed the asterisk operator above to be the convolution operator.
> My mistake indeed ...
it's the problem with ASCII math (maybe one reason Randy used TeX script, but then you gotta know what "\cdot" means).

in my opinion, the naked asterisk is an unfortunate choice for convolution in the textbooks and other lit. i would have put that asterisk in a little circle to make it look like it is "more" than multiplication. in ASCII math, i surround it with parenths to make it look like something else:

   y(t) = h(t) (*) x(t)

i am now trying to get out of the habit of using * for multiplication on these pages, but sometimes it's unavoidable.
>
> But coming back to the main question: I do understand that the sub-carriers
> are orthogonal, but the question is, in what way is this property useful?
> That is, orthogonality of sub-carriers ...
you can put different and unrelated information of similar bandwidth on the two different carriers if they are orthogonal.

--

r b-j                  rbj@audioimagination.com

"Imagination is more important than knowledge."
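A toy sketch of how that pays off (my own code, not a full OFDM modem; N = 64, the bin choices, and the QPSK mapping are arbitrary): put three independent QPSK symbols on subcarriers spaced 1/T apart. Over the symbol interval they are orthogonal, so the receiver pulls each one back out by correlating against its own subcarrier, even though the subcarrier spectra overlap.

import numpy as np

N = 64                                      # samples per OFDM symbol interval T
n = np.arange(N)
bins = [3, 7, 11]                           # subcarrier indices (spacing 1/T)
sent = np.exp(1j * np.pi / 4 * np.array([1, 3, 7]))   # one QPSK symbol per subcarrier

# Transmit: sum of subcarriers, each scaled by its own data symbol.
tx = sum(s * np.exp(2j * np.pi * k * n / N) for s, k in zip(sent, bins))

# Receive: correlate against each subcarrier over the same interval.
for s, k in zip(sent, bins):
    est = np.vdot(np.exp(2j * np.pi * k * n / N), tx) / N
    print("bin", k, "sent", np.round(s, 3), "recovered", np.round(est, 3))

Each symbol comes back untouched because every other subcarrier correlates to exactly zero over the interval -- that is the orthogonality doing the work, and it is in effect what the DFT in an OFDM receiver does for all subcarriers at once.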