"Dave" <dspguy2@netscape.net> wrote in message
news:d656b993-fb08-4a41-8806-0bfa0978e5b2@d77g2000hsb.googlegroups.com...
> On Jun 28, 4:41 pm, Thomas Richter <t...@math.tu-berlin.de> wrote:
>
>> I'm afraid you don't understand - they *do* form a (Schauder-) basis of
>> L^2, provably. But this only implies convergence in L^2 (mean-square
>> error) sense, and not in pointwise sense. If you say "basis", you also
>> need to define "norm". Actually, elements in L^2 are *not* functions,
>> but equivalence classes of them where, specifically, function values
>> *at* a point do not make sense, so to say that it "does not converge
>> pointwise" makes little sense in an L^2 space.
>>
>> All this sounds like mathematical nit-picking, but it is important to
>> use a proper definition of words specifically in the
>> infinite-dimensional cases or you get lost easily.
>
> I will admit that this area is not my forte, so I am a bit out of my
> depth - so take what I say with a grain of salt. I agree with
> everything you've said, but I specifically didn't mention a norm.
> All I meant to do was point out that it doesn't converge at the one
> particular point i.e. the discontinuity.
>
> I'm sure there must exist some norm for which the Fourier transform
> doesn't form a basis. Perhaps the infinity norm? Perhaps you can tell
> me?
>
> Cheers,
> Dave
It's the "basis set" that "forms the basis". I don't know if the norm
determines that. To me it's always been about whether you can construct and
uniquely solve a set of equations on a set of points using that basis. The
objective of the equations - the norm - may not be important. But I'd not
want to comment on L^3, L^4, etc.
For example, you can use the Fourier cos/sin basis along with the infinity
norm - that's what the Remez exchange algorithm does, or can do, as in the
Parks-McClellan implementation... well, at least the cos basis.
So, the Fourier transform yields an L2 approximation and the Remez exchange
an L(inf) approximation, etc.
Fred
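Fred's point that the choice of norm picks a different "best" approximation can be seen in the simplest possible setting: approximating a handful of data points by a single constant. The sketch below is illustrative only (the data values are made up, not from the thread).

```python
# Best constant approximation of the same data under two different norms.
data = [0.0, 0.0, 0.0, 1.0]

# L2: the mean minimizes the sum of squared errors.
l2_best = sum(data) / len(data)

# L-infinity: the midrange minimizes the worst-case (maximum) error.
linf_best = (min(data) + max(data)) / 2

l2_worst_case = max(abs(v - l2_best) for v in data)
linf_worst_case = max(abs(v - linf_best) for v in data)

print(l2_best, linf_best)              # 0.25 0.5
print(l2_worst_case, linf_worst_case)  # 0.75 0.5
```

The two optima differ, just as a truncated Fourier series (L2-optimal) differs from an equiripple Remez solution (L(inf)-optimal) for the same filter specification.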
Reply by Dave●July 3, 2008
On Jun 28, 4:41 pm, Thomas Richter <t...@math.tu-berlin.de> wrote:
> I'm afraid you don't understand - they *do* form a (Schauder-) basis of
> L^2, provably. But this only implies convergence in L^2 (mean-square
> error) sense, and not in pointwise sense. If you say "basis", you also
> need to define "norm". Actually, elements in L^2 are *not* functions,
> but equivalence classes of them where, specifically, function values
> *at* a point do not make sense, so to say that it "does not converge
> pointwise" makes little sense in an L^2 space.
>
> All this sounds like mathematical nit-picking, but it is important to
> use a proper definition of words specifically in the
> infinite-dimensional cases or you get lost easily.
I will admit that this area is not my forte, so I am a bit out of my
depth - so take what I say with a grain of salt. I agree with
everything you've said, but I specifically didn't mention a norm.
All I meant to do was point out that it doesn't converge at the one
particular point i.e. the discontinuity.
I'm sure there must exist some norm for which the Fourier transform
doesn't form a basis. Perhaps the infinity norm? Perhaps you can tell
me?
Cheers,
Dave
Reply by Thomas Richter●June 28, 2008
Dave wrote:
>> Whether there are complete bases of function spaces is a matter of
>> belief (namely, whether you trust the axiom of choice). For all practical
>> matters, Schauder bases like the free oscillations (cos/sin or exp(ix))
>> are good enough. IOW, you can approximate any function as close as you
>> wish with sinusoids in the l^2 sense. They do not form a Schauder basis in
>> the l^\infty sense of the piecewise continuous functions, that's true, but a
>> different statement.
>
> The Fourier series doesn't converge at the point of the
> discontinuity and there is always a finite error, and it cannot be
> made infinitely small - at least from what I remember.
Look, "convergence" requires a norm, otherwise this word makes no sense.
And yes, indeed, the cos/sin series *do* converge to any (non-continuous
or continuous) function *in L^2 sense*. I never claimed that they do
converge in a *pointwise* sense, which is something entirely different.
> I agree with you that in most cases the representation is sufficient.
This is not a matter of "what happens in most cases", it is really a
matter of picking the mathematically suitable function space. (-:
>>> It is like trying
>>> to represent 3D space with only 2 vectors.
>> No, not at all. In 3D space, you will always have a positive error for
>> arbitrary vectors. Here, you can make the error infinitely small.
>
>
> Since the sin/cos basis doesn't form a basis for discontinuous
> functions my analogy still stands. To represent the discontinuous
> functions you start getting into the theory of distributions.
I'm afraid you don't understand - they *do* form a (Schauder-) basis of
L^2, provably. But this only implies convergence in L^2 (mean-square
error) sense, and not in pointwise sense. If you say "basis", you also
need to define "norm". Actually, elements in L^2 are *not* functions,
but equivalence classes of them where, specifically, function values
*at* a point do not make sense, so to say that it "does not converge
pointwise" makes little sense in an L^2 space.
All this sounds like mathematical nit-picking, but it is important to
use a proper definition of words specifically in the
infinite-dimensional cases or you get lost easily.
So long,
Thomas
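Thomas's distinction between L^2 and pointwise convergence is easy to check numerically. A Python sketch (the term counts and sample density are arbitrary choices): for the ±1 square wave, the truncated Fourier series' mean-square error shrinks as terms are added, while the peak overshoot near the jump does not vanish - it settles near the Gibbs value (2/pi)·Si(pi) ≈ 1.179, i.e. roughly 9% of the jump.

```python
import math

def partial_sum(x, n_terms):
    # Fourier series of the +/-1 square wave: (4/pi) * sum sin((2k+1)x)/(2k+1)
    return (4.0 / math.pi) * sum(
        math.sin((2 * k + 1) * x) / (2 * k + 1) for k in range(n_terms)
    )

def peak_and_rms(n_terms, samples=2000):
    # Sample the open interval (0, pi), where the square wave equals +1.
    xs = [i * math.pi / samples for i in range(1, samples)]
    vals = [partial_sum(x, n_terms) for x in xs]
    peak = max(vals)
    rms = math.sqrt(sum((v - 1.0) ** 2 for v in vals) / len(vals))
    return peak, rms

for n in (5, 25, 100):
    peak, rms = peak_and_rms(n)
    print(f"{n:4d} terms: peak = {peak:.4f}, rms error = {rms:.4f}")
```

The rms column drops steadily (L^2 convergence) while the peak column stays up around 1.18 (no pointwise convergence at the jump), which is exactly the distinction being argued here.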
Reply by SteveSmith●June 27, 2008
>Hi,
>I'm trying to learn more about the Gibbs phenomenon. I found various sites
>explaining about what the Gibbs phenomenon is, but none of them explained the
>reason behind the phenomenon.
>Can anyone please explain the reason for the gibbs phenomenon or send me
>links about the same.
>Thanks in advance.
>
Here's some reading on the Gibbs effect.
Steve
http://www.dspguide.com/ch11/4.htm
P.S. download the pdf version from the page; the HTML graphic isn't
detailed enough.
Reply by Dave●June 27, 2008
On Jun 27, 2:49 am, Thomas Richter <t...@math.tu-berlin.de> wrote:
> Dave schrieb:
>
> > On Jun 26, 3:52 am, "vasindagi" <vish...@gmail.com> wrote:
> >> Hi,
> >> I'm trying to learn more about the Gibbs phenomenon. I found various sites
> >> explaining about what the Gibbs phenomenon is, but none of them explained the
> >> reason behind the phenomenon.
> >> Can anyone please explain the reason for the gibbs phenomenon or send me
> >> links about the same.
> >> Thanks in advance.
>
> > The explanation about the limitation of not using higher frequencies
> > isn't really correct - even though it displays the same type of
> > behaviour. Even if you used an infinite number of frequencies the
> > result does not converge at that point i.e. the discontinuity.
>
> It doesn't converge *pointwise* at the discontinuity, but it does
> converge in l^2 sense, and it also converges pointwise at every point
> an epsilon > 0 away from the discontinuity.
>
> > So what you're seeing is the limitation of sum of sinusoids
> > representation - it does not form a complete basis.
>
> Whether there are complete bases of function spaces is a matter of
> belief (namely, whether you trust the axiom of choice). For all practical
> matters, Schauder bases like the free oscillations (cos/sin or exp(ix))
> are good enough. IOW, you can approximate any function as close as you
> wish with sinusoids in the l^2 sense. They do not form a Schauder basis in
> the l^\infty sense of the piecewise continuous functions, that's true, but a
> different statement.
The Fourier series doesn't converge at the point of the
discontinuity and there is always a finite error, and it cannot be
made infinitely small - at least from what I remember.
I agree with you that in most cases the representation is sufficient.
> > It is like trying
> > to represent 3D space with only 2 vectors.
>
> No, not at all. In 3D space, you will always have a positive error for
> arbitrary vectors. Here, you can make the error infinitely small.
Since the sin/cos basis doesn't form a basis for discontinuous
functions my analogy still stands. To represent the discontinuous
functions you start getting into the theory of distributions.
Cheers,
David
Reply by Thomas Richter●June 27, 2008
Nils schrieb:
> vasindagi schrieb:
>> Hi,
>> I'm trying to learn more about the Gibbs phenomenon. I found various sites
>> explaining about what the Gibbs phenomenon is, but none of them explained the
>> reason behind the phenomenon.
>
> You've got lots of reactions about the Gibbs phenomenon over the last two days.
>
> I'd like to give a different perspective, from the graphical point of
> view. After all, the Gibbs phenomenon is mostly perceived when it comes
> to images, rarely if ever when you talk about sound.
>
>
> The sole reason why it is important in graphics is the fact that our
> eyes and brain are trained to detect edges. Whatever your mind makes up
> for you is one thing. What really drives your vision are the edges of
> what you see.
>
> Mathematically speaking: the brain/eyes do a high-pass filter on what you
> see, and the vision pays attention to those things that have the largest
> deviation from the mean. What you see however is often an illusion made
> up by your mind.
Well, not really, Nils. The Gibbs phenomenon is real in the sense that
you can measure it. The visibility is actually not *at* the edge, where
visual masking will appear - I can show images where this type of
masking makes the phenomenon invisible, which is one of the reasons why
JPEG works so well.
However, the problem with Gibbs appears as soon as the oscillations
caused by it are well away from the edge, in an otherwise flat terrain
where no structure can mask them, and *then* they become very visible.
> The Gibbs phenomenon is simply ringing in the passband. However, it
> creates something like a halo around the objects, and that effect
> triggers the edge-detection part of your brain, which then pays much more
> attention to it than it ought to.
That's a different part of the story - a matter of the contrast
sensitivity function - (or a passband filter, if you want to say so),
which visually amplifies specific frequencies. Gibbs becomes visible,
too, whenever its "implied" frequencies are low enough to fit into this
passband.
> A similar yet somewhat different effect is the banding effect that
> you'll perceive if you quantize an image from, let's say, 8 bit to 4 bit.
> You'll get ridges/steps in smooth gradients and your eye will - no matter
> what you do - focus on them.
Yup, that's quantization noise, a different structure and a different
origin. Here, artificial edges are created, and they are again very
visible because there is no structure around them that could mask the
edges away.
> Both things are well explained from a perceptual point of view. The
> Gibbs phenomenon in images is one of those cases where a lower per-pixel
> error has a higher perceived error than it ought to.
Not really - it depends on *where* the phenomenon occurs. Typically, it
is masked by the edge itself, unless you see the oscillations too far
away from the edge. That's at least my experience (I've looked at too
many images, probably... :-)
So long,
Thomas
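The banding effect discussed above is easy to reproduce numerically. A sketch (the bit depths match the example in the post; everything else is an arbitrary choice): requantizing an 8-bit ramp to 4 bits turns a smooth gradient into 16 flat bands with artificial edges between them - the structure the eye then locks onto.

```python
def requantize(v, bits, in_bits=8):
    # Keep only the top `bits` bits of an `in_bits`-wide sample.
    step = 2 ** (in_bits - bits)
    return (v // step) * step

ramp = list(range(256))                    # smooth 8-bit gradient
banded = [requantize(v, 4) for v in ramp]  # 4-bit version

levels = sorted(set(banded))
print(len(levels))   # 16 flat bands
print(levels[:4])    # [0, 16, 32, 48]
```

Note the difference from Gibbs ringing: here new edges are *created* by quantization, whereas Gibbs oscillations accompany an edge that is already there.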
Reply by Andor●June 27, 2008
On 27 Jun., 04:33, Ron N <ron.nichol...@gmail.com> wrote:
> On Jun 26, 7:02 pm, "Fred Marshall" <fmarshallx@remove_the_x.acm.org>
> wrote:
>
>
> > "Ron N" <ron.nichol...@gmail.com> wrote in message
>
> >news:3de35455-919f-4715-a986-e4ee80428c1c@z32g2000prh.googlegroups.com...
> > On Jun 26, 7:00 am, Andor <andor.bari...@gmail.com> wrote:
>
> > Typical windowing only helps remove the ringing near
> > discontinuous boundary conditions at the sides of the window.
> > You will still get ringing from discontinuities in the middle
> > of the window.
>
> > It might be possible to reduce or eliminate the Gibbs
> > phenomenon by approaching the limit of the sinusoidal series
> > in a different manner. Instead of just adding terms, one can
> > replace the discontinuity with a short continuous function
> > segment (trapezoidal edge, or other higher order polynomial),
> > and approach the limit by simultaneously both increasing the
> > number of terms and decreasing the span of the replaced
> > segment being approximated by those terms in proportion to
> > the number of terms, thus decreasing the L2 error of both
> > approximations together.
>
> > Ron,
>
> > I have never seen a "discontinuous window" unless you want to include the
> > gate function as a window as we always do. But that's a limiting case of
> > really "no window".
>
> I was talking about the "temporal" gating of
> non-periodic-in-aperture-width signals, and
> discontinuities in the middle of the waveform,
> before applying an additional "window function".
>
> There needs to be a more widespread and commonly
> used glossary of the terminology.
Ron
I explicitly said that I was using the term "windowing" as in
"windowed FIR design". The discontinuous periodic function is the
frequency response, and we window the Fourier series expansion of the
frequency response (the impulse response) to smooth the frequency
response.
This is equivalent to convolving the periodic function with a
smoothing kernel, so the ringing is removed everywhere, not only at
the "sides" (whatever that means).
Regards,
Andor
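Andor's point - windowing the impulse response is the same as convolving the frequency response with a smoothing kernel, so the ringing is tamed everywhere - can be checked numerically. A sketch with made-up parameters (cutoff pi/2, 65 taps, Hann window; none of these come from the thread):

```python
import math

M = 32                 # impulse response runs over n = -M..M (65 taps)
wc = math.pi / 2       # ideal lowpass cutoff (arbitrary choice)

def h(n):
    # Ideal lowpass impulse response (sinc), centered at n = 0.
    return wc / math.pi if n == 0 else math.sin(wc * n) / (math.pi * n)

def response(window, w):
    # Real-valued frequency response of the symmetric, windowed truncation.
    return h(0) + 2 * sum(h(n) * window(n) * math.cos(w * n)
                          for n in range(1, M + 1))

rect = lambda n: 1.0                                    # plain truncation
hann = lambda n: 0.5 + 0.5 * math.cos(math.pi * n / M)  # smoothing window

# Worst deviation from 1 over the inner half of the passband.
ws = [i * (0.5 * wc) / 200 for i in range(201)]
rect_ripple = max(abs(response(rect, w) - 1) for w in ws)
hann_ripple = max(abs(response(hann, w) - 1) for w in ws)
print(rect_ripple, hann_ripple)
```

The rectangular (unwindowed) truncation shows Gibbs ripple well inside the passband, far from the window's edges; the Hann-windowed version suppresses it there too, which is Andor's "removed everywhere, not only at the sides".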
Reply by Thomas Richter●June 27, 2008
Dave schrieb:
> On Jun 26, 3:52 am, "vasindagi" <vish...@gmail.com> wrote:
>> Hi,
>> I'm trying to learn more about the Gibbs phenomenon. I found various sites
>> explaining about what the Gibbs phenomenon is, but none of them explained the
>> reason behind the phenomenon.
>> Can anyone please explain the reason for the gibbs phenomenon or send me
>> links about the same.
>> Thanks in advance.
>
>
> The explanation about the limitation of not using higher frequencies
> isn't really correct - even though it displays the same type of
> behaviour. Even if you used an infinite number of frequencies the
> result does not converge at that point i.e. the discontinuity.
It doesn't converge *pointwise* at the discontinuity, but it does
converge in l^2 sense, and it also converges pointwise at every point
an epsilon > 0 away from the discontinuity.
> So what you're seeing is the limitation of sum of sinusoids
> representation - it does not form a complete basis.
Whether there are complete bases of function spaces is a matter of
belief (namely, whether you trust the axiom of choice). For all practical
matters, Schauder bases like the free oscillations (cos/sin or exp(ix))
are good enough. IOW, you can approximate any function as close as you
wish with sinusoids in the l^2 sense. They do not form a Schauder basis in
the l^\infty sense of the piecewise continuous functions, that's true, but a
different statement.
> It is like trying
> to represent 3D space with only 2 vectors.
No, not at all. In 3D space, you will always have a positive error for
arbitrary vectors. Here, you can make the error infinitely small.
So long,
Thomas
Reply by Ron N●June 26, 2008
On Jun 26, 7:02 pm, "Fred Marshall" <fmarshallx@remove_the_x.acm.org>
wrote:
> "Ron N" <ron.nichol...@gmail.com> wrote in message
>
> news:3de35455-919f-4715-a986-e4ee80428c1c@z32g2000prh.googlegroups.com...
> On Jun 26, 7:00 am, Andor <andor.bari...@gmail.com> wrote:
>
> Typical windowing only helps remove the ringing near
> discontinuous boundary conditions at the sides of the window.
> You will still get ringing from discontinuities in the middle
> of the window.
>
> It might be possible to reduce or eliminate the Gibbs
> phenomenon by approaching the limit of the sinusoidal series
> in a different manner. Instead of just adding terms, one can
> replace the discontinuity with a short continuous function
> segment (trapezoidal edge, or other higher order polynomial),
> and approach the limit by simultaneously both increasing the
> number of terms and decreasing the span of the replaced
> segment being approximated by those terms in proportion to
> the number of terms, thus decreasing the L2 error of both
> approximations together.
>
> Ron,
>
> I have never seen a "discontinuous window" unless you want to include the
> gate function as a window as we always do. But that's a limiting case of
> really "no window".
I was talking about the "temporal" gating of
non-periodic-in-aperture-width signals, and
discontinuities in the middle of the waveform,
before applying an additional "window function".
There needs to be a more widespread and commonly
used glossary of the terminology.
rhn
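Ron's shrinking-segment idea can be tried out directly. For the ±1 square wave, replacing the jump with a linear ramp of half-width d multiplies each Fourier coefficient 4/(pi·k) by sinc(k·d); letting d shrink as pi/K with the highest harmonic K is essentially the Lanczos sigma-approximation. A sketch (the specific K and d schedule are illustrative choices, not from the thread):

```python
import math

def sinc(t):
    return 1.0 if t == 0 else math.sin(t) / t

def partial_sum(x, K, d):
    # Odd harmonics up to K; a ramp of half-width d scales the
    # square-wave coefficient 4/(pi*k) by sinc(k*d).
    return (4.0 / math.pi) * sum(
        sinc(k * d) * math.sin(k * x) / k for k in range(1, K + 1, 2)
    )

def peak(K, d, samples=2000):
    # Maximum of the partial sum on (0, pi), where the target value is 1.
    xs = [i * math.pi / samples for i in range(1, samples)]
    return max(partial_sum(x, K, d) for x in xs)

K = 99
print(peak(K, 0.0))        # plain truncation: Gibbs overshoot persists
print(peak(K, math.pi/K))  # shrinking ramp: overshoot largely suppressed
```

Both the ramp's width and the truncation error go to zero together as K grows, which is exactly the simultaneous limit Ron describes.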
Reply by Fred Marshall●June 26, 2008
"Ron N" <ron.nicholson@gmail.com> wrote in message
news:3de35455-919f-4715-a986-e4ee80428c1c@z32g2000prh.googlegroups.com...
On Jun 26, 7:00 am, Andor <andor.bari...@gmail.com> wrote:
Typical windowing only helps remove the ringing near
discontinuous boundary conditions at the sides of the window.
You will still get ringing from discontinuities in the middle
of the window.
It might be possible to reduce or eliminate the Gibbs
phenomenon by approaching the limit of the sinusoidal series
in a different manner. Instead of just adding terms, one can
replace the discontinuity with a short continuous function
segment (trapezoidal edge, or other higher order polynomial),
and approach the limit by simultaneously both increasing the
number of terms and decreasing the span of the replaced
segment being approximated by those terms in proportion to
the number of terms, thus decreasing the L2 error of both
approximations together.
Ron,
I have never seen a "discontinuous window" unless you want to include the
gate function as a window as we always do. But that's a limiting case of
really "no window". I've certainly not seen one with discontinuities in the
middle.
Some supergain theoretical windows have spikes at the edges but they
aren't very practical anyway. Others of this sort have sinusoidal
properties. But these are *way* out there......
Most practical windows go to zero at the edges and maybe have a discontinuous
first derivative. Many have their first derivative zero as well.
Typical windowing is a type of lowpass filter (in frequency) and does reduce
the ringing at temporal discontinuities.
Fred