
Energy

Started by naebad October 8, 2006
Mark wrote:
>> Even theoretically ideal Ls and Cs produce useful filters. The input
>> goes reactive in the stop bands, so no power gets in. There's nothing
>> that needs to be dissipated.
>
> The incident power in the stop band gets REFLECTED back to the source
> where it is __usually__ dissipated in the source resistance of the
> source.
However that may be, the filter doesn't dissipate it. In many cases, its
generation is simply suppressed.

Jerry
--
"The rights of the best of men are secured only as the rights of the vilest
and most abhorrent are protected." - Chief Justice Charles Evans Hughes, 1927
Eric Jacobsen wrote:
On 8 Oct 2006 16:29:26 -0700, "naebad" <minnaebad@yahoo.co.uk> wrote:

> Where does the energy (or power) get dissipated in a digital filter?
> For example, consider a simple analogue R-C low-pass filter driven by
> band-limited white noise. The output power spectrum is given by
>
> Pout = |W(jw)|^2 * input noise power. Power is dissipated in the
> resistor. So for the digital equivalent where does the power go?
>
> Naebad

That paradox used to bug me, too. My example was two black boxes, one with
an analog filter, and the other with a high-impedance DAC, equivalent
digital filter, and ADC. I think in the real world it's hard to really do
this experiment effectively, but conceptually it took me a long time to
bend my brain around it.
The digital black box won't work without including some sort of battery or
attached power supply. The incoming energy is reflected or absorbed by the
input impedance while information is being stored about that incoming
energy. Then, energy from the battery or power supply can be routed into
the output impedance, modulated somewhat by how the stored information bits
were processed or otherwise munged. IMHO. YMMV.

--
rhn A.T nicholson d.0.t C-o-M
naebad wrote:

> Where does the energy (or power) get dissipated in a digital filter?
> For example, consider a simple analogue R-C low-pass filter driven by
> band-limited white noise. The output power spectrum is given by
>
> Pout = |W(jw)|^2 * input noise power. Power is dissipated in the
> resistor. So for the digital equivalent where does the power go?
In terms of digital signals and signal processing, it is losing information
that requires energy. A digital storage device, such as a flip-flop, must
dissipate a certain amount of energy to lose its previous state before it
can store new data.

Consider a mechanical information storage device, such as a cup which may
or may not contain a ball. To capture the ball in the cup, the incoming
energy must be absorbed. It is possible at first to store that energy, but
if you want to forget the previous state, you will find that the stored
energy must be dissipated.

At a more fundamental level, it is necessary to dissipate power to stay
above the thermal noise in the system. The favorite physics explanation for
this is related to Maxwell's demon. There is an example of this in terms of
information and energy in one of the Feynman lectures.

-- glen
My two cents on this problem, if I am mistaken, please correct me:

* In the case of a fully ideal analog LC filter, there is no dissipation
of energy; no energy passes because, at the resonance point (1/sqrt(LC)),
all the energy is bouncing between the L and the C. I repeat, this is the
fully ideal case. This is what a previous post referred to as 'going fully
reactive'.

* Now, for a digital filter, considered purely theoretically, as already
pointed out, there is no dissipation of energy. What happens is that you
get a sequence of numbers in, and by mathematical operations another
sequence of numbers comes out. The output sequence is shaped by the
operations of the digital filter, which, again by the mathematical
properties of the filter, change the frequency content (a rough sketch of
this follows below).

Juan Pablo Narino
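As a rough Python sketch of that second point, a digital "RC-style"
low-pass filter is nothing but arithmetic on a list of numbers; the
coefficient and the toy input below are arbitrary choices, purely for
illustration.

def one_pole_lowpass(x, a=0.1):
    """y[n] = a*x[n] + (1 - a)*y[n-1], a first-order low-pass recursion."""
    y = []
    prev = 0.0
    for sample in x:
        prev = a * sample + (1.0 - a) * prev  # multiply-accumulate, nothing more
        y.append(prev)
    return y

# "Filtering" here just maps one list of numbers to another; as far as the
# maths is concerned, no physical energy enters or leaves the computation.
signal = [1.0, -1.0, 1.0, -1.0, 1.0, -1.0]
print(one_pole_lowpass(signal))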

jnarino skrev:
> My two cents on this problem, if I am mistaken, please correct me:
>
> * In the case of a fully ideal analog LC filter, there is no
> dissipation of energy; no energy passes because, at the resonance point
> (1/sqrt(LC)), all the energy is bouncing between the L and the C. I
> repeat, this is the fully ideal case. This is what a previous post
> referred to as 'going fully reactive'.
>
> * Now, for a digital filter, considered purely theoretically, as
> already pointed out, there is no dissipation of energy. What happens is
> that you get a sequence of numbers in, and by mathematical operations
> another sequence of numbers comes out. The output sequence is shaped by
> the operations of the digital filter, which, again by the mathematical
> properties of the filter, change the frequency content.
>
> Juan Pablo Narino
This is exactly the distinction I have tried to emphasize in all my writing
on analog vs digital filters, analog vs digital systems:

Analog filters exist in the *physical* world and have to comply with
*physical* laws. Digital filters are *mathematical* operations on
*mathematical* "objects", and need to comply with *mathematical* laws.

The distinction is that the laws of physics are a constrained subset of the
laws of maths. While the laws of physics have to comply with the laws of
maths, the converse is not true. This is why digital filters can do stuff
that is not possible with analog filters.

More to the point of this thread, the concepts of "energy" and "power" make
no sense in the mathematical world. As far as the maths is concerned, both
are "norms" that happen -- coincidentally! -- to mirror certain aspects of
the physical world.

I agree completely with Juan.

Rune
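A small sketch of that last point: as far as the maths is concerned,
"energy" and "power" are just norms computed from a sequence of numbers
(the example sequence here is arbitrary).

import numpy as np

x = np.array([0.5, -1.0, 2.0, 0.25, -0.75])

energy = np.sum(np.abs(x) ** 2)  # "energy": squared l2-norm of the sequence
power = energy / len(x)          # "power": mean-square value per sample

print(energy, power)             # plain numbers -- no joules or watts involved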
jnarino wrote:
> My two cents on this problem, if I am mistaken, please correct me:
>
> * In the case of a fully ideal analog LC filter, there is no
> dissipation of energy; no energy passes because, at the resonance point
> (1/sqrt(LC)), all the energy is bouncing between the L and the C. I
> repeat, this is the fully ideal case. This is what a previous post
> referred to as 'going fully reactive'.
>
> * Now, for a digital filter, considered purely theoretically, as
> already pointed out, there is no dissipation of energy. What happens is
> that you get a sequence of numbers in, and by mathematical operations
> another sequence of numbers comes out. The output sequence is shaped by
> the operations of the digital filter, which, again by the mathematical
> properties of the filter, change the frequency content.
What kind of LC filter do you have in mind? If series L and C, it has zero
impedance at resonance; if parallel, it is an open circuit there. Those are
the only types I can think of for which w^2*L*C = 1 defines the critical
frequency directly.

The kind of lossless filter I had in mind when I wrote my earlier posts
included the classical cascaded T or pi sections, with or without m-derived
(or double m-derived) end sections. For the record, the low-pass versions
are lumped-constant approximations of transmission lines. The stop band
begins at frequencies above which the approximation breaks down. As with
other filters, the canonic low- to highpass and low- to bandpass
transformations create those other classes.

Jerry
--
"The rights of the best of men are secured only as the rights of the vilest
and most abhorrent are protected." - Chief Justice Charles Evans Hughes, 1927
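A quick numeric check of the series/parallel behaviour described above; the
component values are arbitrary, and only the trend as w approaches
w0 = 1/sqrt(LC) matters.

import numpy as np

L, C = 1e-3, 1e-6                       # 1 mH, 1 uF (arbitrary values)
w0 = 1.0 / np.sqrt(L * C)               # resonant frequency, rad/s

# Evaluate near (not exactly at) resonance to avoid dividing by zero;
# |Z_series| heads toward 0 and |Z_parallel| toward infinity as w -> w0.
for factor in (0.5, 0.99, 1.01, 2.0):
    w = factor * w0
    zL = 1j * w * L                     # inductor impedance
    zC = 1.0 / (1j * w * C)             # capacitor impedance
    z_series = zL + zC
    z_parallel = zL * zC / (zL + zC)
    print(f"w/w0 = {factor:4.2f}: |Z_series| = {abs(z_series):9.3e}, "
          f"|Z_parallel| = {abs(z_parallel):9.3e}")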
Rune Allnor wrote:

(snip)

> While the laws of physics have to comply with the laws of maths,
> the converse is not true. This is why digital filters can do stuff
> that is not possible with analog filters.
I agree so far.
> More to the point of this thread, the concepts of "energy" and "power"
> make no sense in the mathematical world.
Entropy makes sense in the mathematical world, and entropy can be connected
to energy.

The first time I saw this discussed had to do with the energy of
computation. Can you design a binary adder that, theoretically, uses no
energy? If you represent bits through moving balls on ramps, in theory it
is possible to design an adder using frictionless balls. A binary
non-saturating adder does not lose any information.

It is the process of forgetting that takes energy. In thermodynamic terms,
it requires a non-reversible system, and that goes along with an increase
in entropy and a decrease in free energy.
> As far as the maths is concerned, both are "norms" that happen --
> coincidentally! -- to mirror certain aspects of the physical world.
> I agree completely with Juan.
It is interesting that physical systems follow relatively simple
mathematical equations.

-- glen
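The usual number attached to glen's point that forgetting takes energy is
Landauer's limit: erasing one bit must dissipate at least k*T*ln(2). A
quick calculation, assuming room temperature (300 K is an arbitrary but
typical choice):

import math

k_B = 1.380649e-23                     # Boltzmann constant, J/K
T = 300.0                              # assumed temperature, K

e_per_bit = k_B * T * math.log(2.0)    # minimum energy to erase one bit, J
print(f"Landauer limit at {T:.0f} K: {e_per_bit:.3e} J per erased bit")
# roughly 2.9e-21 J -- far below what practical logic dissipates per bit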
glen herrmannsfeldt skrev:
> Rune Allnor wrote:
>
> (snip)
>
>> While the laws of physics have to comply with the laws of maths,
>> the converse is not true. This is why digital filters can do stuff
>> that is not possible with analog filters.
>
> I agree so far.
>
>> More to the point of this thread, the concepts of "energy" and "power"
>> make no sense in the mathematical world.
>
> Entropy makes sense in the mathematical world,
I thought about using entropy as a "digital world" analog to the physical
energy. I did not post the claim, because the arguments were too easy to
counter. The short version of the discussion goes as follows:

Claim: Filtering in the digital domain reduces entropy, where filtering in
the physical world reduces energy or power.

Supporting argument: Consider a white noise sequence that has a constant
PSD over the bandwidth of the system. This has maximum entropy. Filter
this sequence, and the entropy is reduced. Hence, an analogy between power
and energy in the analog domain, and entropy in the digital domain, has
been established.

Counter argument: Another filter will bring back the entropy. The
demonstration is the whitening filter used in speech encoders. No (passive)
filter can bring back energy or power in the analog domain. Hence, the
analogy between entropy in the digital domain, and power and energy in the
analog domain, does not hold up to scrutiny.
> and entropy can be connected to energy.
Only because of practical matters due to us humans and our computers
existing in a physical world, and our interest in maths. I don't see why a
physical world is required for the existence of maths. But now we are
entering philosophical grounds I believe best left untouched.
> The first time I saw this discussed had to do with the energy of
> computation. Can you design a binary adder that, theoretically, uses no
> energy?
No. But does that mean that binary operations, as mathematical *concepts*, require energy to exist?
> If you represent bits through moving balls on ramps, in theory it is
> possible to design an adder using frictionless balls. A binary
> non-saturating adder does not lose any information.
>
> It is the process of forgetting that takes energy. In thermodynamic
> terms, it requires a non-reversible system, and that goes along with an
> increase in entropy and decrease in free energy.
This has to do with physics and physical *realizations* of the maths and
computations. I don't see these as necessary for the *existence* of maths.
But again, such questions are best left unexplored.

Rune
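A rough Python sketch of Rune's counter-argument above: a digital filter
can be undone exactly by another digital filter, restoring the original
(maximum-entropy) white sequence, whereas no passive analog filter can give
energy back. The coefficient 0.9 and the sequence length are arbitrary
choices for illustration.

import numpy as np
from scipy.signal import lfilter

rng = np.random.default_rng(0)
x = rng.standard_normal(100_000)        # white noise, flat PSD

y = lfilter([1.0, -0.9], [1.0], x)      # colouring filter H(z) = 1 - 0.9/z
x_hat = lfilter([1.0], [1.0, -0.9], y)  # whitening filter 1/H(z) undoes it

print(np.var(x), np.var(y), np.var(x_hat))  # variance changes, then comes back
print(np.max(np.abs(x - x_hat)))            # ~1e-15: the sequence is recovered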
On Thu, 12 Oct 2006 00:32:53 -0700, Rune Allnor wrote:
> Another filter will bring back the entropy. The demonstration is the
> whitening filter used in speech encoders.
These don't actually produce information where there is none, though. The result is only "white" within the support of the original signal. "Whitening" a pure tone leaves you with a pure tone.
> No (passive) filter can bring back energy or power in the analog domain.
> Hence, the analogy between entropy in the digital domain, and power and
> energy in the analog domain, does not hold up to scrutiny.
No, it doesn't, but the relationship is closer than you're giving it credit
for.
>> and entropy can be connected to energy.
>
> Only because of practical matters due to us humans and our computers
> existing in a physical world, and our interest in maths.
Entropy exists in the physical world, independent of maths; it is governed
by the laws of thermodynamics (which are, admittedly, usually written down
in mathematical terms).
> I don't see why a physical world is required for the existence of maths.
> But now we are entering philosophical grounds I believe best left
> untouched.
Well, I've always thought (at least since I discovered it in one of my
undergrad classes) that it is most profound that the entropy which prevents
perpetual motion machines and governs refrigerators is the same entropy
measured by compression algorithms and encoding schemes, and that it can be
measured in bits.

-- Andrew
Rune Allnor wrote:
> glen herrmannsfeldt skrev:
>> Rune Allnor wrote:
>>
>> (snip)
>>
>>> While the laws of physics have to comply with the laws of maths,
>>> the converse is not true. This is why digital filters can do stuff
>>> that is not possible with analog filters.
>>
>> I agree so far.
>>
>>> More to the point of this thread, the concepts of "energy" and "power"
>>> make no sense in the mathematical world.
>>
>> Entropy makes sense in the mathematical world,
>
> I thought about using entropy as a "digital world" analog to the physical
> energy. I did not post the claim, because the arguments were too easy to
> counter. The short version of the discussion goes as follows:
>
> Claim: Filtering in the digital domain reduces entropy, where filtering
> in the physical world reduces energy or power.
>
> Supporting argument: Consider a white noise sequence that has a constant
> PSD over the bandwidth of the system. This has maximum entropy. Filter
> this sequence, and the entropy is reduced. Hence, an analogy between
> power and energy in the analog domain, and entropy in the digital domain,
> has been established.
>
> Counter argument: Another filter will bring back the entropy. The
> demonstration is the whitening filter used in speech encoders. No
> (passive) filter can bring back energy or power in the analog domain.
> Hence, the analogy between entropy in the digital domain, and power and
> energy in the analog domain, does not hold up to scrutiny.
A passive physical filter can re-whiten the signal just as well as any
other. If it removes energy in those places where the first filter didn't,
you are back to a white signal. It's smaller in amplitude now, but just as
chaotic. Surely a mere scaling factor doesn't mean we have not brought back
the entropy? In the digital filter a mere scaling factor determines the
actual size of the result in a similar way.

Steve
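A sketch of this point in Python: apply a second, purely attenuating filter
(gain no greater than 1 anywhere, as an idealization of a passive network)
whose response is proportional to the inverse of the first; the spectrum
comes out flat again, just smaller. The filter shapes and lengths below are
arbitrary illustrations.

import numpy as np

rng = np.random.default_rng(1)
n = 1 << 16
x = rng.standard_normal(n)                  # white noise

f = np.fft.rfftfreq(n)                      # normalised frequency, 0..0.5
H1 = 1.0 / np.sqrt(1.0 + (f / 0.1) ** 2)    # first filter: RC-like low-pass
H2 = H1.min() / H1                          # second filter: gain <= 1 everywhere

X = np.fft.rfft(x)
y = np.fft.irfft(H1 * X, n)                 # coloured, reduced high-band power
z = np.fft.irfft(H2 * H1 * X, n)            # flat again, smaller overall

def band_power(sig, lo, hi):
    """Mean squared spectral magnitude of sig between normalised freqs lo and hi."""
    S = np.abs(np.fft.rfft(sig)) ** 2
    fr = np.fft.rfftfreq(len(sig))
    return S[(fr >= lo) & (fr < hi)].mean()

for name, sig in (("x", x), ("y", y), ("z", z)):
    print(name, band_power(sig, 0.0, 0.05), band_power(sig, 0.4, 0.45))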