DSPRelated.com
Forums

Energy

Started by naebad October 8, 2006
Oli Charlesworth wrote:

(snip on energy and digital filters)

> However, the amount of energy required is unrelated to the input signal.
> The number of bits that alter (and hence energy that is consumed) from
> sample point to sample point is essentially uncorrelated with any
> characteristic such as the frequency response, or even amplitude, of the
> input signal.
It mostly comes out when trying to build a system with minimal operating power.
> And if we could somehow implement the filter in a different numerical
> base, such as ternary, the number of ternary digits (trits?) that would
> change would be unrelated to the number that changed in the binary system.
There are arguments based on those assumptions showing that base e is optimal, and that three is the closest integer. As far as I know, there is no currently available electronics to implement such a system. -- glen
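The optimality claim glen alludes to is usually framed as "radix economy": if hardware cost per digit grows with the base b, while the number of digits needed to cover a fixed range shrinks as 1/ln(b), the total cost scales as b/ln(b), which is minimized at b = e. A minimal sketch of that cost model (the cost function here is the standard illustrative assumption, not something from the thread):

```python
import math

def radix_cost(base):
    """Relative cost of a number system under the radix-economy model:
    hardware per digit grows with the base, while the digit count for a
    fixed range shrinks as 1/ln(base), giving cost ~ base / ln(base)."""
    return base / math.log(base)

# The continuous minimum sits at base e ~ 2.718, so among integer
# bases, 3 narrowly beats 2 (and 2 ties exactly with 4) in this model.
for b in (2, 3, 4, 10):
    print(b, round(radix_cost(b), 3))
```

Under this model ternary wins only by a few percent, which is part of why no one builds ternary hardware in practice.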
Rune Allnor wrote:

(I wrote)

>> dU = T dS  The connection between energy and entropy is temperature.
>> At constant temperature they are proportional.
> And the temperature relation between, say, an ASCII text and its
> corresponding Huffman code is...?
The question only becomes important when trying to design minimal-power (or minimal-energy) systems. A binary memory unit needs a bistable system to store one bit. There is some probability of thermal noise causing a bit transition, which increases with temperature or with a lower energy barrier in the bistable system. Current systems operate so far above the thermal noise limit that it is hard to see the connection. It will take fewer bits to store the Huffman-coded data, but the result will be more sensitive to bit changes. Those bit changes increase with temperature. -- glen
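The temperature sensitivity glen describes can be put in rough numbers with a Boltzmann-factor model: a bistable cell with energy barrier E_b flips spontaneously with probability on the order of exp(-E_b / kT) per attempt. A sketch under that assumption (the barrier values below are illustrative, not measurements of any real device):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def flip_probability(barrier_joules, temp_kelvin):
    """Boltzmann-factor estimate of the chance that thermal noise
    carries a bistable storage cell over its energy barrier."""
    return math.exp(-barrier_joules / (K_B * temp_kelvin))

# Raising the temperature, or lowering the barrier, makes stored bits
# (e.g. a densely packed Huffman code) less reliable.
for n_kt in (10, 30, 60):          # barrier expressed in units of kT
    barrier = n_kt * K_B * 300.0   # at room temperature, 300 K
    print(n_kt, flip_probability(barrier, 300.0))
```

A barrier of tens of kT makes spontaneous flips astronomically rare, which is why current systems sit far above the thermal limit.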
glen herrmannsfeldt wrote:
> Rune Allnor wrote:
> > (I wrote)
> >> dU = T dS  The connection between energy and entropy is temperature.
> >> At constant temperature they are proportional.
> > And the temperature relation between, say, an ASCII text and its
> > corresponding Huffman code is...?
> The question only becomes important when trying to design minimal power
> (or energy) systems.
And these are *physical* systems. The Huffman code "exists" in a purely mathematical setting. My point was that there is a difference between physics and mathematics. "Energy" and "temperature" only make sense in the realm of physics. "Entropy" makes sense as a purely mathematical concept as well. Rune
glen herrmannsfeldt wrote:
> Oli Charlesworth wrote:
> > (snip on energy and digital filters)
> > However, the amount of energy required is unrelated to the input signal.
> > The number of bits that alter (and hence energy that is consumed) from
> > sample point to sample point is essentially uncorrelated with any
> > characteristic such as the frequency response, or even amplitude, of the
> > input signal.
> It mostly comes out when trying to build a system with minimal
> operating power.
How do you mean?
> > And if we could somehow implement the filter in a different numerical
> > base, such as ternary, the number of ternary digits (trits?) that would
> > change would be unrelated to the number that changed in the binary system.
> There are stories based on those assumptions that show that base e is
> optimal, and that three is the closest. As far as I know, there is no
> currently available electronics to implement such a system.
Optimal in what sense? -- Oli
Oli Filth wrote:

> glen herrmannsfeldt wrote:
(snip on energy and entropy)
>> It mostly comes out when trying to build a system with minimal
>> operating power.
> How do you mean?
Someone could ask about building a computer that used zero power -- a reversible system, in thermodynamic terms. Some operations can be done reversibly, some can't. Assuming you want to do some that can't, such as store data in memory, what is the smallest amount of power that can be used to run such a computer? (There are people in IEEE who want to build a computer running on less than one microwatt.)
>> There are stories based on those assumptions that show that base e is
>> optimal, and that three is the closest. As far as I know, there is no
>> currently available electronics to implement such a system.
> Optimal in what sense?
I don't remember anymore, and I wasn't all that convinced. Here is what I found on Google. http://www.americanscientist.org/template/AssetDetail/assetid/14405?&print=yes -- glen
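The thermodynamic floor for the irreversible operations glen mentions is Landauer's principle: erasing one bit must dissipate at least kT ln 2 of energy. A quick sanity check of how far the one-microwatt budget quoted above sits from that limit (the only inputs are standard physical constants and the figures already in the thread):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def landauer_limit(temp_kelvin):
    """Minimum energy dissipated per irreversibly erased bit: kT ln 2."""
    return K_B * temp_kelvin * math.log(2)

e_bit = landauer_limit(300.0)  # ~2.87e-21 J per bit at room temperature
print(e_bit)

# Bit erasures per second a 1 microwatt budget could cover if the
# hardware somehow operated right at the Landauer limit.
print(1e-6 / e_bit)
```

Real hardware dissipates many orders of magnitude more than this per bit, which is why the limit rarely matters in practice but dominates any discussion of truly minimal-energy computing.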