DSPRelated.com
Forums

information theory?

Started by RichD November 1, 2011
Are there any boards, or private blogs, dedicated
to discussion of information theory?

Rich
On Mon, 31 Oct 2011 20:55:57 -0700 (PDT), RichD
<r_delaney2001@yahoo.com> wrote:

>Are there any boards, or private blogs, dedicated
>to discussion of information theory?
>
>Rich
Depending on the question comp.dsp is not a bad place.  There are a
fair number of comm people there.

Eric Jacobsen
Anchor Hill Communications
www.anchorhill.com
Do you have a specific question? Or just want to lurk and learn?

Information Theory was the hardest class I took in grad school. So
abstract, and so much probability.... but it's a cool topic
nonetheless.

On Nov 1, 12:57 am, eric.jacob...@ieee.org (Eric Jacobsen) wrote:
> On Mon, 31 Oct 2011 20:55:57 -0700 (PDT), RichD
> <r_delaney2...@yahoo.com> wrote:
>
> >Are there any boards, or private blogs, dedicated
> >to discussion of information theory?
> >
> >Rich
>
> Depending on the question comp.dsp is not a bad place.  There are a
> fair number of comm people there.
>
> Eric Jacobsen
> Anchor Hill Communications
> www.anchorhill.com
On Nov 1, Chris <chris.sant...@gmail.com> wrote:
> Do you have a specific question? Or just want to lurk and learn?
I'd like to lurk someplace, where there are discussions of current topics. It's such a rich field, there are always new ideas.
> Information Theory was the hardest class I took in grad school. So
> abstract, and so much probability.... but it's a cool topic
> nonetheless.
I read Shannon's paper long ago, and have found it fascinating
ever since.  In the most general sense, it's applicable to so many
areas.  For instance, people have recast thermodynamics as
information processing.  Or, you could model the entire universe as
a computer, and information theory applies.  It even describes human
nature - manipulating symbols is what separates us from the apes.
> > >Are there any boards, or private blogs, dedicated
> > >to discussion of information theory?

--
Rich
On Nov 1, 1:29 pm, RichD <r_delaney2...@yahoo.com> wrote:
> On Nov 1, Chris <chris.sant...@gmail.com> wrote:
>
> > Do you have a specific question? Or just want to lurk and learn?
>
> I'd like to lurk someplace, where there are discussions
> of current topics.  It's such a rich field, there are always
> new ideas.
>
> > Information Theory was the hardest class I took in grad school. So
> > abstract, and so much probability.... but it's a cool topic
> > nonetheless.
>
> I read Shannon's paper long ago, and have found it fascinating
> ever since.  In the most general sense, it's applicable
> to so many areas.  For instance, people have recast
> thermodynamics as information processing.  Or, you
> could model the entire universe as a computer, and
> information theory applies.
   Well, you can.  But it's still going to take the people
   working in thermodynamics another several hundred years to
   discover that when you base all your science on Classical
   Geometry, you will always come to that conclusion, regardless
   of the technology.

   Which still doesn't imply that the people who understand
   number theory aren't going to be working on post-pyramid
   compilers, rather than more sandstone libraries for them.

   The people who work in engineering will be working on lasers,
   holographics, nanotechnology, and self-replicating machines.
   Regardless of what the jerks do with fluorescent lightbulbs.

   And the people working in signal processing will be working on
   21st century helicopters, regardless of what any of the clowns
   do with radar.

> It even describes human
> nature - manipulating symbols is what separates us
> from the apes.
>
> > > >Are there any boards, or private blogs, dedicated
> > > >to discussion of information theory?
>
> --
> Rich
On 11/1/11 1:29 PM, RichD wrote:
> On Nov 1, Chris<chris.sant...@gmail.com> wrote:
>> Do you have a specific question? Or just want to lurk and learn?
>
> I'd like to lurk someplace, where there are discussions
> of current topics. It's such a rich field, there are always
> new ideas.
>
>> Information Theory was the hardest class I took in grad school. So
>> abstract, and so much probability.... but it's a cool topic
>> nonetheless.
>
> I read Shannon's paper long ago, and have found it fascinating
> ever since. In the most general sense, it's applicable
> to so many areas. For instance, people have recast
> thermodynamics as information processing.
even though i have a *feel* for the connection to thermodynamic
entropy, i really do not understand what the mathematical connection
is between

   SUM{ p{M_i} * I{M_i} }
    i

where  M_i    = the i-th message
       p{M_i} = the probability of the i-th message
       I{M_i} = the information content of the i-th message,
                and is equal to  -log( p{M_i} )

(the base of the log only changes the units that information is
measured with.  log2() means information measured in "bits".)

now how is that above summation related to thermodynamic entropy,
which is

   integral{ 1/T dQ }

i wouldn't mind seeing the connection here.  perhaps glen or Clay
(who are the other physikers hanging here?) might have an answer.
> Or, you
> could model the entire universe as a computer, and
> information theory applies. It even describes human
> nature - manipulating symbols is what separates us
> from the apes.
oh, i dunno.  that sounds like the kinda stuff we talk about as we
sit around picking our noses.  apes don't manipulate signals?

--
r b-j                  rbj@audioimagination.com

"Imagination is more important than knowledge."
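[For what it's worth, the usual bridge between the two formulas is the
Gibbs statistical form of entropy, S = -k_B * SUM p_i ln(p_i), which
is Shannon's sum taken with the natural log and scaled by Boltzmann's
constant; the Clausius integral of dQ/T recovers the same quantity for
equilibrium states.  A small Python sketch of the numbers, offered as
an illustration rather than anything from the thread:]

```python
import math

def shannon_entropy(probs, base=2.0):
    """Shannon entropy: -SUM p_i * log(p_i), zero-probability terms skipped.
    The log base only sets the units; base 2 gives bits."""
    return -sum(p * math.log(p, base) for p in probs if p > 0.0)

# Gibbs entropy is the same sum with the natural log, scaled by
# Boltzmann's constant k_B (in joules per kelvin):
K_B = 1.380649e-23

def gibbs_entropy(probs):
    return K_B * shannon_entropy(probs, base=math.e)

# two equally likely messages carry exactly one bit:
print(shannon_entropy([0.5, 0.5]))   # 1.0

# the same two-state distribution in thermodynamic units, k_B * ln(2):
print(gibbs_entropy([0.5, 0.5]))
```

[The point of the sketch: the two entropies differ only by the choice
of log base and a physical constant, so maximizing one maximizes the
other over the same distribution.]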
On Nov 1, 2:13 pm, jim <retnuh2...@gmail.com> wrote:
> On Nov 1, 1:29 pm, RichD <r_delaney2...@yahoo.com> wrote:
>
> > On Nov 1, Chris <chris.sant...@gmail.com> wrote:
> >
> > > Do you have a specific question? Or just want to lurk and learn?
> >
> > I'd like to lurk someplace, where there are discussions
> > of current topics.  It's such a rich field, there are always
> > new ideas.
> >
> > > Information Theory was the hardest class I took in grad school. So
> > > abstract, and so much probability.... but it's a cool topic
> > > nonetheless.
> >
> > I read Shannon's paper long ago, and have found it fascinating
> > ever since.  In the most general sense, it's applicable
> > to so many areas.  For instance, people have recast
> > thermodynamics as information processing.  Or, you
> > could model the entire universe as a computer, and
> > information theory applies.
>
>    Well, you can.  But it's still going to take the people
>    working in thermodynamics another several hundred
>    years to discover that when you base all your science
>    on Classical Geometry, you will always come
>    to that conclusion, regardless of the technology.
Which is mostly a way of telling the loons: if you could get to the
moon on classical geometry, they wouldn't be paying real engineers
several million dollars a year, and paying the people working in
thermodynamics minimum wage for waxing Edsels for Dow Chemical.
>    Which still doesn't imply that the people who
>    understand number theory aren't going to be
>    working on post-pyramid compilers, rather than
>    more sandstone libraries for them.
>
>    The people who work in engineering will be
>    working on lasers, holographics, nanotechnology,
>    and self-replicating machines.
>    Regardless of what the jerks do with fluorescent lightbulbs.
>
>    And the people working in signal processing will be working
>    on 21st century helicopters, regardless of what any of the
>    clowns do with radar.
>
> > It even describes human
> > nature - manipulating symbols is what separates us
> > from the apes.
> >
> > > > >Are there any boards, or private blogs, dedicated
> > > > >to discussion of information theory?
> >
> > --
> > Rich
On 11/1/2011 4:04 PM, robert bristow-johnson wrote:

   ...

> oh, i dunno. that sounds like the kinda stuff we talk about as we sit
> around picking our noses. apes don't manipulate signals?
Even my dog manipulates symbols, sometimes in very creative ways.

There was an orangutan who regularly unlocked his cage at night,
letting himself and his family into the savanna-like yard outside it.
He used a piece of wire as a lock pick, and rebent it to fit behind
his lip when he didn't need to use it.  It took the keepers a long
time to figure it all out.

Can we in good conscience lock up creatures like that?

Jerry
--
Engineering is the art of making what you want from things you can get.
robert bristow-johnson <rbj@audioimagination.com> wrote:
(snip)

> even though i have a *feel* for the connection to thermodynamic
> entropy, i really do not understand what the mathematical connection
> is between
>
>    SUM{ p{M_i} * I{M_i} }
>     i
>
> where  M_i    = the i-th message
>        p{M_i} = the probability of the i-th message
>        I{M_i} = the information content of the i-th message,
>                 and is equal to  -log( p{M_i} )
>
> (the base of the log only changes the units that information is
> measured with.  log2() means information measured in "bits".)
>
> now how is that above summation related to thermodynamic entropy,
> which is
>
>    integral{ 1/T dQ }
>
> i wouldn't mind seeing the connection here.  perhaps glen or
> Clay (who are the other physikers hanging here?) might have
> an answer.
I am not sure I can explain it so well, either.

Thermodynamic entropy is related to the number of states a system
can be in, which I believe can be traced back to the above sum.

Take a container with a partition down the middle, put one gas on
one side and a different one on the other side, and then consider:
if you select a molecule at random, what is the probability of it
being gas one or gas two?  On each side, there is a 100% chance of
it being the appropriate type.  Now remove the partition and let
them start mixing.  The entropy increases as they mix, and the
probability eventually approaches 50% each.

In the case of individual molecules, it is a sum.  With enough of
them, you approximate it as an integral.

-- glen
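[glen's mixing picture can be put in Shannon's terms with a small
numeric sketch (my own illustration, not from the thread): the
per-molecule uncertainty about which species you draw goes from zero
bits before mixing to one bit after equal amounts fully mix.]

```python
import math

def entropy_bits(probs):
    """Shannon entropy in bits: -SUM p_i * log2(p_i), skipping zero terms."""
    return -sum(p * math.log2(p) for p in probs if p > 0.0)

# Before the partition is removed: a molecule drawn from either side
# is certainly the species on that side, so no species uncertainty.
before = entropy_bits([1.0])

# After equal amounts fully mix: a randomly drawn molecule is gas one
# or gas two with probability 1/2 each, one bit of uncertainty per draw.
after = entropy_bits([0.5, 0.5])

print(before, after)
```

[Multiplying the per-molecule figure by N molecules and by k_B * ln(2)
gives the familiar thermodynamic entropy of mixing, N * k_B * ln(2),
for this equal-amounts case.]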
On 11/1/2011 12:29 PM, RichD wrote:

> I read Shannon's paper long ago, and have found it fascinating
> ever since. In the most general sense, it's applicable
> to so many areas. For instance, people have recast
> thermodynamics as information processing. Or, you
> could model the entire universe as a computer, and
> information theory applies. It even describes human
> nature - manipulating symbols is what separates us
> from the apes.
Actually, Shannon's "paper" is a book.

And the unfortunate part is that physicists in particular have, for
the most part, developed little understanding of Shannon's
association of information with entropy.  This one fact alone blows
a huge hole in classical physics.  Yet (nearly) all physicists go
blindly on, barfing the sad dogma of how entropy always increases,
how the universe is "running down", etc. etc.

They know even less about the relationships between signal
processing and electromagnetic theory.  They can't understand the
transform relationships between the amplitude of a wave in free
space at one point and its far-field angular expansion.  They are
snowed by self-inductance because they haven't bothered to
understand the nature of feedback systems.  In short, they are stuck
in ignorance and the past, and have NO desire to move out of it.

But the unfortunate fact is that information theory, especially
combined with things like classical thermodynamics, holds the key to
physics in the 21st century!  And what exactly IS physics going to
be in the 21st century?  I can tell you.  It will be the physics of
the things that the physics of today has so LOUDLY avoided.  Namely,
the physics of life!  The grass under my brick sidewalk is ALWAYS
striving to take the chaos of the dirt down there and turn it into
ORDER!  It insists on doing this every year in spite of my best
efforts to stop it!

But still "traditional" science pretends to move "forward" while
ignoring the fact that Shannon shot all their dogma to hell back in
the MIDDLE of the 20th century.