>On Mon, 27 Jul 2015 02:16:31 +0000, Steve Pope wrote:
>
>> I'm not buying this.
>That's because I mis-stated. log2(8/3) is remarkably close to sqrt(2).
>See my correction to my original pointlessPost(TM).
You're absolutely right.
Maybe this can be massaged into a numerological explanation of
the fine structure constant.
Steve
Reply by Tim Wescott●July 27, 2015
On Sun, 26 Jul 2015 18:25:41 -0500, Tim Wescott wrote:
> On Sun, 26 Jul 2015 22:48:03 +0000, glen herrmannsfeldt wrote:
>
>> Piergiorgio Sartor
>> <piergiorgio.sartor.this.should.not.be.used@nexgo.removethis.de> wrote:
>>> On 2015-07-26 09:12, joshipura@gmail.com wrote:
>>
>> (snip)
>>>> I am coming across a situation in which the entropy of an
>>>> (unequiprobable) event is coming out to be GREATER than had it been
>>>> equiprobable. I am clearly wrong somewhere. Where?
>>
>> (snip)
>>
>>> Maybe I got the math completely wrong, but:
>>
>>> -(log2(6/16)*(6/16)+2*log2(4/16)*(4/16)+2*log2(1/16)*(1/16)) = 2.0306
>>
>>> So, it seems lower than -log2(1/5), while still greater than 2.
>>
>> The equiprobable one is 2.322, and 2.0306 is lower, as you said it
>> should be.
>>
>> For those who have calculators without a log2 button:
>>
>> -(6*ln(6)+8*ln(4)-16*ln(16))/16/ln(2) = 2.0306, as above.
>>
>> It took me a few tries to get that, but it is just as it should be.
>>
>> -- glen
>
> Aw, c'mon. You can calculate log2(1/4) and log2(1/16) in your head:
>
> 2 * 1/16 * log2(16) + 2 * 1/4 * log2(4) + 3/8 * log2(8/3) =
>
> 1/2 + 1 + 3/8 * ln(8/3)/ln(2) = the same number the rest of us are
> getting.
>
> (I'm sure it's totally by chance, but ln(8/3) is remarkably close to
> sqrt (2).)
Oops -- log2(8/3) is remarkably close to sqrt(2) -- like, within 600 PPM.
(8/3)^sqrt(2) = 4.0032, to four decimal places.
So, nothing at all profound, just a nice little coincidence to trip over.
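For anyone who wants to check the coincidence numerically, here is a quick
Python sketch (standard math module only; the numbers themselves are the
only thing taken from the post):

    import math

    x = math.log2(8/3)            # 1.4150374992...
    r = math.sqrt(2)              # 1.4142135623...
    print((x - r) / r * 1e6)      # ~583 PPM relative difference
    # If log2(8/3) were exactly sqrt(2), then 8/3 = 2**sqrt(2) and
    # (8/3)**sqrt(2) would be exactly 4; instead it just misses:
    print((8/3) ** r)             # 4.0032...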
--
Tim Wescott
Wescott Design Services
http://www.wescottdesign.com
Reply by Tim Wescott●July 27, 2015
On Mon, 27 Jul 2015 02:16:31 +0000, Steve Pope wrote:
> Tim Wescott <seemywebsite@myfooter.really> wrote:
>
>>(I'm sure it's totally by chance, but ln(8/3) is remarkably close to
>>sqrt (2).)
>
> I'm not buying this.
>
> Steve
That's because I mis-stated. log2(8/3) is remarkably close to sqrt(2).
See my correction to my original pointlessPost(TM).
--
Tim Wescott
Wescott Design Services
http://www.wescottdesign.com
Reply by Steve Pope●July 26, 2015
Tim Wescott <seemywebsite@myfooter.really> wrote:
>(I'm sure it's totally by chance, but ln(8/3) is remarkably close to sqrt
>(2).)
I'm not buying this.
Steve
Reply by Tim Wescott●July 26, 2015
On Sun, 26 Jul 2015 22:48:03 +0000, glen herrmannsfeldt wrote:
> Piergiorgio Sartor
> <piergiorgio.sartor.this.should.not.be.used@nexgo.removethis.de> wrote:
>> On 2015-07-26 09:12, joshipura@gmail.com wrote:
>
> (snip)
>>> I am coming across a situation in which the entropy of an
>>> (unequiprobable) event is coming out to be GREATER than had it been
>>> equiprobable. I am clearly wrong somewhere. Where?
>
> (snip)
>
>> Maybe I got the math completely wrong, but:
>
>> -(log2(6/16)*(6/16)+2*log2(4/16)*(4/16)+2*log2(1/16)*(1/16)) = 2.0306
>
>> So, it seems lower than -log2(1/5), while still greater than 2.
>
> The equiprobable one is 2.322, and 2.0306 is lower, as you said it
> should be.
>
> For those who have calculators without a log2 button:
>
> -(6*ln(6)+8*ln(4)-16*ln(16))/16/ln(2) = 2.0306, as above.
>
> It took me a few tries to get that, but it is just as it should be.
>
> -- glen
Aw, c'mon. You can calculate log2(1/4) and log2(1/16) in your head:
2 * 1/16 * log2(16) + 2 * 1/4 * log2(4) + 3/8 * log2(8/3) =
1/2 + 1 + 3/8 * ln(8/3)/ln(2) = the same number the rest of us are
getting.
(I'm sure it's totally by chance, but ln(8/3) is remarkably close to sqrt
(2).)
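Spelled out term by term, the same in-your-head arithmetic looks like this
in Python (a sketch, nothing more):

    from math import log2

    terms = [2 * (1/16) * log2(16),   # the two 1/16 outcomes: 0.5
             2 * (1/4) * log2(4),     # the two 1/4 outcomes:  1.0
             (3/8) * log2(8/3)]       # the one 3/8 outcome:   ~0.5306
    print(terms, sum(terms))          # total is ~2.0306 bits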
--
Tim Wescott
Wescott Design Services
http://www.wescottdesign.com
Reply by glen herrmannsfeldt●July 26, 2015
>> I am coming across a situation in which the entropy of an
>> (unequiprobable) event is coming out to be GREATER than had it
>> been equiprobable. I am clearly wrong somewhere. Where?
> So, it seems lower than -log2(1/5), while still greater than 2.
The equiprobable one is 2.322, and 2.0306 is lower, as you said
it should be.
For those who have calculators without a log2 button:
-(6*ln(6)+8*ln(4)-16*ln(16))/16/ln(2) = 2.0306, as above.
It took me a few tries to get that, but it is just as it should be.
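A quick way to confirm the rearrangement against the direct definition, as
a Python sketch:

    from math import log

    probs = [6/16, 4/16, 4/16, 1/16, 1/16]
    h_direct = -sum(p * log(p, 2) for p in probs)   # definition, in bits
    h_ln = -(6*log(6) + 8*log(4) - 16*log(16)) / 16 / log(2)
    print(h_direct, h_ln)                           # both ~2.0306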
-- glen
Reply by Tim Wescott●July 26, 2015
On Sun, 26 Jul 2015 09:34:39 +0200, Piergiorgio Sartor wrote:
> On 2015-07-26 09:12, joshipura@gmail.com wrote:
>> All,
>> I used to be a student of channel capacity years back.
>> Now I am trying to teach concepts of information, entropy and channel
>> capacity to my high-school kid for a school math project.
>>
>> I am coming across a situation in which the entropy of an
>> (unequiprobable) event is coming out to be GREATER than had it been
>> equiprobable. I am clearly wrong somewhere. Where?
>>
>> Background: (refer to https://en.wikipedia.org/wiki/Ashte_kashte for
>> more details)
>> Toss four coins together. The number of obverse (heads) sides is the
>> "score". That means five scores are possible: 1, 2, 3, 4, and 0. A 0 is
>> counted as "8" in the game.
>>
>> Problem statement: What is the entropy of this scheme?
>>
>> My assumption: In my understanding, the problem definition has 5 events
>> and not 16. So we are trying to calculate entropy for 5, not 16.
>>
>> My calculation: Events are not equiprobable.
>> Getting 2 is the most probable (6/16), 1 and 3 are next (4/16 each), and
>> the least probable is a 4 or a 0 (that is, 8) (1/16 each). Going by the
>> formula, the entropy (H1) comes out to be 2.78063906, which kind of
>> matches because we require <3 bits to communicate the outcome of a throw.
>
> Maybe I got the math completely wrong, but:
>
> -(log2(6/16)*(6/16)+2*log2(4/16)*(4/16)+2*log2(1/16)*(1/16)) = 2.0306
>
> So, it seems lower than -log2(1/5), while still greater than 2.
>
> Am I missing something?
> Maybe my computation is not correct?
I got the same results, without first looking at Piergiorgio's result.
--
Tim Wescott
Wescott Design Services
http://www.wescottdesign.com
Reply by Piergiorgio Sartor●July 26, 2015
On 2015-07-26 09:12, joshipura@gmail.com wrote:
> All,
> I used to be a student of channel capacity years back.
> Now I am trying to teach concepts of information, entropy and channel capacity to my high-school kid for a school math project.
>
> I am coming across a situation in which the entropy of an (unequiprobable) event is coming out to be GREATER than had it been equiprobable. I am clearly wrong somewhere. Where?
>
> Background: (refer to https://en.wikipedia.org/wiki/Ashte_kashte for more details)
> Toss four coins together. The number of obverse (heads) sides is the "score". That means five scores are possible: 1, 2, 3, 4, and 0. A 0 is counted as "8" in the game.
>
> Problem statement: What is the entropy of this scheme?
>
> My assumption: In my understanding, the problem definition has 5 events and not 16. So we are trying to calculate entropy for 5, not 16.
>
> My calculation: Events are not equiprobable.
> Getting 2 is the most probable (6/16), 1 and 3 are next (4/16 each), and the least probable is a 4 or a 0 (that is, 8) (1/16 each). Going by the formula, the entropy (H1) comes out to be 2.78063906, which kind of matches because we require <3 bits to communicate the outcome of a throw.
Maybe I got the math completely wrong, but:
-(log2(6/16)*(6/16)+2*log2(4/16)*(4/16)+2*log2(1/16)*(1/16)) = 2.0306
So, it seems lower than -log2(1/5), while still greater than 2.
Am I missing something?
Maybe my computation is not correct?
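For reference, the same number falls out if the five probabilities are
generated from the binomial weights C(4,k)/16 instead of typed in by
hand -- a short Python sketch (this framing is mine, not from the post):

    from math import comb, log2

    # P(k obverse sides among 4 fair coins) = C(4,k)/16
    probs = [comb(4, k) / 16 for k in range(5)]   # [1/16, 4/16, 6/16, 4/16, 1/16]
    print(-sum(p * log2(p) for p in probs))       # 2.0306...
    print(log2(5))                                # 2.3219..., the equiprobable case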
bye,
pg
> My confusion: So, if we consider all five to be equiprobable events, the entropy (H2) becomes log2(5) = 2.32...
>
> My question: Shouldn't entropy (H2) be maximum with equiprobable events? Then why is H1 turning out to be higher?
>
> Thanks in advance,
> -Bhushit
>
--
piergiorgio
Reply by joshipura@gmail.com●July 26, 2015
All,
I used to be a student of channel capacity years back.
Now I am trying to teach concepts of information, entropy and channel capacity to my high-school kid for a school math project.
I am coming across a situation in which the entropy of an (unequiprobable) event is coming out to be GREATER than had it been equiprobable. I am clearly wrong somewhere. Where?
Background: (refer to https://en.wikipedia.org/wiki/Ashte_kashte for more details)
Toss four coins together. The number of obverse (heads) sides is the "score". That means five scores are possible: 1, 2, 3, 4, and 0. A 0 is counted as "8" in the game.
Problem statement: What is the entropy of this scheme?
My assumption: In my understanding, the problem definition has 5 events and not 16. So we are trying to calculate entropy for 5, not 16.
My calculation: Events are not equiprobable.
Getting 2 is the most probable (6/16), 1 and 3 are next (4/16 each), and the least probable is a 4 or a 0 (that is, 8) (1/16 each). Going by the formula, the entropy (H1) comes out to be 2.78063906, which kind of matches because we require <3 bits to communicate the outcome of a throw.
My confusion: So, if we consider all five to be equiprobable events, the entropy (H2) becomes log2(5) = 2.32...
My question: Shouldn't entropy (H2) be maximum with equiprobable events? Then why is H1 turning out to be higher?
Thanks in advance,
-Bhushit
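For what it's worth, the question in the last paragraph can also be
spot-checked numerically: no distribution over five outcomes can exceed
log2(5), about 2.32 bits, so an H1 of 2.78 has to be an arithmetic slip. A
minimal Python sketch (the helper name "entropy" is mine):

    import random
    from math import log2

    def entropy(probs):
        # Shannon entropy in bits; zero-probability terms contribute nothing
        return -sum(p * log2(p) for p in probs if p > 0)

    print(entropy([6/16, 4/16, 4/16, 1/16, 1/16]))   # ~2.0306, the corrected H1
    print(log2(5))                                    # ~2.3219, the cap H2

    # Spot-check the bound: random 5-outcome distributions never beat log2(5).
    for _ in range(10000):
        w = [random.random() for _ in range(5)]
        p = [x / sum(w) for x in w]
        assert entropy(p) <= log2(5) + 1e-12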