On Jul 29, 10:27 am, Verictor <stehu...@gmail.com> wrote:
> On Jul 28, 9:37 am, "Sylvia" <sylvia.za...@gmail.com> wrote:
>
> > Hello,
>
> > I want to have the analytical continuous form of entropy (for an image X)
> > and its derivative (at each location in image X) that can be directly
> > implemented. Is there any good reference for this?
>
> > Thanks
>
> > Sylvia
>
> Some books/chapters on entropy:
>
> Thomas Cover: Elements of Information Theory
> Papoulis: Probability, Random Variables and Stochastic Processes
>
> But I don't recall either of them describing the derivative of
> entropy. What do you want to do here?
Also look at
Blahut: Principles and Practice of Information Theory
He discusses entropy and its derivative. You will need to link
entropy, entropy inequality, and vector entropy inequality.
Maurice Givens
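For the discrete case, the derivative comes out in closed form: with H(p) = -sum_i p_i ln p_i, the gradient is dH/dp_i = -(ln p_i + 1). A minimal sketch checking the analytic gradient against a finite difference (the distribution `p` here is an arbitrary example, not something from the books above):

```python
import numpy as np

def entropy(p):
    """H(p) = -sum(p_i ln p_i) for a strictly positive distribution."""
    return -np.sum(p * np.log(p))

def entropy_grad(p):
    """Analytic gradient: dH/dp_i = -(ln p_i + 1)."""
    return -(np.log(p) + 1.0)

p = np.array([0.1, 0.2, 0.3, 0.4])
analytic = entropy_grad(p)

# Central-difference check, treating H as an unconstrained function of
# the p_i (the sum-to-one constraint is ignored when perturbing).
eps = 1e-6
numeric = np.array([(entropy(p + eps * e) - entropy(p - eps * e)) / (2 * eps)
                    for e in np.eye(p.size)])
```

Note the unconstrained derivative above; if you optimize over probabilities, you still have to project onto the simplex (or add a Lagrange term for the sum-to-one constraint).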
Reply by Verictor ● July 29, 2009
On Jul 28, 9:37 am, "Sylvia" <sylvia.za...@gmail.com> wrote:
> Hello,
>
> I want to have the analytical continuous form of entropy (for an image X)
> and its derivative (at each location in image X) that can be directly
> implemented. Is there any good reference for this?
>
> Thanks
>
> Sylvia
Some books/chapters on entropy:
Thomas Cover: Elements of Information Theory
Papoulis: Probability, Random Variables and Stochastic Processes
But I don't recall either of them describing the derivative of
entropy. What do you want to do here?
Reply by Clay ● July 28, 2009
On Jul 28, 11:37 am, "Sylvia" <sylvia.za...@gmail.com> wrote:
> Hello,
>
> I want to have the analytical continuous form of entropy (for an image X)
> and its derivative (at each location in image X) that can be directly
> implemented. Is there any good reference for this?
>
> Thanks
>
> Sylvia
Hello Sylvia,
There are entire books written on this. I assume you want entropy in
the information-theory sense and not as in thermodynamics (yes, they
are related; look at the Gibbs formula). If information theory is what
you want, the seminal article is Claude Shannon's "A Mathematical
Theory of Communication." The Wikipedia article on information theory
should be able to get you going on this.
IHTH,
Clay
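Shannon's discrete entropy is easy to compute from an image's gray-level histogram. A minimal sketch (the random test image is just a placeholder; any 2-D uint8 array works):

```python
import numpy as np

def image_entropy(img):
    """Discrete Shannon entropy H = -sum(p * log2 p) over the gray-level
    histogram of an 8-bit image, dropping the 0 * log 0 terms."""
    hist = np.bincount(img.ravel(), minlength=256)
    p = hist / hist.sum()
    p = p[p > 0]                      # 0 * log 0 is taken as 0
    return -np.sum(p * np.log2(p))

# Usage: a uniform random 8-bit image should come out close to the
# maximum of 8 bits per pixel; a constant image has entropy 0.
rng = np.random.default_rng(0)
img = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)
H = image_entropy(img)
```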
Reply by Sylvia ● July 28, 2009
Hello,
I want to have the analytical continuous form of entropy (for an image X)
and its derivative (at each location in image X) that can be directly
implemented. Is there any good reference for this?
Thanks
Sylvia
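One common route to the continuous, differentiable form the question asks for (used, for example, in mutual-information image registration; this sketch is not from any reference cited in the thread) is a Gaussian Parzen-window density over pixel intensities. The entropy estimate H = -(1/N) sum_i ln p(x_i) is then an analytic function of every intensity x_k, so the derivative at each pixel is available in closed form:

```python
import numpy as np

def parzen_entropy_and_grad(x, sigma=0.05):
    """Continuous entropy estimate for intensities x (1-D array) with a
    Gaussian Parzen density p(x) = (1/(N sigma sqrt(2 pi))) *
    sum_j exp(-(x - x_j)^2 / (2 sigma^2)).
    Returns H = -(1/N) sum_i ln p(x_i) and dH/dx_k at every sample."""
    n = x.size
    d = x[:, None] - x[None, :]                   # d[i, j] = x_i - x_j
    kmat = np.exp(-d**2 / (2.0 * sigma**2))
    norm = 1.0 / (n * sigma * np.sqrt(2.0 * np.pi))
    p = norm * kmat.sum(axis=1)                   # density at each sample
    H = -np.mean(np.log(p))
    # A[i, j] = dp(x_i)/dx_j for i != j (x_j acting as a kernel centre);
    # dp(x_k)/dx_k is the negative row sum (x_k as evaluation point).
    A = (norm / sigma**2) * d * kmat
    grad = -(1.0 / n) * ((1.0 / p) @ A - A.sum(axis=1) / p)
    return H, grad

# Usage on a small image: normalize intensities to [0, 1] and flatten;
# each entry of `grad` is the entropy derivative at that pixel.
img = np.array([[0.1, 0.3], [0.35, 0.7]])
H, grad = parzen_entropy_and_grad(img.ravel(), sigma=0.1)
```

Note the pairwise kernel matrix makes this O(N^2) in the number of pixels, so for real images the intensities are usually subsampled or binned first; the kernel width `sigma` is a free smoothing parameter you have to choose.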