DSPRelated.com
Forums

Shenon's theorem proving

Started by st256 April 4, 2010
Hi,

could anybody provide me with a reference to a correct proof of Shenon's
theorem?

Thank you.
On Apr 4, 4:04 am, "st256" <st256@n_o_s_p_a_m.mail.ru> wrote:
> Hi,
>
> could anybody provide me with a reference to a correct proof of Shenon's
> theorem?
"Shenon"? might you mean "Shannon"? if so, which one? the Sampling theorem (often associated with other names like Nyquist or Whittaker or Kotelnikov)? the information content of a message? the channel information capacity theorem (Shannon-Hartley)? dunno about the rooskie version of Wikipedia, but the english version has proofs, i think. r b-j
On Apr 4, 11:44 am, robert bristow-johnson <r...@audioimagination.com>
wrote:
> On Apr 4, 4:04 am, "st256" <st256@n_o_s_p_a_m.mail.ru> wrote:
> > could anybody provide me with a reference to a correct proof of Shenon's
> > theorem?
>
> "Shenon"?
>
> might you mean "Shannon"? if so, which one? the Sampling theorem
> (often associated with other names like Nyquist or Whittaker or
> Kotelnikov)? the information content of a message? the channel
> information capacity theorem (Shannon-Hartley)?
>
> dunno about the rooskie version of Wikipedia, but the english version
> has proofs, i think.
>
> r b-j
Shannon's paper "A Mathematical Theory of Communication" used to be on the web for free.

Clay
On Apr 5, 10:09 am, Clay <c...@claysturner.com> wrote:
> Shannon's paper "A Mathematical Theory of Communication" used to be on
> the web for free.
But the OP wanted a *correct* proof of Shannon's noisy channel coding theorem (or so I think), and Shannon's proof does not dot all the i's and cross all the t's to a mathematician's satisfaction.
On Apr 5, 4:05 pm, dvsarwate <dvsarw...@gmail.com> wrote:
> On Apr 5, 10:09 am, Clay <c...@claysturner.com> wrote:
> > Shannon's paper "A Mathematical Theory of Communication" used to be on
> > the web for free.
>
> But the OP wanted a *correct* proof of Shannon's
> noisy channel coding theorem (or so I think), and
> Shannon's proof does not dot all the i's and cross
> all the t's to a mathematician's satisfaction.
A rigorous proof of the noisy coding theorem may be found starting on page 107 of Coding and Information Theory by Steven Roman (1992), Springer-Verlag. The aforementioned book includes details on the noiseless coding theorem as well.

Clay
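For a feel of the quantity the noisy coding theorem bounds, here is a minimal sketch (my own illustration, not taken from Roman's book) computing the capacity of a binary symmetric channel, C = 1 - H(p):

```python
import math

def binary_entropy(p):
    """H(p) = -p*log2(p) - (1-p)*log2(1-p), in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p):
    """Capacity (bits per channel use) of a binary symmetric
    channel with crossover probability p: C = 1 - H(p)."""
    return 1.0 - binary_entropy(p)

print(bsc_capacity(0.0))   # noiseless channel: 1.0 bit per use
print(bsc_capacity(0.11))  # ~0.5 bits per use
print(bsc_capacity(0.5))   # pure coin-flipping: capacity 0
```

The theorem says rates below C are achievable with arbitrarily small error probability; what Shannon's original argument glosses over is the fine print of the random-coding and typicality steps.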
On 4/5/2010 10:01 PM, Clay wrote:
> A rigorous proof of the noisy coding theorem
>
> Clay
Hmm. If it is provable, it is no longer a theorem. It becomes true.
HTH., Syms.
Symon  <symon_brewer@hotmail.com> wrote:

>On 4/5/2010 10:01 PM, Clay wrote:
>> A rigourous proof of the noisy coding theorem
>Hmm. If it is provable, it is no longer a theorem. It becomes true.
You may be confusing theorem and theory, but I can't prove it. Steve
> >"Shenon"? > >might you mean "Shannon"? if so, which one? the Sampling theorem >(often associated with other names like Nyquist or Whittaker or >Kotelnikov)? the information content of a message? the channel >information capacity theorem (Shannon-Hartley)? > >dunno about the rooskie version of Wikipedia, but the english version >has proofs, i think. > >r b-j >
Thank you very much!

Of course, I meant "Shannon", sorry!
Shannon's original proof is indeed in Wikipedia. I am looking for a
correct proof of the sampling theorem because the original proofs are
sometimes not rigorous enough.
st256 wrote:
>> >> "Shenon"? >> >> might you mean "Shannon"? if so, which one? the Sampling theorem >> (often associated with other names like Nyquist or Whittaker or >> Kotelnikov)? the information content of a message? the channel >> information capacity theorem (Shannon-Hartley)? >> >> dunno about the rooskie version of Wikipedia, but the english version >> has proofs, i think. >> >> r b-j >> > > Thank you very much! > > Of course, I meaned "Shannon", sorry! > There is the authentic Shannon's proof in Wikipedia indeed. I look for a > correct proof of sampling theorem because authentic proofs sometimes are > not correct one enough.
The Wikipedia one is fine.

--
Les Cargill
On Apr 6, 8:03 pm, Les Cargill <lcargil...@comcast.net> wrote:
> st256 wrote:
> > Of course, I meant "Shannon", sorry!
> > Shannon's original proof is indeed in Wikipedia. I am looking for a
> > correct proof of the sampling theorem because the original proofs are
> > sometimes not rigorous enough.
>
> The Wikipedia one is fine.
well, it all depends on how anal one wants to get regarding the dirac
impulse and dirac comb. if you want to make your math prof happy,
perhaps http://en.wikipedia.org/wiki/Poisson_summation_formula will be
more rigorous.

but i am comfortable with the "engineering" definition and usage of the
dirac delta function, so the proof in Nyquist-Shannon sampling theorem
is good enough for me.

r b-j
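Whichever proof one prefers, the "engineering" statement of the theorem is easy to check numerically. A minimal sketch (the 5 Hz sinusoid and 100 Hz sample rate are arbitrary example choices) reconstructing an off-grid value from samples via Whittaker-Shannon sinc interpolation:

```python
import math

def reconstruct(samples, T, t):
    """Whittaker-Shannon interpolation over a finite record:
    x(t) ~= sum_n x[n] * sinc((t - n*T)/T), sinc(u) = sin(pi*u)/(pi*u)."""
    total = 0.0
    for n, xn in enumerate(samples):
        u = (t - n * T) / T
        total += xn * (1.0 if u == 0 else math.sin(math.pi * u) / (math.pi * u))
    return total

# a 5 Hz sinusoid sampled at 100 Hz (well above the 10 Hz Nyquist rate)
f, fs = 5.0, 100.0
T = 1.0 / fs
samples = [math.sin(2 * math.pi * f * n * T) for n in range(200)]  # 2 s record

# evaluate between sample instants, near the middle of the record
t = 1.0033
err = abs(reconstruct(samples, T, t) - math.sin(2 * math.pi * f * t))
print(err)  # small; only the truncated sinc tails contribute
```

With an infinite record the error would be exactly zero for a bandlimited input; the residual here is the price of summing only 200 terms of the slowly decaying sinc tails.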