Hi, could anybody provide me with a reference to a correct proof of Shenon's theorem? Thank you.
Shenon's theorem proving
Started by ●April 4, 2010
Reply by ●April 4, 2010
On Apr 4, 4:04 am, "st256" <st256@n_o_s_p_a_m.mail.ru> wrote:
> Hi,
>
> could anybody provide me with a reference to a correct proof of
> Shenon's theorem?

"Shenon"? might you mean "Shannon"? if so, which one? the Sampling theorem (often associated with other names like Nyquist or Whittaker or Kotelnikov)? the information content of a message? the channel information capacity theorem (Shannon-Hartley)?

dunno about the rooskie version of Wikipedia, but the english version has proofs, i think.

r b-j
Reply by ●April 5, 2010
On Apr 4, 11:44 am, robert bristow-johnson <r...@audioimagination.com> wrote:
> On Apr 4, 4:04 am, "st256" <st256@n_o_s_p_a_m.mail.ru> wrote:
> > Hi,
> >
> > could anybody provide me with a reference to a correct proof of
> > Shenon's theorem?
>
> "Shenon"?
>
> might you mean "Shannon"? if so, which one? the Sampling theorem
> (often associated with other names like Nyquist or Whittaker or
> Kotelnikov)? the information content of a message? the channel
> information capacity theorem (Shannon-Hartley)?
>
> dunno about the rooskie version of Wikipedia, but the english version
> has proofs, i think.
>
> r b-j

Shannon's paper "A Mathematical Theory of Communication" used to be on the web for free.

Clay
Reply by ●April 5, 2010
On Apr 5, 10:09 am, Clay <c...@claysturner.com> wrote:
>
> Shannon's paper "A Mathematical Theory of Communication" used to be on
> the web for free.

But the OP wanted a *correct* proof of Shannon's noisy channel coding theorem (or so I think), and Shannon's proof does not dot all the i's and cross all the t's to a mathematician's satisfaction.
Reply by ●April 5, 2010
On Apr 5, 4:05 pm, dvsarwate <dvsarw...@gmail.com> wrote:
> On Apr 5, 10:09 am, Clay <c...@claysturner.com> wrote:
> >
> > Shannon's paper "A Mathematical Theory of Communication" used to be on
> > the web for free.
>
> But the OP wanted a *correct* proof of Shannon's
> noisy channel coding theorem (or so I think), and
> Shannon's proof does not dot all the i's and cross
> all the t's to a mathematician's satisfaction.

A rigorous proof of the noisy coding theorem may be found starting on page 107 of Coding and Information Theory by Steven Roman (1992), Springer-Verlag. The book also includes details on the noiseless coding theorem.

Clay
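For reference, the noisy channel coding theorem being discussed is usually stated as follows (a standard paraphrase, not the exact wording of Shannon's paper or Roman's book):

```latex
% Channel capacity of a discrete memoryless channel:
\[
  C \;=\; \max_{p(x)} I(X;Y)
\]
% Achievability: for every rate $R < C$ and every $\varepsilon > 0$ there
% exist block codes of rate $R$ whose block error probability is below
% $\varepsilon$, for sufficiently large block length $n$.
%
% Converse: any sequence of codes whose error probability tends to zero
% must have rate $R \le C$.
```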
Reply by ●April 5, 2010
On 4/5/2010 10:01 PM, Clay wrote:
>
> A rigorous proof of the noisy coding theorem
>
> Clay

Hmm. If it is provable, it is no longer a theorem. It becomes true. HTH., Syms.
Reply by ●April 5, 2010
Symon <symon_brewer@hotmail.com> wrote:
> On 4/5/2010 10:01 PM, Clay wrote:
> > A rigorous proof of the noisy coding theorem
>
> Hmm. If it is provable, it is no longer a theorem. It becomes true.

You may be confusing theorem and theory, but I can't prove it.

Steve
Reply by ●April 6, 2010
> "Shenon"?
>
> might you mean "Shannon"? if so, which one? the Sampling theorem
> (often associated with other names like Nyquist or Whittaker or
> Kotelnikov)? the information content of a message? the channel
> information capacity theorem (Shannon-Hartley)?
>
> dunno about the rooskie version of Wikipedia, but the english version
> has proofs, i think.
>
> r b-j

Thank you very much!

Of course, I meant "Shannon", sorry! Shannon's original proof is indeed in Wikipedia. I am looking for a correct proof of the sampling theorem because the original proofs are sometimes not rigorous enough.
Reply by ●April 6, 2010
st256 wrote:
> > "Shenon"?
> >
> > might you mean "Shannon"? if so, which one? the Sampling theorem
> > (often associated with other names like Nyquist or Whittaker or
> > Kotelnikov)? the information content of a message? the channel
> > information capacity theorem (Shannon-Hartley)?
> >
> > dunno about the rooskie version of Wikipedia, but the english version
> > has proofs, i think.
> >
> > r b-j
>
> Thank you very much!
>
> Of course, I meant "Shannon", sorry!
> Shannon's original proof is indeed in Wikipedia. I am looking for a
> correct proof of the sampling theorem because the original proofs are
> sometimes not rigorous enough.

The Wikipedia one is fine.

--
Les Cargill
Reply by ●April 7, 2010
On Apr 6, 8:03 pm, Les Cargill <lcargil...@comcast.net> wrote:
> st256 wrote:
> >> "Shenon"?
> >>
> >> might you mean "Shannon"? if so, which one? the Sampling theorem
> >> (often associated with other names like Nyquist or Whittaker or
> >> Kotelnikov)? the information content of a message? the channel
> >> information capacity theorem (Shannon-Hartley)?
> >>
> >> dunno about the rooskie version of Wikipedia, but the english version
> >> has proofs, i think.
> >
> > Shannon's original proof is indeed in Wikipedia. I am looking for a
> > correct proof of the sampling theorem because the original proofs are
> > sometimes not rigorous enough.
>
> The Wikipedia one is fine.

well, it all depends on how anal one wants to get regarding the dirac impulse and dirac comb. if you want to make your math prof happy, perhaps http://en.wikipedia.org/wiki/Poisson_summation_formula will be more rigorous. but i am comfortable with the "engineering" definition and usage of the dirac delta function, so the proof in Nyquist-Shannon sampling theorem is good enough for me.

r b-j
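The sampling theorem being argued over is easy to check numerically, whatever one's taste in rigor: reconstruct a bandlimited signal from its samples via the Whittaker-Shannon interpolation formula and compare against the true value between sample points. This is a sketch with illustrative numbers, not a proof; all names here are made up for the demo.

```python
# Numerical illustration of Whittaker-Shannon interpolation:
# a sinusoid sampled above its Nyquist rate is recovered at an
# off-grid instant by sinc interpolation (exact up to truncation
# of the infinite sum to a finite window of samples).
import numpy as np

fs = 8.0                    # sampling rate (Hz); f0 < fs/2, so no aliasing
f0 = 1.0                    # frequency of the test sinusoid (Hz)
T = 1.0 / fs                # sampling period

n = np.arange(-256, 257)    # finite window of sample indices
x_n = np.sin(2 * np.pi * f0 * n * T)   # the samples x[n] = x(nT)

def reconstruct(t):
    """x(t) ~ sum_n x[n] * sinc((t - nT)/T); np.sinc(u) = sin(pi*u)/(pi*u)."""
    return float(np.sum(x_n * np.sinc((t - n * T) / T)))

t = 0.123                   # an arbitrary instant between sample points
exact = np.sin(2 * np.pi * f0 * t)
approx = reconstruct(t)
print(f"exact={exact:.6f}  approx={approx:.6f}  err={abs(exact - approx):.2e}")
```

Widening the window of sample indices shrinks the truncation error, since the sinc tails decay like 1/n; with an infinite sum the reconstruction would be exact for any bandlimited input.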