
turbo, convolution codes, rate matching

Started by manishp June 2, 2013
Sirs,
I have a few questions on turbo codes ...

Turbo coding is followed by the rate matching stage, where some of the
encoded data are dropped.
Now, since the decoder operates like a machine, will dropping data not
break the decoder algorithm?

Another related question: what does convolution signify in the case of
convolutional coding?

Thanks, manish
On Sun, 02 Jun 2013 06:11:50 -0500, "manishp" <58525@dsprelated>
wrote:

>Sirs,
>I have a few questions on turbo codes ...
>
>Turbo coding is followed by the rate matching stage, where some of the
>encoded data are dropped.
>Now, since the decoder operates like a machine, will dropping data not
>break the decoder algorithm?
No, the "dropped", or more correctly, "punctured" bits are replaced with zeros in the decoder. This allows the degradation in performance due to the puncturing to be traded for the bandwidth taken up by those bits. Since the systematic bits are not punctured, only the parity bits are affected, which minimizes the performance degradation.
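To make the depuncturing step concrete, here is a minimal sketch in Python (my own illustration, not from the thread; the rate-1/3 mother code, the puncturing pattern, and the function names are all assumptions): the rate matcher drops bits according to a known pattern, and the receiver re-inserts neutral soft values (LLR = 0, meaning "no information") at those positions, so the decoder always works on a full-length block.

import numpy as np

def puncture(coded_bits, pattern):
    # Keep only the positions where the repeating pattern is 1.
    mask = np.resize(np.asarray(pattern), len(coded_bits)).astype(bool)
    return coded_bits[mask], mask

def depuncture(received_values, mask):
    # Re-insert neutral soft values (LLR = 0) at the punctured positions
    # so the decoder sees a full-length block again.
    full = np.zeros(len(mask))
    full[mask] = received_values
    return full

# Toy rate-1/3 mother code output ordered [systematic, parity1, parity2, ...].
# This assumed pattern keeps every systematic bit and drops half of each
# parity stream, raising the effective rate.
pattern = [1, 1, 0,
           1, 0, 1]
coded = np.random.randint(0, 2, 24)     # stand-in for encoder output bits
kept, mask = puncture(coded, pattern)   # what actually gets transmitted
soft = 1.0 - 2.0 * kept                 # crude BPSK-like soft values
for_decoder = depuncture(soft, mask)    # zeros mark the punctured bits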
>Another related question: what does convolution signify in the case of
>convolutional coding?
>
>Thanks, manish
The encoder performs a binary convolution.

Eric Jacobsen
Anchor Hill Communications
http://www.anchorhill.com
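As a sketch of what that convolution means (again my own illustration; the generator taps below are just the classic constraint-length-3, rate-1/2 textbook example, not anything specific from the thread): each encoder output stream is the input bit sequence convolved, modulo 2, with a short generator sequence, which is why these are called convolutional codes.

import numpy as np

def conv_encode(bits, generators=((1, 1, 1), (1, 0, 1))):
    bits = np.asarray(bits)
    streams = []
    for g in generators:
        # Ordinary convolution reduced modulo 2 is the binary convolution:
        # each output bit is the XOR of the current and previous input bits
        # selected by the generator taps.
        streams.append(np.convolve(bits, g) % 2)
    # Interleave the streams: out0[0], out1[0], out0[1], out1[1], ...
    return np.ravel(np.column_stack(streams))

print(conv_encode([1, 0, 1, 1]))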
On Jun 2, eric.jacob...@ieee.org (Eric Jacobsen) wrote:
> >Turbo coding is followed by the rate matching stage, where some of the
> >encoded data are dropped.
> >Now, since the decoder operates like a machine, will dropping data not
> >break the decoder algorithm?
>
> No, the "dropped", or more correctly, "punctured" bits are replaced
> with zeros in the decoder.  This allows the degradation in
> performance due to the puncturing to be traded for the bandwidth taken
> up by those bits.  Since the systematic bits are not punctured, only
> the parity bits are affected, which minimizes the performance
> degradation.
Can you recommend a tutorial paper on turbo coding?
Along with its history -

Thanks
--
Rich
On 6/4/2013 1:36 PM, RichD wrote:
> On Jun 2, eric.jacob...@ieee.org (Eric Jacobsen) wrote:
>>> Turbo coding is followed by the rate matching stage, where some of the
>>> encoded data are dropped.
>>> Now, since the decoder operates like a machine, will dropping data not
>>> break the decoder algorithm?
>>
>> No, the "dropped", or more correctly, "punctured" bits are replaced
>> with zeros in the decoder.  This allows the degradation in
>> performance due to the puncturing to be traded for the bandwidth taken
>> up by those bits.  Since the systematic bits are not punctured, only
>> the parity bits are affected, which minimizes the performance
>> degradation.
>
> Can you recommend a tutorial paper on turbo coding?
> Along with its history -
Christian Schlegel, "Trellis and Turbo Coding". Good introductory book.

//---
Q: Why is it impossible to have sex in Red Square in Moscow?
A: Because every bystander idiot would be trying to give his invaluable advice.

Vladimir Vassilevsky
DSP and Mixed Signal Designs
www.abvolt.com
On 6/2/2013 11:39 AM, Eric Jacobsen wrote:

> No, the "dropped", or more correctly, "punctured" bits are replaced
> with zeros in the decoder.  This allows the degradation in
> performance due to the puncturing to be traded for the bandwidth taken
> up by those bits.  Since the systematic bits are not punctured, only
> the parity bits are affected, which minimizes the performance
> degradation.
Each bit is no more and no less than any other bit.
Not puncturing the systematic part is merely a convenience.

VLV
On Tue, 04 Jun 2013 14:11:28 -0500, Vladimir Vassilevsky
<nospam@nowhere.com> wrote:

>On 6/2/2013 11:39 AM, Eric Jacobsen wrote:
>
>> No, the "dropped", or more correctly, "punctured" bits are replaced
>> with zeros in the decoder.  This allows the degradation in
>> performance due to the puncturing to be traded for the bandwidth taken
>> up by those bits.  Since the systematic bits are not punctured, only
>> the parity bits are affected, which minimizes the performance
>> degradation.
>
>Each bit is no more and no less than any other bit.
>Not puncturing the systematic part is merely a convenience.
I think you don't understand Turbo Codes.

Eric Jacobsen
Anchor Hill Communications
http://www.anchorhill.com
On 6/4/2013 3:31 PM, Eric Jacobsen wrote:
> On Tue, 04 Jun 2013 14:11:28 -0500, Vladimir Vassilevsky
> <nospam@nowhere.com> wrote:
>
>> On 6/2/2013 11:39 AM, Eric Jacobsen wrote:
>>
>>> No, the "dropped", or more correctly, "punctured" bits are replaced
>>> with zeros in the decoder.  This allows the degradation in
>>> performance due to the puncturing to be traded for the bandwidth taken
>>> up by those bits.  Since the systematic bits are not punctured, only
>>> the parity bits are affected, which minimizes the performance
>>> degradation.
>>
>> Each bit is no more and no less than any other bit.
>> Not puncturing the systematic part is merely a convenience.
>
> I think you don't understand Turbo Codes.
Haha, 2 x 2 = 4. Jacobsen = pumped up windbag.
On Tue, 4 Jun 2013 11:36:08 -0700 (PDT), RichD
<r_delaney2001@yahoo.com> wrote:

>On Jun 2, eric.jacob...@ieee.org (Eric Jacobsen) wrote:
>> >Turbo coding is followed by the rate matching stage, where some of the
>> >encoded data are dropped.
>> >Now, since the decoder operates like a machine, will dropping data not
>> >break the decoder algorithm?
>>
>> No, the "dropped", or more correctly, "punctured" bits are replaced
>> with zeros in the decoder.  This allows the degradation in
>> performance due to the puncturing to be traded for the bandwidth taken
>> up by those bits.  Since the systematic bits are not punctured, only
>> the parity bits are affected, which minimizes the performance
>> degradation.
>
>Can you recommend a tutorial paper on turbo coding?
A web search on Turbo Code Tutorial provides a lot of stuff to look
through, and Dr. Ryan's early effort is still pretty good:

http://shannon.ece.ufl.edu/eel6550/lit/turbo2c.pdf

There are probably better (i.e., more readable) copies out there
somewhere. This one is also pretty good:

http://www.coe.montana.edu/ee/rwolff/ee548/papers/turbocodes/turbocodestutoriali-codestructures.pdf

Both are fairly old treatments and deal with the basic structures of
convolutional Turbo Codes. Those basic structures haven't changed much,
but a few new twists have been added over the years.
>Along with its history -
That's a harder one. The original paper from Berrou:

C. Berrou, A. Glavieux, P. Thitimajshima, "Near Shannon limit
error-correcting coding and decoding: Turbo Codes", ICC '93, vol. 2,
pp. 1064-1070, May 1993.

It's not easy reading; it's included just as a historical note, as the
paper that started everything for Turbo Codes. There was a flurry of
publications in the late 90s and a lot of standardization in the 2000s.
Berrou wrote this later:

C. Berrou, "The ten-year-old turbo codes are entering into service",
IEEE Communications Magazine, vol. 41, pp. 110-116, August 2003.

I seem to recall Viterbi or Gallager or somebody writing a FEC history
article in a Comm or IT publication a while back, but I can't recall
the details and can't find it. :(
>
>Thanks
>
>--
>Rich
Eric Jacobsen
Anchor Hill Communications
http://www.anchorhill.com
On Tue, 04 Jun 2013 15:44:09 -0500, Vladimir Vassilevsky
<nospam@nowhere.com> wrote:

>On 6/4/2013 3:31 PM, Eric Jacobsen wrote:
>> On Tue, 04 Jun 2013 14:11:28 -0500, Vladimir Vassilevsky
>> <nospam@nowhere.com> wrote:
>>
>>> On 6/2/2013 11:39 AM, Eric Jacobsen wrote:
>>>
>>>> No, the "dropped", or more correctly, "punctured" bits are replaced
>>>> with zeros in the decoder.  This allows the degradation in
>>>> performance due to the puncturing to be traded for the bandwidth taken
>>>> up by those bits.  Since the systematic bits are not punctured, only
>>>> the parity bits are affected, which minimizes the performance
>>>> degradation.
>>>
>>> Each bit is no more and no less than any other bit.
>>> Not puncturing the systematic part is merely a convenience.
>>
>> I think you don't understand Turbo Codes.
>
>Haha, 2 x 2 = 4. Jacobsen = pumped up windbag.
Ah Vladimir, my friend. For the last year I have been wondering,
...why do you continue to treat Eric in such a way? It makes no sense
to me.

[-Rick-]
On Jun 5, 3:51 pm, Rick Lyons <R.Lyons@_BOGUS_ieee.org> wrote:
> On Tue, 04 Jun 2013 15:44:09 -0500, Vladimir Vassilevsky
> <nos...@nowhere.com> wrote:
>
> >On 6/4/2013 3:31 PM, Eric Jacobsen wrote:
> >> On Tue, 04 Jun 2013 14:11:28 -0500, Vladimir Vassilevsky
> >> <nos...@nowhere.com> wrote:
> >
> >>> On 6/2/2013 11:39 AM, Eric Jacobsen wrote:
> >
> >>>> No, the "dropped", or more correctly, "punctured" bits are replaced
> >>>> with zeros in the decoder.  This allows the degradation in
> >>>> performance due to the puncturing to be traded for the bandwidth taken
> >>>> up by those bits.  Since the systematic bits are not punctured, only
> >>>> the parity bits are affected, which minimizes the performance
> >>>> degradation.
> >
> >>> Each bit is no more and no less than any other bit.
> >>> Not puncturing the systematic part is merely a convenience.
> >
> >> I think you don't understand Turbo Codes.
> >
> >Haha, 2 x 2 = 4. Jacobsen = pumped up windbag.
>
> Ah Vladimir, my friend.  For the last year I have
> been wondering, ...why do you continue to treat
> Eric in such a way?  It makes no sense to me.
>
> [-Rick-]
He is jealous of Eric's "Minister of Algorithms" title. You can't
imagine the number of times I have wanted to ream Eric out over this,
but then I restrain myself because I think, "What if he is that good?"