On Monday, June 10, 2019 at 4:11:53 PM UTC-7, Eric Jacobsen wrote:
> On Mon, 10 Jun 2019 12:51:17 -0700 (PDT), RichD
> <r_delaney2001@yahoo.com> wrote:
> >Recently I attended a coding seminar. The speaker
> >very briefly reviewed the relevant history;
> >first Hamming codes; ~1960: BCH codes; ~1970: convolutional;
> >1995: polar; 2002: LDPC
> >Clue me in - what was the advance in each case?
> >I worked on a BCH project once, that seemed fairly
> >efficient, how is it, or other algorithms, deficient?
(snip)
> You'll probably find different histories out there from different
> sources, as the lineage of some of these isn't all that clear. It's
> like asking when the FFT algorithm was invented...you may get vastly
> different answers.
It seems to me that part of it is the ability to process such signals
at an appropriate speed.
My favorite in the history of transforms and coding is ICT,
the Integer Cosine Transform. It is similar to DCT, but the
coefficients are optimized for speeding up the forward transform,
at the expense of slowing down the inverse transform:
https://ntrs.nasa.gov/search.jsp?R=19940025116
Specifically, the forward transform is done on the CDP1802,
a microprocessor from the 8080 and 6502 days that doesn't have
hardware multiply.
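As a rough sketch of the same trick (my example, using the familiar H.264-style 4-point integer core as a stand-in, not the coefficients from the NASA report): the forward transform below needs only integer adds, subtracts, and one shift, while the normalization that makes the inverse exact is pushed entirely onto the inverse side.

import numpy as np

# 4-point integer transform whose rows are orthogonal but whose entries are
# only +/-1 and +/-2, so the forward pass needs no hardware multiplier.
T = np.array([[1,  1,  1,  1],
              [2,  1, -1, -2],
              [1, -1, -1,  1],
              [1, -2,  2, -1]])

def forward(x):
    # Adds, subtracts, and a single left shift: cheap on a multiplier-less CPU.
    a, b, c, d = x
    s0, s1 = a + d, b + c
    d0, d1 = a - d, b - c
    return np.array([s0 + s1, (d0 << 1) + d1, s0 - s1, d0 - (d1 << 1)])

def inverse(y):
    # The row-norm scaling (the "expensive" part) lands on the inverse side.
    scale = np.diag(1.0 / (T @ T.T).diagonal())   # diag(1/4, 1/10, 1/4, 1/10)
    return T.T @ (scale @ y)

x = np.array([10, 20, 30, 25])
print(forward(x))            # integer-only result, e.g. [85 -40 -15 5]
print(inverse(forward(x)))   # recovers x up to floating-point rounding

The cheap, multiply-free side runs on the constrained processor; the scaling and floating-point work run wherever the inverse is computed.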
It seems to me that the math for the more complicated coding
systems might have been around for years, but no ability to
use them until computing hardware caught up.
Reply by Gene Filatov ● June 23, 2019
On 23.06.2019 4:30, Steve Pope wrote:
> RichD <r_delaney2001@yahoo.com> wrote:
>
>> Can you recommend a reference on turbo codes, for self-study?
>
> You asked Eric but I'll chime in:
>
> Heegard and Wicker, _Turbo Coding_. It is a no-nonsense reference;
> you can read it and in a couple of days' time be able to write a
> simulation of a working turbo decoder.
>
>
> Steve
>
It's an excellent reference!
I would also suggest reading:
Todd K. Moon, _Error Correction Coding: Mathematical Methods and Algorithms_
Rolf Johannesson, Kamil Sh. Zigangirov, _Fundamentals of Convolutional
Coding_
Gene
Reply by Steve Pope ● June 22, 2019
RichD <r_delaney2001@yahoo.com> wrote:
>Can you recommend a reference on turbo codes, for self-study?
You asked Eric but I'll chime in:
Heegard and Wicker, _Turbo Coding_. It is a no-nonsense reference;
you can read it and in a couple of days' time be able to write a
simulation of a working turbo decoder.
Steve
Reply by RichD ● June 22, 2019
On June 12, Eric Jacobsen wrote:
>> Yes, my mistake, Turbo codes were invented in 1995.
>> What's the key idea there? My understanding is that
>> capacity requires exponentially long blocks, with
>> commensurate decoding times.
>
> That was one idea, but getting practical codes that worked close to
> capacity was always a bit of a Holy Grail. Turbo Codes were the
> first to demonstrate that it was possible, and since then a number of
> codes have been developed (e.g., TPCs, LDPCs, Polar, etc., etc.).
>
>>> a number of good historical surveys published in the late 90s after
>>> Turbo Codes were getting accepted as finally having reached the
>>> Holy Grail of reaching Capacity, since that was recognized as a
>>> historically significant step.
Can you recommend a reference on turbo codes, for self-study?
--
Rich
Reply by Eric Jacobsen ● June 12, 2019
On Mon, 10 Jun 2019 21:19:16 -0700 (PDT), RichD
<r_delaney2001@yahoo.com> wrote:
>On June 10, Eric Jacobsen wrote:
>>>Recently I attended a coding seminar. The speaker
>>>very briefly reviewed the relevant history;
>>>first Hamming codes; ~1960: BCH codes; ~1970: convolutional;
>>>1995: polar; 2002: LDPC
>>> what was the advance in each case?
>>
>> one of the biggest breakthroughs, Turbo Codes, aren't even
>> mentioned, and they were the first case where somebody actually
>> approached theoretical capacity with a practical code.
>
>Yes, my mistake, Turbo codes were invented in 1995.
>What's the key idea there? My understanding is that
>capacity requires exponentially long blocks, with
>commensurate decoding times.
That was one idea, but getting practical codes that worked close to
capacity was always a bit of a Holy Grail. Turbo Codes were the
first to demonstrate that it was possible, and since then a number of
codes have been developed (e.g., TPCs, LDPCs, Polar, etc., etc.).
>> LDPCs, on the other hand, were published in 1963 by Gallager, but
>> weren't fully demonstrated to be capacity-approaching codes until
>> after Turbo Codes were published and there was renewed interest in
>> iterative decoding algorithms.
>
>Iterative decoding is also above my pay grade -
It's part of the secret sauce, along with soft-decision
message-passing, that makes capacity-approaching codes work.
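For the curious, here is a minimal sketch (mine, not from the thread) of soft-decision message passing, i.e. iterative sum-product decoding over channel LLRs, run on the tiny (7,4) Hamming parity-check matrix as a stand-in for a real LDPC matrix, which would be far larger and sparser.

import numpy as np

# Parity-check matrix of the (7,4) Hamming code, standing in for a large sparse LDPC H.
H = np.array([[1, 1, 0, 1, 1, 0, 0],
              [1, 0, 1, 1, 0, 1, 0],
              [0, 1, 1, 1, 0, 0, 1]])

def sum_product_decode(llr_ch, H, max_iters=20):
    """Iterative sum-product decoding; llr_ch are channel LLRs (positive => bit 0)."""
    m, n = H.shape
    V = H * llr_ch                       # variable-to-check messages, seeded with channel LLRs
    x_hat = (llr_ch < 0).astype(int)
    for _ in range(max_iters):
        C = np.zeros(H.shape)            # check-to-variable messages
        for c in range(m):
            vs = np.nonzero(H[c])[0]
            t = np.tanh(V[c, vs] / 2.0)
            for j, v in enumerate(vs):
                prod = np.prod(np.delete(t, j))
                C[c, v] = 2.0 * np.arctanh(np.clip(prod, -0.999999, 0.999999))
        post = llr_ch + C.sum(axis=0)    # posterior LLR per bit
        x_hat = (post < 0).astype(int)
        if not np.any((H @ x_hat) % 2):  # all parity checks satisfied: done
            return x_hat, True
        V = H * (post - C)               # extrinsic variable-to-check messages
    return x_hat, False

# Example: all-zeros codeword over BPSK/AWGN (+1 carries bit 0), channel LLR = 2*y/sigma^2
rng = np.random.default_rng(0)
sigma = 0.8
y = 1.0 + sigma * rng.standard_normal(7)
print(sum_product_decode(2.0 * y / sigma**2, H))

The same two alternating message updates are what a production LDPC decoder iterates, just over thousands of bits and with sparser checks.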
>> Complexity - If you can't build it or fit it in your system or the
>> circuit takes too much power to run, it isn't of much value. This
>> involves a judgement about various things, but it is an important
>> consideration in many cases.
>
>hmmmm... we're talking software, so if the digital
>hardware is given a priori, I don't see these considerations
>as substantial.
Implementation Complexity matters to software, too. If it can't be
computed within a week, it's not a great option, e.g. ;)
>> Also, different codes work for different types
>> of error distributions, e.g., block codes like BCH or Reed-Solomon
>> tend to be used when there are clumped errors, but convolutional codes
>> may be more desirable when the error rate is higher but the errors are
>> more randomly distributed.
>
>The convolutional codes are really challenging,
>with deep number theory roots, although the algorithms
>can be implemented in a recipe fashion... monkey see,
>monkey do...
>> a number of good historical surveys published in the late 90s after
>> Turbo Codes were getting accepted as finally having reached the
>> Holy Grail of reaching Capacity, since that was recognized as a
>> historically significant step.
>
>Something like a tutorial survey in an IEEE journal,
>"A history of coding theory"
>
>--
>Rich
Reply by Eric Jacobsen ● June 12, 2019
On Tue, 11 Jun 2019 23:11:05 +0000 (UTC), spope384@gmail.com (Steve
Pope) wrote:
>Eric Jacobsen <theman@ericjacobsen.org> wrote:
>
>>BTW, one of the biggest breakthroughs, Turbo Codes, aren't even
>>mentioned, and they were the first case where somebody actually
>>approached theoretical capacity with a practical code. Polar Codes
>>were kind of an offshoot of that (one of many) that came much later.
>>Turbo Codes were first published around 1994 and caused quite a flurry
>>of research as a result.
>>
>>LDPCs, on the other hand, were published in 1963 by Gallager, but
>>weren't fully demonstrated to be capacity-approaching codes until
>>after Turbo Codes were published and there was renewed interest in
>>iterative decoding algorithms.
>
>Hi Eric,
>
>I knew you'd add to this thread. :-)
Couldn't stay away! ;)
>Before turbo codes were first published, a bit earlier in the 90's
>people were getting more and more mileage from iterative decoding of
>either concatenated codes or product codes. (Concatenated codes in the
>case of the Deep Space Network, and product codes in optical storage).
>
>Chris Heegard has a good exposition of the Deep Space Network work
>at that time, which was necessitated by a high-gain antenna on
>one of the space probes failing to deploy.
Divsalar et al. had tons of stories from JPL on deep space coding,
including sending stuff out with encoders that they knew would be good
but that they didn't quite have a decoding algorithm for yet. They
figured they'd sort it out by the time the craft got far enough away
that they needed to use that code. That worked out. Pretty cool.
Reply by Steve Pope ● June 11, 2019
Eric Jacobsen <theman@ericjacobsen.org> wrote:
>BTW, one of the biggest breakthroughs, Turbo Codes, aren't even
>mentioned, and they were the first case where somebody actually
>approached theoretical capacity with a practical code. Polar Codes
>were kind of an offshoot of that (one of many) that came much later.
>Turbo Codes were first published around 1994 and caused quite a flurry
>of research as a result.
>
>LDPCs, on the other hand, were published in 1963 by Gallager, but
>weren't fully demonstrated to be capacity-approaching codes until
>after Turbo Codes were published and there was renewed interest in
>iterative decoding algorithms.
Hi Eric,
I knew you'd add to this thread. :-)
Before turbo codes were first published, a bit earlier in the 90's
people were getting more and more mileage from iterative decoding of
either concatenated codes or product codes. (Concatenated codes in the
case of the Deep Space Network, and product codes in optical storage).
Chris Heegard has a good exposition of the Deep Space Network work
at that time, which was necessitated by a high-gain antenna on
one of the space probes failing to deploy.
Steve
Reply by RichD ● June 11, 2019
On June 10, Eric Jacobsen wrote:
>>Recently I attended a coding seminar. The speaker
>>very briefly reviewed the relevant history;
>>first Hamming codes; ~1960: BCH codes; ~1970: convolutional;
>>1995: polar; 2002: LDPC
>> what was the advance in each case?
>
> one of the biggest breakthroughs, Turbo Codes, aren't even
> mentioned, and they were the first case where somebody actually
> approached theoretical capacity with a practical code.
Yes, my mistake, Turbo codes were invented in 1995.
What's the key idea there? My understanding is that
capacity requires exponentially long blocks, with
commensurate decoding times.
> LDPCs, on the other hand, were published in 1963 by Gallager, but
> weren't fully demonstrated to be capacity-approaching codes until
> after Turbo Codes were published and there was renewed interest in
> iterative decoding algorithms.
Iterative decoding is also above my pay grade -
> Complexity - If you can't build it or fit it in your system or the
> circuit takes too much power to run, it isn't of much value. This
> involves a judgement about various things, but it is an important
> consideration in many cases.
hmmmm... we're talking software, so if the digital
hardware is given a priori, I don't see these considerations
as substantial.
> Also, different codes work for different types
> of error distributions, e.g., block codes like BCH or Reed-Solomon
> tend to be used when there are clumped errors, but convolutional codes
> may be more desirable when the error rate is higher but the errors are
> more randomly distributed.
The convolutional codes are really challenging,
with deep number theory roots, although the algorithms
can be implemented in a recipe fashion... monkey see,
monkey do...
> a number of good historical surveys published in the late 90s after
> Turbo Codes were getting accepted as finally having reached the
> Holy Grail of reaching Capacity, since that was recognized as a
> historically significant step.
Something like a tutorial survey in an IEEE journal,
"A history of coding theory"
--
Rich
Reply by Eric Jacobsen ● June 10, 2019
On Mon, 10 Jun 2019 12:51:17 -0700 (PDT), RichD
<r_delaney2001@yahoo.com> wrote:
>Recently I attended a coding seminar. The speaker
>very briefly reviewed the relevant history;
>first Hamming codes; ~1960: BCH codes; ~1970: convolutional;
>1995: polar; 2002: LDPC
>
>Clue me in - what was the advance in each case?
>I worked on a BCH project once, that seemed fairly
>efficient, how is it, or other algorithms, deficient?
>
>
>--
>Rich
You'll probably find different histories out there from different
sources, as the lineage of some of these isn't all that clear. It's
like asking when the FFT algorithm was invented...you may get vastly
different answers.
BTW, one of the biggest breakthroughs, Turbo Codes, aren't even
mentioned, and they were the first case where somebody actually
approached theoretical capacity with a practical code. Polar Codes
were kind of an offshoot of that (one of many) that came much later.
Turbo Codes were first published around 1994 and caused quite a flurry
of research as a result.
LDPCs, on the other hand, were published in 1963 by Gallager, but
weren't fully demonstrated to be capacity-approaching codes until
after Turbo Codes were published and there was renewed interest in
iterative decoding algorithms.
Usually the motivations for different types of coding revolved around
optimizing one or more of:
Power efficiency - usually expressed as BER vs Eb/No - with Capacity
being the benchmark metric (a short sketch of that benchmark follows
after this list)
Bandwidth Efficiency - Decent performance without high code overhead.
Metrics here are a bit different as it involves a value judgement
between the error efficiency metric and the bandwidth efficiency
metric.
Packet Error Rate (PER) or Block Error Rate (not to be confused with
Bit Error Rate) - For many network or batch applications or
applications with an access protocol, transmission error metrics may
be more important than power efficiency metrics. Sometimes these
metrics are more important for block-oriented codes.
Complexity - If you can't build it or fit it in your system or the
circuit takes too much power to run, it isn't of much value. This
involves a judgement about various things, but it is an important
consideration in many cases. I'm including power consumption in this
tradeoff.
Latency - If it takes forever to decode, it may not be useful.
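As a concrete anchor for the capacity benchmark mentioned under power efficiency (a sketch of mine, using the unconstrained-input AWGN bound rather than any particular modulation): the minimum Eb/N0 at spectral efficiency eta information bits/s/Hz is (2^eta - 1)/eta, which falls toward ln 2, about -1.59 dB, as eta goes to zero.

import math

def ebno_limit_db(eta):
    # Shannon limit for the unconstrained-input AWGN channel at eta info bits/s/Hz.
    return 10.0 * math.log10((2.0**eta - 1.0) / eta)

for eta in (0.001, 0.5, 1.0, 2.0, 4.0):
    print(f"eta = {eta:5.3f} b/s/Hz  ->  Eb/N0 >= {ebno_limit_db(eta):6.2f} dB")
# eta -> 0 approaches 10*log10(ln 2), about -1.59 dB; eta = 1 gives 0 dB.

"Approaching capacity" for a given code rate means operating within a fraction of a dB of the corresponding bound.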
So the motivations that led to the "advances" in each may have been
subtle or big or obvious or not, and sometimes were unrelated between
different code types. Also, different codes work for different types
of error distributions, e.g., block codes like BCH or Reed-Solomon
tend to be used when there are clumped errors, but convolutional codes
may be more desirable when the error rate is higher but the errors are
more randomly distributed. Output error distributions matter, too,
so there are all kinds of different evaluation criteria that get
applied, and for any particular "innovation" it's important to
understand what motivated the advance in order to understand where it
may or may not be relevant.
Anyway, it's a complex topic with a lot of interesting twists and
perspectives and if you're really interested in it I'll suggest
researching different histories. There were a number of good
historical surveys published in the late 90s after Turbo Codes were
getting accepted as finally having reached the Holy Grail of reaching
Capacity, since that was recognized as a historically significant
step. Some of those surveys did a really good job of capturing how
the technology had advanced to that point...I wish I had a link or
reference for you, but I don't.
Reply by Steve Pope ● June 10, 2019
RichD <r_delaney2001@yahoo.com> wrote:
>Recently I attended a coding seminar. The speaker
>very briefly reviewed the relevant history;
>first Hamming codes; ~1960: BCH codes; ~1970: convolutional;
>1995: polar; 2002: LDPC
>Clue me in - what was the advance in each case?
>I worked on a BCH project once, that seemed fairly
>efficient, how is it, or other algorithms, deficient?
I think you need to check out Berlekamp, "Key Papers in the Development of
Coding Theory", IEEE Press. Especially the introduction. But,
it does not cover developments as recent at Polar and modern LDPC
codes.
BCH codes were significant in that one could, for the first time,
systematically construct native codes of a given redundancy and
distance that had an algebraic decoding method.
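A small sketch (mine, not Berlekamp's) of what that systematic construction looks like: for a binary BCH code of length 2^m - 1 designed to correct t errors, the generator polynomial is the LCM of the minimal polynomials of alpha, alpha^2, ..., alpha^(2t) over GF(2^m). With m = 4 and t = 2 this should reproduce the classic (15, 7) code with g(x) = x^8 + x^7 + x^6 + x^4 + 1.

def gf16_tables(prim=0b10011):          # GF(16) built from x^4 + x + 1
    exp, log = [0] * 30, [0] * 16
    x = 1
    for i in range(15):
        exp[i] = x
        log[x] = i
        x <<= 1
        if x & 0x10:
            x ^= prim
    for i in range(15, 30):
        exp[i] = exp[i - 15]
    return exp, log

EXP, LOG = gf16_tables()

def gf_mul(a, b):
    return 0 if a == 0 or b == 0 else EXP[LOG[a] + LOG[b]]

def poly_mul(p, q):                     # polynomials over GF(16), lowest degree first
    r = [0] * (len(p) + len(q) - 1)
    for i, a in enumerate(p):
        for j, b in enumerate(q):
            r[i + j] ^= gf_mul(a, b)
    return r

def minimal_poly(i):
    # Product of (x + alpha^e) over the conjugates e = i, 2i, 4i, ... (mod 15).
    conj, e = set(), i % 15
    while e not in conj:
        conj.add(e)
        e = (e * 2) % 15
    poly = [1]
    for e in conj:
        poly = poly_mul(poly, [EXP[e], 1])
    return poly                          # coefficients come out in {0, 1}

# g(x) = LCM of the minimal polynomials of alpha^1 .. alpha^(2t), here t = 2
seen, g = [], [1]
for i in range(1, 5):
    mp = minimal_poly(i)
    if mp not in seen:
        seen.append(mp)
        g = poly_mul(g, mp)

print(g)                                     # [1,0,0,0,1,0,1,1,1] = x^8+x^7+x^6+x^4+1
print("(n, k) =", (15, 15 - (len(g) - 1)))   # (15, 7), designed distance 5

The "designed distance" 2t + 1 is what makes the construction systematic: you pick the redundancy and error-correcting power up front, and the algebraic structure guarantees a decoder exists.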
Steve