Hi,

After channel coding, why do we get coding gain?

To elaborate: after coding, the energy per bit is less, and hence the
probability of error for a (coded) bit is higher. Still, the P_e for
the message bit is lower. Alternatively, we can operate with C_g dB
less SNR and still have the same P_e as before coding. (C_g is the
coding gain.)

Where does the coding gain arise from? (Is it due to the increased
duration of the "message bit waveform"? I have put it in quotes since
there will not be a direct mapping between the message bit and its
waveform.)

(If yes, what happens when we operate with bandwidth expansion after
coding?)

I have also asked a few researchers why a repetition code does not give
coding gain. Still no answer.

ps - message words/bits, after coding, become code words/bits.

tia
shankar
channel coding gain
Started by ●April 14, 2004
Reply by ●April 14, 2004
kbc32@yahoo.com (kbc) writes:
> Hi
>
> After channel coding, why do we get coding gain?
>
> To elaborate - After coding, the energy per bit is less

Where do you get this from?
--
Randy Yates
Sony Ericsson Mobile Communications
Research Triangle Park, NC, USA
randy.yates@sonyericsson.com, 919-472-1124
Reply by ●April 14, 2004
kbc wrote:
> Hi
>
> After channel coding, why do we get coding gain?
>
> To elaborate - After coding, the energy per bit is less and hence

Oh? How so?

...

> I have also asked few researchers why repetition code does not give
> coding gain. Still no answer.

Doubling (for example) the number of bits doubles the number of bits
that can be corrupted. If you ship twice as much product using twice as
many trucks, would you call that shipping gain?

Jerry
--
Engineering is the art of making what you want from things you can get.
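[Editor's note: the "twice as many trucks" point can be checked numerically. This sketch, not from the original posts, assumes BPSK over AWGN with a rate-1/3 repetition code and hard-decision majority voting, and compares bit error rates at the same Eb/N0:]

```python
import math

def q(x):
    """Gaussian tail probability Q(x)."""
    return 0.5 * math.erfc(x / math.sqrt(2))

ebno_db = 6.0
ebno = 10 ** (ebno_db / 10)

# Uncoded BPSK: bit error probability Q(sqrt(2*Eb/N0)).
p_uncoded = q(math.sqrt(2 * ebno))

# Rate-1/3 repetition code at the same Eb/N0: each channel bit carries
# only Eb/3 of energy, so the raw channel error rate goes up.
p_chan = q(math.sqrt(2 * ebno / 3))

# Hard-decision majority vote over 3 copies fails when 2 or 3 of the
# channel bits are flipped.
p_rep = 3 * p_chan**2 * (1 - p_chan) + p_chan**3

print(f"uncoded BER    : {p_uncoded:.3e}")
print(f"repetition BER : {p_rep:.3e}")
```

At 6 dB the repetition-coded message BER comes out worse than the uncoded BER: tripling the bits triples the exposure to noise faster than majority voting can repair it, which is Jerry's trucks analogy in numbers.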
Reply by ●April 14, 2004
I feel it is convenient to consider this in terms of distance. Two
uncoded message words are too close to be easily mistaken for each
other. Increasing signal power increases distance, but increasing power
is an inefficient way to do this, because the distance achieved per watt
is too meagre compared to that achieved in an intelligent manner such as
channel coding or diversity. So you go for channel coding schemes that
introduce redundancy (more bits, and hence bandwidth expansion) to
improve distance. And again, you see that one scheme can achieve
distance more efficiently than another; for example, the repetition
coding scheme you mentioned is too inefficient. So it all depends on how
intelligent you are.

Also, you shouldn't forget that there are channel coding schemes like
trellis codes that do not need any bandwidth expansion but still give
good distances. For me, the gain offered by such codes is the only true
coding gain. The coding gain definition, apart from "same BER", should
also include "same bandwidth". Only then would we know how far we are
from Shannon's limit.

bala

Jerry Avins <jya@ieee.org> wrote in message
news:<407d5241$0$2776$61fed72c@news.rcn.com>...
> kbc wrote:
>> I have also asked few researchers why repetition code does not give
>> coding gain. Still no answer.
>
> Doubling (for example) the number of bits doubles the number of bits
> that can be corrupted. If you ship twice as much product using twice
> as many trucks, would you call that shipping gain?
>
> Jerry
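[Editor's note: the "distance per watt" intuition above has a standard quantitative form. As a sketch (not from the original posts), the asymptotic coding gain of a linear code with soft-decision decoding is 10*log10(R * d_min), where R is the code rate and d_min the minimum distance. That single number shows why repetition buys nothing while a (7,4) Hamming code does:]

```python
import math

def asymptotic_gain_db(rate, d_min):
    """Asymptotic soft-decision coding gain: 10*log10(R * d_min)."""
    return 10 * math.log10(rate * d_min)

# Rate-1/3 repetition code: d_min = 3, but R = 1/3, so R*d_min = 1.
g_rep = asymptotic_gain_db(1 / 3, 3)

# (7,4) Hamming code: same d_min = 3, but at rate 4/7.
g_ham = asymptotic_gain_db(4 / 7, 3)

print(f"repetition(3): {g_rep:+.2f} dB")
print(f"Hamming(7,4) : {g_ham:+.2f} dB")
```

The repetition code's distance grows exactly as fast as its rate shrinks, so its gain is 0 dB; the Hamming code achieves the same d_min while giving back less rate, which is bala's "distance achieved in an intelligent manner".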
Reply by ●April 14, 2004
amara vati wrote:
> ... for me, the gain offered by such codes is only true coding gain.
> the coding gain definition, apart from "same ber" should also include
> "same bandwidth". only then we would know how far are we from
> shannon's limit.
...

For a good measure, you need to normalize, and "bits made good" (used in
the same sense as a sailor's "ground made good"; actual information bit
rate) is a good basis for normalizing. A fantastic increase in BER that
costs only a little extra bandwidth still represents real coding gain.

Jerry
--
Engineering is the art of making what you want from things you can get.
Reply by ●April 14, 2004
Jerry Avins averred:
> A fantastic increase in BER that costs only a little
> extra bandwidth still represents real coding gain.

but I hope that he does not really believe this!
:-) :-) :-)

--Dilip Sarwate
Reply by ●April 15, 2004
Dilip Sarwate wrote:
> Jerry Avins averred:
>> A fantastic increase in BER that costs only a little
>> extra bandwidth still represents real coding gain.
>
> but I hope that he does not really believe this!
> :-) :-) :-)
>
> --Dilip Sarwate

I did until now. I'm too ignorant to see why not. Doesn't it represent a
real increase in channel capacity?

Jerry
--
Engineering is the art of making what you want from things you can get.
Reply by ●April 15, 2004
Dilip Sarwate wrote:
> Jerry Avins averred:
>> A fantastic increase in BER that costs only a little
>> extra bandwidth still represents real coding gain.
>
> but I hope that he does not really believe this!
> :-) :-) :-)
>
> --Dilip Sarwate

Oh no! Not increase; improvement. Ouch!

Jerry
--
Engineering is the art of making what you want from things you can get.
Reply by ●April 15, 2004
Hello,

Here is my take on it. The Shannon capacity of an AWGN channel is
defined by C = W*log2(1 + P/(N0*W)). One can see that we get a linear
increase in capacity with bandwidth, but only a logarithmic increase
with an increase in P.

So what a code does to reach close to capacity is simple. The addition
of an (n,k) code implies that the bandwidth has increased by a factor of
(n/k). However, the transmit power P is unchanged, since the energy per
bit has decreased by a factor of (n/k) while the transmit rate has
increased by a factor of (n/k). So the transmit power is unchanged. The
increase in W implies that the effective SNR has decreased, but the
linear increase in W outside the log overcomes this effect. The net
result is an increase in capacity with bandwidth expansion, which is
effectively the outcome of inserting a channel code.

>> the gain offered by such codes is only true coding gain. the coding
>> gain definition, apart from "same ber" should also include "same
>> bandwidth". only then we would know how far are we from shannon's
>> limit.

This statement is not correct. Coding => bandwidth expansion. One
achieves performance improvement (BER) using a higher bandwidth. The
improvement achieved using bandwidth expansion is measured by the coding
gain, which is the difference between the Eb/No values with and without
coding for a given probability of error. This implies that one can
potentially achieve a given BER at a lower Eb/No by employing higher
bandwidth transmission.

Thanks
Vikram

Jerry Avins <jya@ieee.org> wrote in message
news:<407d8679$0$2782$61fed72c@news.rcn.com>...
> amara vati wrote:
>> ... for me, the gain offered by such codes is only true coding gain.
>> the coding gain definition, apart from "same ber" should also include
>> "same bandwidth". only then we would know how far are we from
>> shannon's limit.
> ...
> For a good measure, you need to normalize, and "bits made good" (used
> in the same sense as a sailor's "ground made good"; actual information
> bit rate) is a good basis for normalizing. A fantastic increase in BER
> that costs only a little extra bandwidth still represents real coding
> gain.
>
> Jerry
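[Editor's note: Vikram's argument — linear W outside the log beats the falling per-hertz SNR inside it — is easy to tabulate. A minimal sketch, not from the original posts, with an assumed received-power-to-noise-density ratio P/N0 = 1000 Hz:]

```python
import math

P_over_N0 = 1000.0  # assumed P/N0 in Hz (illustrative value)

def capacity(w):
    """Shannon AWGN capacity C = W*log2(1 + P/(N0*W)) for bandwidth W in Hz."""
    return w * math.log2(1 + P_over_N0 / w)

# Expanding bandwidth by n/k lowers the per-hertz SNR (the term inside
# the log), yet capacity still rises because W multiplies from outside.
for w in (100.0, 300.0, 900.0):
    print(f"W = {w:6.0f} Hz  SNR = {P_over_N0 / w:7.2f}  C = {capacity(w):7.1f} bit/s")

# The wideband limit as W -> infinity is (P/N0)*log2(e).
print(f"wideband limit: {P_over_N0 * math.log2(math.e):.1f} bit/s")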
Reply by ●April 15, 2004
Vikram Chandrasekhar wrote:
> Hello
>
> Here is my take on it. The Shannon capacity for an AWGN channel is
> defined by C = W*log2(1 + P/(N0*W)). ...
> The net result is an increase in capacity with bandwidth expansion,
> which is effectively the outcome of inserting a channel code.
> ...
> This implies that one can potentially achieve a given BER at a lower
> Eb/No by employing higher bandwidth transmission.
>
> Thanks
> Vikram

... But after coding, one can return to the former bandwidth by reducing
the data rate. (This has the effect of keeping the energy per bit
constant if the transmitted power is unchanged.) With suitable coding,
there is an increase in "bits made good" despite the lower data-bit
rate. We see this in practice with modems on fixed-bandwidth land lines.
Especially when the lines are noisy, error correction reduces the
data-bit rate (because some data bits are replaced by ECC bits), but
throughput rises.

Jerry
--
Engineering is the art of making what you want from things you can get.