DSPRelated.com
Forums

Decoder for Tail-Biting Convolutional Code

Started by Unknown October 25, 2005
Could someone please tell me how to implement a decoder for a
tail-biting convolutional code?  I've been working on it all day and
can't figure it out.

My basic approach has been to take a Viterbi decoder and modify it such
that it does not assume the initial state is 0.  This works about 50%
of the time.  When it fails, it only gets 1 or 2 bits wrong and they're
toward the very beginning of the message.

One thought I had is to use the output of my dysfunctional decoder to
find out what the first few bits of the message are and initialize my
Viterbi decoder to the corresponding state.  Because the bits used to
initialize the encoder are actually at the end of the message, I have
high hopes this will work reliably.

But I believe there has to be a better way.  Can anyone point me in the
right direction?  I'm using a 1/2 rate, constraint length 7 encoder
initialized with the first 6 bits of the message.  The bits of the
message are then fed through the encoder in the following order: 7, 8,
9, ..., n - 1, n, 1, 2, 3, 4, 5, 6.
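For concreteness, here's a sketch of the encoder I'm describing.  I
haven't given my generator polynomials, so the common K = 7 pair
171/133 (octal) is assumed below, along with one plausible
shift-register bit ordering:

```python
# Sketch of the tail-biting encoder described above: rate 1/2, K = 7,
# shift register preloaded with the first K-1 = 6 message bits, then
# bits 7..n followed by bits 1..6 clocked through.
# ASSUMPTION: generator polynomials 0o171 / 0o133 (the common K = 7
# pair) -- the actual polynomials in use may differ.
K = 7
G = (0o171, 0o133)          # taps; MSB corresponds to the current input bit

def parity(x):
    return bin(x).count("1") & 1

def tb_encode(bits):
    """bits: list of 0/1 message bits, len(bits) >= 6."""
    state = 0
    for b in bits[:K - 1]:                      # preload with bits 1..6
        state = (state >> 1) | (b << (K - 2))
    out = []
    for b in bits[K - 1:] + bits[:K - 1]:       # feed 7..n, then 1..6
        reg = (b << (K - 1)) | state            # [u_t, u_{t-1}, ..., u_{t-6}]
        out += [parity(reg & g) for g in G]     # two code bits per input
        state = (state >> 1) | (b << (K - 2))
    return out
```

Because the last six bits clocked through are the same six bits used
to preload the register, the encoder is guaranteed to end in the state
it started from, which is the tail-biting property.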

Thanks in advance.

Bill Woessner

Check H. H. Ma and J. K. Wolf, "On tail biting convolutional codes," IEEE
Transactions on Communications.
