Forums

Viterbi decoding

Started by Elnaz April 10, 2012
I am using "vitdec" in Matlab with the 'trunc' and 'unquant' properties
for decoding real-valued data roughly between -2.5 and 2.5 (i.e.,
binary values with interference and noise). The interesting problem is
that my decoded stream has intervals of 100% correct detections and
intervals of 100% wrong decisions (bits are flipped). It's as if the
detector changes sign and switches between the two states for some
reason. If I could prevent this switching, I would get 100% accurate
detection. I don't know how to make sense of this or why it is
happening. Any ideas?
On Apr 10, 7:24 pm, Elnaz <ebsadegh...@gmail.com> wrote:
> [original question quoted above; snipped]
Use differential encoding and decoding so that the information is carried by the transitions, or lack thereof, rather than by the raw values. At the encoder, transmit c_n = c_{n-1} XOR d_n, where d_n is the data bit. At the decoder, recover d_n = r_n XOR r_{n-1}, where the r_n are the outputs of the Viterbi decoder. Note that d_n is recovered correctly whenever r_n and r_{n-1} are both correct or both flipped, so a wholesale sign inversion of a run of decoded bits does not corrupt the recovered data.
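To make the flip-immunity concrete, here is a minimal sketch of the scheme above in Python (names like diff_encode/diff_decode are my own, and I'm assuming an initial reference bit of 0 on both ends):

```python
def diff_encode(bits):
    """Differential encoder: c_n = c_{n-1} XOR d_n, assuming c_{-1} = 0."""
    out, prev = [], 0
    for d in bits:
        prev ^= d          # current coded bit depends on the previous one
        out.append(prev)
    return out

def diff_decode(bits):
    """Differential decoder: d_n = r_n XOR r_{n-1}, assuming r_{-1} = 0."""
    out, prev = [], 0
    for r in bits:
        out.append(r ^ prev)  # only the transition matters, not the value
        prev = r
    return out

data = [1, 0, 1, 1, 0, 0, 1]
coded = diff_encode(data)
assert diff_decode(coded) == data

# Flip every received bit (the "detector changed sign" scenario):
flipped = [1 - b for b in coded]
decoded = diff_decode(flipped)

# All bits except the very first (which XORs against the assumed
# initial state) are still recovered correctly.
assert decoded[1:] == data[1:]
```

In the actual setup this would mean differentially encoding the data bits before the convolutional encoder and XOR-differencing the vitdec output; only the bit at each flip boundary is affected, instead of the whole inverted interval.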