
Strange performance of convolutional decoder for burst errors

Started by Viraj May 17, 2006
Hi,
I am developing an FEC system using convolutional codes (K=7, code
rate=1/2). I ran some simulations in MATLAB to test the convolutional
decoder's performance against burst errors. I took a 128-bit frame of
random data and encoded it with the convenc function. I then introduced
a burst error by picking a starting bit at random in the coded frame
and forcing the following bits to zero (the number of consecutive bits
forced to zero equals the chosen burst length). Finally I decoded the
frame with the vitdec function. I did this for 10000 frames at each
burst length, keeping the burst length constant over each run of 10000
frames, and varied the burst length from 4 to 24 in steps of 2. I
expected the BER to rise monotonically as the burst length increases,
but the result was different: the BER increased as the burst length
went from 4 to 10, then decreased as the burst length went from 10 to
16, and increased again after that. I am confused by this behaviour.
Can anybody explain it?
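
For reference, here is a minimal sketch of the setup (it assumes the
Communications Toolbox; the generator polynomials [171 133] and the
traceback depth of 32 are illustrative choices, not necessarily the
ones in my actual script):

trellis   = poly2trellis(7, [171 133]);  % K = 7, rate 1/2
frameLen  = 128;                         % information bits per frame
numFrames = 10000;
tblen     = 32;                          % traceback depth, ~5*(K-1)

for burstLen = 4:2:24
    totalErrs = 0;
    for n = 1:numFrames
        msg   = randi([0 1], frameLen, 1);   % random data frame
        coded = convenc(msg, trellis);       % rate-1/2 encoding

        % burst error: pick a random start bit and force the next
        % burstLen coded bits to zero
        start = randi(length(coded) - burstLen + 1);
        coded(start:start+burstLen-1) = 0;

        decoded   = vitdec(coded, trellis, tblen, 'trunc', 'hard');
        totalErrs = totalErrs + biterr(msg, decoded);
    end
    fprintf('burst length %2d: BER = %.4g\n', ...
        burstLen, totalErrs/(frameLen*numFrames));
end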

What I actually want to test is whether using an interleaver improves
the convolutional decoder's performance against burst errors.
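To test that, I would wrap an interleaver around the channel, e.g.
(randintrlv/randdeintrlv and the seed value are just one possible
choice; any block or convolutional interleaver would do the same job).
Inside the per-frame loop of the sketch above, the channel section
becomes:

seed   = 4831;                        % arbitrary fixed seed
coded  = convenc(msg, trellis);
txBits = randintrlv(coded, seed);     % interleave coded bits

% same burst as before, hitting the interleaved stream
start = randi(length(txBits) - burstLen + 1);
txBits(start:start+burstLen-1) = 0;

rxBits  = randdeintrlv(txBits, seed); % de-interleave before decoding
decoded = vitdec(rxBits, trellis, tblen, 'trunc', 'hard');

The idea is that the burst hits the interleaved stream, so after
de-interleaving the errors arrive at the Viterbi decoder spread out
rather than clustered.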
Thank you,
I can send the MATLAB *.m file if required.
-Viraj