## How to calculate the autocorrelation of a 3-level line code?

Hi,

Does anyone know how, or indeed whether, it is possible to calculate the autocorrelation function of the MMS43 variant of the 4B3T line code (https://en.wikipedia.org/wiki/4B3T)? I can approximate the autocorrelation by implementing the encoder, generating a long sequence of coded data, and then averaging products of delay-shifted versions, but I would like a precise calculation rather than an approximation.
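For reference, the time-averaging estimate I mean can be sketched in a few lines of NumPy. The three-level sequence below is just a stand-in drawn uniformly from {-1, 0, +1}; in practice `x` would be the output of the MMS43/4B3T encoder driven by random bits:

```python
import numpy as np

def autocorr_estimate(x, max_lag):
    """Estimate Rxx(t) by time-averaging products of delay-shifted copies."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    return [float(np.mean(x[:n - t] * x[t:])) for t in range(max_lag + 1)]

# Placeholder three-level sequence; substitute real encoder output here.
rng = np.random.default_rng(0)
x = rng.choice([-1.0, 0.0, 1.0], size=200_000)
print(autocorr_estimate(x, 5))
```

For this memoryless stand-in the estimate settles near 2/3 at lag 0 and near 0 elsewhere; a real line code would show structure at nonzero lags.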

For example, the MLT-3 code (https://en.wikipedia.org/wiki/MLT-3_encoding) has just 4 states, so I can write out the 4x4 transition probability matrix Pi, the 4x4 output product matrix C associated with each transition, and the 4x1 steady-state probability vector p, and then the autocorrelation at each delay is simply

Rxx(t) = sum((Pi^t .* C) . p)
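As a concrete check, the MLT-3 calculation above can be sketched numerically. This assumes the usual MLT-3 convention: four states cycling through levels 0, +1, 0, -1, where an input 1 bit advances the state and a 0 bit holds it, with equiprobable bits:

```python
import numpy as np

x = np.array([0.0, 1.0, 0.0, -1.0])   # output level of each of the 4 states
# Transition matrix: probability 1/2 to hold, 1/2 to advance one state.
Pi = 0.5 * np.eye(4) + 0.5 * np.roll(np.eye(4), 1, axis=1)
p = np.full(4, 0.25)                  # steady-state probabilities (uniform)
C = np.outer(x, x)                    # output product matrix, C[i,j] = x_i*x_j

def Rxx(t):
    # Rxx(t) = sum_ij p_i * [Pi^t]_{ij} * x_i * x_j
    Pt = np.linalg.matrix_power(Pi, t)
    return float(np.sum((Pt * C) * p[:, None]))

print([Rxx(t) for t in range(5)])  # [0.5, 0.25, 0.0, -0.125, -0.125]
```

Under these assumptions the exact values come out as Rxx(0) = 1/2, Rxx(1) = 1/4, Rxx(2) = 0, Rxx(3) = -1/8, and a finite-sequence simulation converges to them.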

It seems I can't apply the same approach to the 4B3T code, because the 4B3T state machine transitions depend on both the inputs and the current state (Mealy model) rather than just the current state (Moore model), as is the case for MLT-3. So does anyone know whether it is possible to calculate the autocorrelation function for the 4B3T line code?

Thanks,

Usjes.

On the web page for 4B3T it says "Decoding is simpler, as the decoder does not need to keep track of the encoder state". That would imply you can build a 16 x 16 matrix.

I don't see why the delay-shifted calculation is an "approximation". That is the definition of autocorrelation.

For modeling the system, you could also create a 27x27 matrix so every possible ternary code is mapped to every possible error. Your probability matrix is then also 27x27, with a lot of entries that are close to zero. That takes into account the duplicate codes. So you could then reduce the sum by adding the duplicate codes together. It would be a "compression" to a 16x16 result. I bet you can create a compression matrix that is 27x16 that does the sums for you.
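The compression-matrix idea can be illustrated with a toy example. Note the mapping below from the 27 ternary triples to the 16 words is completely made up just to show the mechanics; it is NOT the real MMS43 table, and a proper state lumping would also weight the summed rows by their steady-state probabilities:

```python
import numpy as np

# Hypothetical stand-in for the MMS43 table: which 4-bit word (0..15)
# each ternary triple (0..26) encodes. Made up for illustration only.
rng = np.random.default_rng(2)
code_to_word = rng.integers(0, 16, size=27)

# 27x16 indicator "compression" matrix: M[c, w] = 1 iff code c encodes word w.
M = np.zeros((27, 16))
M[np.arange(27), code_to_word] = 1.0

# A 27x27 code-level probability matrix (random rows here, made stochastic)
# collapses to word level by summing duplicate-code entries together.
P27 = rng.random((27, 27))
P27 /= P27.sum(axis=1, keepdims=True)
P16 = M.T @ P27 @ M
print(P16.shape)  # (16, 16)
```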

Something to think about anyway. Thank you for mentioning this coding system. I've never heard of it before!

Mike

Yes, the definition of the autocorrelation is averaging over *all time*, and assuming the process is ergodic this equals averaging over an infinite number of realizations of the underlying random process. So for MLT-3 coding I can do the probability calculations outlined above and get an exact answer; if I then time-average necessarily *finite* sequences of MLT-3-coded data, the results approximate that exact answer, and I can see the approximation getting closer and closer to the exact result as I increase the duration of the sequences being averaged. This is why Tim Wescott (below) suggests simulating a really *LONG* sequence. For the 4B3T case I have only the approximation, and I would like to know if there is a way of calculating the exact result. I haven't looked at block coding in a very long time, so I can't remember whether there is a standard method for calculating the autocorrelation of a sequence of symbols coded with a block code.
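The convergence I'm describing is easy to see for MLT-3. This sketch (same assumed convention: levels cycling 0, +1, 0, -1, a 1 bit advancing the state) compares the finite-sequence estimate of Rxx(1) against the exact value of 0.25 from the matrix calculation, for increasing sequence lengths:

```python
import numpy as np

def mlt3_encode(bits):
    # Walk the 4-state cycle 0 -> +1 -> 0 -> -1: a 1 bit advances, a 0 holds.
    levels = np.array([0.0, 1.0, 0.0, -1.0])
    state = np.cumsum(bits) % 4        # state index after each input bit
    return levels[state]

rng = np.random.default_rng(1)
for n in (10**3, 10**4, 10**5, 10**6):
    x = mlt3_encode(rng.integers(0, 2, size=n))
    r1 = np.mean(x[:-1] * x[1:])       # finite-time estimate of Rxx(1)
    print(n, r1)                       # drifts toward the exact 0.25
```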

Real messages are finite. I guess I don't see the point of a theoretical description when you can do millions of autocorrelation examples per second with something like a GPU or FPGA. Not that I don't see the point of theory; I love General Relativity.

There must be some range over which the autocorrelation makes sense. Why would you combine a sequence from last week (or last year) with one from now? So the data rate has to play into this somehow. How does the theory deal with that?

I'm asking because I'm trying to understand the problem. I'm missing something here.

I think that it would be very tedious and not at all straightforward. But it would be tedious and not straightforward to try to turn my intuition into something that I can communicate in words.

Make a really LONG simulation and autocorrelate that...