
Soft-output Hamming decoder

Started by cooltafel · 3 years ago · 1 reply · latest reply 3 years ago · 227 views

Hello,

I am implementing a SISO Hamming decoder in Matlab for the (7,4) and (8,4) Hamming codes. The soft input comes from a soft demodulator, and the soft values are used to correct distorted codewords. The correction part of the decoder works like a charm; however, calculating the new soft values for the corrected codewords goes wrong: the new soft values do not match the corrected codeword bits. The bits themselves are correct for sure, since simulations have shown that the BER improves significantly.

I am using this paper as a reference: Soft Hamming decoder

In this paper, the soft outputs are calculated with the use of so-called error patterns. First the incoming LLRs are converted to probabilities, and these probabilities are then used to determine the new bit probabilities for the codeword bits. This procedure works sometimes, but most of the time it does not. The problem seems to arise from the magnitudes of the incoming LLRs: the probabilities are mostly so close to 1 that the complementary factors (1 - probability) round to 0, which introduces 0's in the product and results in 0 probabilities (a small demonstration follows the snippet below).

% Probabilities calculated with the LLR-values
probability = exp(abs(LLR))./(1+exp(abs(LLR)));

% Hard decoding of the LLR (no correction)
data = (sign(-LLR)+1)/2;
data_old = data;
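
To illustrate the saturation: in double precision, the probability above rounds to exactly 1 once |LLR| exceeds roughly 37, and from then on any (1 - probability) factor is exactly 0. A small example with a hypothetical input value:

% Hypothetical strongly received bit
LLR_big = 40;

p = exp(abs(LLR_big)) / (1 + exp(abs(LLR_big)));  % rounds to exactly 1
q = 1 - p;                                        % exactly 0
log(p / q)                                        % Inf -- the soft value is lost

% (for abs(LLR) > ~709, exp() additionally overflows to Inf and p becomes NaN)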

The relevant Matlab code snippet:

% Per-bit factors for each error pattern: abs(E_pattern - p) equals p where
% the pattern bit is 0 and (1 - p) where it is 1
P = abs(E_pattern - repmat(probability(i,:), size(E_pattern,1), 1));
% Row-wise product over the bits gives the probability of each pattern
P_prod = prod(P, 2).';
% Normalize the pattern probabilities
P_sum = sum(P_prod, 2);
P_norm = P_prod ./ P_sum;

for k = 1:PPM
    % Decoded bit k for each pattern: tentative bit flipped by the pattern bit
    Weight = mod(data_old(i,k) + E_pattern(:,k), 2);
    % New probability that codeword bit k equals 0
    P_zero(k) = sum(P_norm(Weight(:) == 0));
end

% New LLR values for the codeword bits
LLR(i,:) = log(P_zero./(1-P_zero));
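
For the vanishing products themselves, one option (a sketch, not a definitive fix) is to do the same computation in the log domain, so that factors near 0 contribute large negative logs instead of multiplying out to exactly 0. Same variable names as above; only the numerics change, not the algorithm:

% Log of the per-bit factors; clamp at realmin so log() never returns -Inf
P    = abs(E_pattern - repmat(probability(i,:), size(E_pattern,1), 1));
logP = log(max(P, realmin));

% Log of the row products, rescaled before exponentiating
logPr  = sum(logP, 2).';
logPr  = logPr - max(logPr);
P_norm = exp(logPr) ./ sum(exp(logPr));

for k = 1:PPM
    Weight = mod(data_old(i,k) + E_pattern(:,k), 2);
    P_zero(k) = sum(P_norm(Weight(:) == 0));
end

% Clamp so the output LLRs stay finite
P_zero   = min(max(P_zero, realmin), 1 - eps);
LLR(i,:) = log(P_zero ./ (1 - P_zero));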

So far I have tried normalizing and quantizing the incoming LLRs before calculating the probabilities, but this does not work, and I don't really know what else to do. There isn't much literature on what to do when the probabilities don't work out. Could somebody help me solve this problem? I need the soft values for subsequent calculations.

Thank you in advance!!!

#Matlab

Reply by Mannai_Murali, April 28, 2021

There are logical mistakes in your code. First you have to get the syndrome from the soft LLRs by taking hard decisions: from the resulting n-tuple, compute the syndrome consisting of n-k bits. For each syndrome there is an error pattern table; one such error pattern for one syndrome is given in the paper. For every j = 1 to n, form a tentative decision based on the LLR.
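
For illustration, a minimal sketch of that syndrome step for the (7,4) code, assuming the same sign convention as the code above (LLR > 0 means bit 0). H is one common systematic form of the parity-check matrix, and LLR_row is a hypothetical vector of input LLRs:

% Tentative hard decisions from the LLRs (LLR > 0 -> bit 0)
LLR_row   = [5.1 -3.2 0.8 4.4 -1.9 2.7 6.0];   % example soft inputs
tentative = (sign(-LLR_row) + 1) / 2;

% Parity-check matrix of the (7,4) Hamming code (one common form)
H = [1 1 0 1 1 0 0;
     1 0 1 1 0 1 0;
     0 1 1 1 0 0 1];

% Syndrome: the n-k = 3 bits that select the error pattern table
s = mod(H * tentative(:), 2).';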

Your probability vector is correct; it must be computed for each of the n bits.

The column-wise multiplication in your code is NOT correct. For the jth bit you have to multiply by the probability if the error pattern bit = 0 and by (1 - probability) if the error pattern bit = 1.

For the final soft output that the jth bit = 0, consider a row i of the error pattern table ONLY if (error pattern bit = 1 and tentative bit = 1) OR (error pattern bit = 0 and tentative bit = 0).

Also, the final LLR output for the jth bit is based on ALL rows of the error pattern table (for the calculated syndrome), NOT one row 'i' (see the sketch below).
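
Putting these points together, a minimal Matlab sketch of the combination step as described (all names are illustrative: E_pattern is the error pattern table selected by the computed syndrome, p is the 1-by-n vector of per-bit probabilities, tentative holds the tentative hard decisions from above):

% Probability of each error pattern row: multiply p(j) where the pattern
% bit is 0 and (1 - p(j)) where it is 1, across all n bits
m    = size(E_pattern, 1);
P0   = repmat(p, m, 1);
rowP = prod(E_pattern .* (1 - P0) + (1 - E_pattern) .* P0, 2);
rowP = rowP / sum(rowP);                  % normalize over ALL rows

n = size(E_pattern, 2);
LLR_out = zeros(1, n);
for j = 1:n
    % Decoded bit j for each row: tentative bit flipped by the pattern bit;
    % it equals 0 exactly when the pattern and tentative bits agree
    dec = mod(tentative(j) + E_pattern(:, j), 2);
    Pz  = sum(rowP(dec == 0));            % P(bit j = 0) over ALL rows
    LLR_out(j) = log(Pz / (1 - Pz));
end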

If you make a Skype call or send an email, I can explain.

Skype: Mannai_Murali

EMail: Mannai_Murali@hotmail.com