How does synchronization for frequency-selective fading channels differ from AWGN channels?
The reason I ask is that many algorithms that work for AWGN do not work for fading channels. How can we ensure an algorithm will work for fading channels without simulating it?
It can depend on the type of modulation and the type of fading, but generally a robust algorithm will work in either (maybe that's just the definition of a robust algorithm).
e.g., OFDM systems are generally engineered for multipath channels, so they tend to work well there and don't suffer in AWGN.
A single-carrier demodulator with good acquisition algorithms will generally find the strongest ray in a multipath channel, and then an equalizer can work on the rest of the energy. If the channel is dynamic and the strongest ray fades while another ray becomes dominant, this can create issues, but they can usually be handled.
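To make the "find the strongest ray" step concrete, here is a minimal numpy sketch: cross-correlating the received signal against a known preamble and picking the correlation peak. The preamble, channel taps, and noise level below are all illustrative assumptions, not from the original post.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical known preamble: length-64 random QPSK training sequence
P = 64
preamble = ((2 * rng.integers(0, 2, P) - 1)
            + 1j * (2 * rng.integers(0, 2, P) - 1)) / np.sqrt(2)

# Illustrative multipath channel: the ray at delay 3 is the strongest
h = np.array([0.4, 0.0, 0.0, 0.9, 0.0, 0.3])

rx = np.convolve(preamble, h)
rx += 0.05 * (rng.standard_normal(len(rx)) + 1j * rng.standard_normal(len(rx)))

# Cross-correlate against the known preamble (np.correlate conjugates the
# second argument); the peak lands on the strongest ray
xc = np.abs(np.correlate(rx, preamble, mode="full"))
lag = np.argmax(xc) - (P - 1)   # lag relative to perfect alignment
print("strongest-ray delay:", lag)   # expect 3, the delay of the 0.9 tap
```

Acquisition then declares timing at that lag, and an equalizer deals with the residual 0.4 and 0.3 taps.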
Basically, you need to know the expected channel impairments and design for what you need for the particular type of modulation being used. It can differ significantly from case to case. Do you have a particular case in mind?
I am working on an OFDM system. The cyclic-prefix autocorrelation based OFDM symbol boundary detection and integer frequency offset detection, which work for AWGN, do not work under frequency-selective fading when the delay spread is 70% of the cyclic prefix.