I would like to know the relevance of the peak-to-average power ratio, and what is the ideal (or preferred) value? #OFDM
The relevance is to budget headroom to saturation in the Power Amplifier. A higher PAPR means a higher backoff from peak output is required to maintain linearity in the PA, and therefore a reduced average power would be realized in the output signal. Reducing the average output power reduces the range, so that is often undesirable.
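As a back-of-envelope example of that headroom budget (the numbers are illustrative assumptions, not from any particular PA):

```python
# Back-of-envelope link between PAPR and usable average power.
# All numbers are illustrative assumptions, not from any particular PA.
p_sat_dbm = 43.0    # assumed PA saturation (peak) power, dBm
papr_db = 9.0       # assumed signal PAPR, dB
p_avg_dbm = p_sat_dbm - papr_db  # back off by the PAPR so peaks stay linear
print(p_avg_dbm)    # 34.0 dBm is all the average power that is left
```

Knock 3 dB off the PAPR and you get 3 dB of average power (i.e. double the power, hence range) back for free.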
For the same average power, power amplifiers that can handle higher peak powers cost more and are harder to get right. There's a big bump from an amplifier designed for constant power to one designed for any non-unity PAPR, and then things just get more expensive from there.
From the perspective of the PA, the preferred PAPR is \(1:1\).
On the other hand, there's some really nifty communications schemes (e.g. OFDM) that inherently have high PAPR, and whose performance, in the absence of considerations of the PA, just goes up with PAPR.
So from the ivory-tower comms mathematician perspective, the preferred PAPR is \(1:\infty\).
Somewhere in the middle of that are some very harried systems architects trying to find a solution with the best bang for the buck.
Here is a post on the statistics of PAPR using the Complementary Cumulative Distribution Function (CCDF). It's not as complicated as the name sounds.
I guess you mean LTE. It depends on the type of PA and any predistortion. We used to target a PAR of 6 dB-ish.
Starting with the definition, which I grabbed off Wikipedia because it's handy:
The peak-to-average power ratio (PAPR) is the peak amplitude squared (giving the peak power) divided by the RMS value squared (giving the average power).
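In code that definition is a one-liner; here's a sketch (numpy assumed), with the pure sine wave as a sanity check:

```python
import numpy as np

def papr_db(x):
    """PAPR of a real or complex waveform, in dB:
    peak instantaneous power over mean power."""
    p = np.abs(x) ** 2
    return 10 * np.log10(p.max() / p.mean())

# Sanity check: a pure sine wave has peak power twice its average
# power (peak amplitude 1 vs RMS 1/sqrt(2)), i.e. a PAPR of ~3.01 dB.
t = np.linspace(0, 1, 10000, endpoint=False)
print(round(papr_db(np.sin(2 * np.pi * 8 * t)), 2))  # 3.01
```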
What's important here in answering your question about relevance is the "peak amplitude". And, perhaps a bit about "average power" is in order as well.
"Average power" is going to be about how a system is being used or loaded. It can be anything within the realm of possibility. Some systems operate pretty much in steady state, so the average power is a constant; that need not be the case. The key here is the averaging time used in defining or measuring "average". If there is a steady state and the average power is low, then it hardly matters, it seems to me. Well, until you state the PAPR as a ratio. Then it matters, because the lower the average, the higher the ratio.
Peak power is an instantaneous measure or at least a measure over a much shorter period of time. The question here is how you might choose to define it:
- Power level at full saturation?
- Power level where the amplification at a single frequency is 3 dB below the amplification factor for small signals?
So you need to define both to suit your needs I should think.
It's largely a cost thing. Very high power amplifiers are very expensive, partly because of the expensive high-power parts, and partly because at the very highest power levels a great deal of (costly) R&D goes into them. So, you have to justify the additional cost of allowing that already challenging design to support anything approaching 0 dB of PAR, because such signals simply don't (usually) exist in the real world. 6 dB would probably not be possible in the very highest power models, even the very expensive ones. A 12 dB PAPR would not be unusual in such models, for good reason; music signals rarely breach this unless heavily compressed. Even a constant sine wave has a PAPR of 3 dB. You would need a continuous square wave to hit 0 dB.
Another good reason that amplifiers don't usually permit anything as low as 0 dB is: how, for example, do you produce 20 kW of power from a regular mains supply? Such a specification is not (usually) a lie; it's just that such power cannot be sustained continuously (since it is usually the result of running the amplifier from a large capacitor bank which gets topped up by the mains supply).
So I think the answer is probably 12 dB, because you don't usually need anything better, and if you do, you will pay more for it than you need to.
Usually an OFDM signal (LTE kind: 1280 carriers) has 12-13 dB of PAPR. This PAPR has to be reduced in order to optimize the operating range of the Power Amplifier.
In practice you have two tools that you use sequentially:
- Crest Factor Reduction (CFR) to reduce the PAPR at the expense of some EVM
- Digital Predistortion (DPD) to use part of the non-linear portion of the PA
Realistic hardware implementations (depending on the modulation: single carrier, multi-carrier, frequency hopping, ...) target 6-8 dB of PAPR. This is good enough to run a DPD algorithm on the reduced-PAPR signal.
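A minimal sketch of the CFR step, done here as plain envelope clipping to a target PAPR (a real CFR chain would add filtering afterwards to contain the spectral regrowth; the 256-carrier QPSK symbol is just an illustrative assumption):

```python
import numpy as np

def papr_db(x):
    p = np.abs(x) ** 2
    return 10 * np.log10(p.max() / p.mean())

def clip_to_papr(x, target_db):
    """Clip the envelope at the amplitude implied by the target PAPR.
    The clipping distorts the signal (the EVM cost mentioned above)."""
    thresh = np.sqrt(np.mean(np.abs(x) ** 2) * 10 ** (target_db / 10))
    mag = np.maximum(np.abs(x), 1e-12)      # avoid divide-by-zero
    return x * np.minimum(1.0, thresh / mag)

rng = np.random.default_rng(0)
# Toy OFDM symbol: 256 QPSK subcarriers, time domain via IFFT.
qpsk = (rng.choice([-1, 1], 256) + 1j * rng.choice([-1, 1], 256)) / np.sqrt(2)
x = np.fft.ifft(qpsk)
print(papr_db(x), papr_db(clip_to_papr(x, 6.0)))
```

The clipped signal lands just above the 6 dB target (clipping also shaves a little average power), which is why practical CFR iterates.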
As said, OFDM and other modulation schemes show an inherently high PAR, i.e. you have that naughty peak at times. This will be clipped by your cheap power amplifier, resulting in an error burst. A scrambler will spread the errored bits over some symbols, and the Reed-Solomon correction will give you a clean, error-free bitstream.
That is the way DSL things work.
But what is the reason for the high PAR of OFDM? How can one do better? No idea. Are there any literature pointers?
I agree with Tim on his best ever practical reply:
"From the perspective of the PA, the preferred PAPR is 1:1
So from the ivory-tower comms mathematician perspective, the preferred PAPR is 1:∞"
Apologies Tim, It is all down to that beer
An OFDM signal, being the sum of sine waves at different phases and amplitudes, has an inherently high PAR. Reduction is possible using in-house techniques that gently scale peaks down more than the rest of the signal until a target PAR is reached. This is followed by predistortion. A target PAR of, say, 6 dB (down from the original 10 dB or so) can be achieved prior to predistortion, while keeping an eye on the spectral sides (and SNR) so they don't get too bad. But a PAR of 1:1, or negative?? That's just fun.
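A toy numerical illustration of the sum-of-sinusoids point (the carrier counts are arbitrary, nothing standard-specific): with all phases aligned, the worst-case PAPR of N equal-amplitude carriers is N itself, and even with random phases the typical PAPR climbs as N grows.

```python
import numpy as np

def papr_db(x):
    p = np.abs(x) ** 2
    return 10 * np.log10(p.max() / p.mean())

rng = np.random.default_rng(1)
results = {}
for n in (4, 64, 1024):
    # n unit-amplitude carriers with random phases -> OFDM-like symbol
    sym = np.exp(1j * rng.uniform(0, 2 * np.pi, n))
    results[n] = papr_db(np.fft.ifft(sym))
    print(n, round(results[n], 1))

# Worst case: all phases aligned -> the IFFT collapses to a single
# spike, and the PAPR equals the carrier count: 10*log10(1024) dB.
print(round(papr_db(np.fft.ifft(np.ones(1024))), 1))  # 30.1
```

Which is the central-limit effect in action: many independent carriers sum to a near-Gaussian waveform whose occasional peaks sit far above its RMS level.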