DSPRelated.com
Forums

How/Why a digital signal is inherently more immune to noise?

Started by a.s. May 20, 2015
The primary advantage of digital transmission over analog transmission is
Noise Immunity. A digital signal (in comparison to an analog signal) is
inherently more immune to noise. Why the word "inherent"? 

Please make it more clear to me.



---------------------------------------
Posted through http://www.DSPRelated.com
"a.s." <104988@DSPRelated> writes:

> The primary advantage of digital transmission over analog transmission is
> Noise Immunity. A digital signal (in comparison to an analog signal) is
> inherently more immune to noise. Why the word "inherent"?
>
> Please make it more clear to me.
Hi,

I am not certain your presumption is correct, in theory.

What we have with the current technology are myriad digital error
detection/correction schemes (turbo codes, LDPC codes, linear block
codes, etc.), but few or no analog error detection/correction schemes.
So with the prevailing technology digital is superior.

However, I don't think there is anything inherent in information theory
that excludes the possibility of analog schemes.
--
Randy Yates
Digital Signal Labs
http://www.digitalsignallabs.com
On 5/20/2015 4:57 PM, a.s. wrote:
> The primary advantage of digital transmission over analog transmission is
> Noise Immunity. A digital signal (in comparison to an analog signal) is
> inherently more immune to noise. Why the word "inherent"?
>
> Please make it more clear to me.
An analog signal conveyed over an analog channel will be combined with
noise in the channel, so at the receiving end the noise level will have
increased. There is a certain amount of noise introduced by conversion
from analog to digital and back, but otherwise a digital signal can be
transmitted over a channel and then separated from the channel noise. In
the end, the noise level in the digital signal is not increased by
transmission.

Separating the digital signal from the noise is done by providing
spacing between the digital levels so that the noise does not cause
changes in the digital signal. This is never 100% effective with truly
random noise, since the noise level does not have a maximum value.
However, noise levels high enough to corrupt the digital signal can be
made arbitrarily infrequent, to meet any specification, by adjusting the
digital levels (transmitted channel power) and the bit rate. Slower bit
rates allow longer averaging periods and lower bit error rates for a
given power level.
--
Rick
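Rick's point can be sketched numerically. A toy Python sketch follows; the
antipodal levels and the unit-variance Gaussian noise model are illustrative
assumptions, not anything specified in the thread:

```python
import math

def q_function(x):
    """Tail probability of the standard normal: P(N(0,1) > x)."""
    return 0.5 * math.erfc(x / math.sqrt(2))

def bit_error_rate(level_spacing, noise_sigma):
    """BER for two digital levels separated by level_spacing in
    additive Gaussian noise: a bit is misread only when the noise
    exceeds half the spacing between the levels."""
    return q_function((level_spacing / 2) / noise_sigma)

# Widening the level spacing (i.e. raising transmit power) makes
# errors arbitrarily infrequent -- but never exactly zero, because
# Gaussian noise has no maximum value, just as the post says.
for spacing in (2.0, 4.0, 8.0):
    print(spacing, bit_error_rate(spacing, noise_sigma=1.0))
```

Doubling the spacing drops the BER from roughly 1.6e-1 to 2.3e-2 to 3.2e-5
here, which is the "arbitrarily infrequent, but never 100%" behavior above.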
On 2015-05-20 22:57, a.s. wrote:
> The primary advantage of digital transmission over analog transmission is
> Noise Immunity. A digital signal (in comparison to an analog signal) is
> inherently more immune to noise. Why the word "inherent"?
>
> Please make it more clear to me.
I think the point is that you can detect a bit *exactly*, if the noise
level is below the detection threshold (think of TTL or CMOS, for
example). In the analog case, noise affects the signal in a continuous
way; in the digital case, either it does not affect the data at all or
it changes the data outright. This means digital is inherently more
immune to noise, since (up to a point) digital is not affected by noise
at all.

Anyway, maybe I misunderstood the question...

bye,
--
piergiorgio
Randy Yates <yates@digitalsignallabs.com> writes:

> "a.s." <104988@DSPRelated> writes: > >> The primary advantage of digital transmission over analog transmission is >> Noise Immunity. A digital signal (in comparison to an analog signal) is >> inherently more immune to noise. Why the word "inherent"? >> >> Please make it more clear to me. > > Hi, > > I am not certain your presumption is correct, in theory. > > What we have with the current technology are myriad digital error > detection/correction schemes (turbo codes, LDPC codes, linear block > codes, etc.), but few or no analog error detection/correction schemes. > So with the prevailing technology digital is superior. > > However I don't think there is anything inherent in information theory > that excludes the possibility of analog schemes.
http://arxiv.org/pdf/1105.1561.pdf
--
Randy Yates
Digital Signal Labs
http://www.digitalsignallabs.com
Randy Yates <yates@digitalsignallabs.com> wrote:
> "a.s." <104988@DSPRelated> writes:
>> The primary advantage of digital transmission over analog transmission is
>> Noise Immunity. A digital signal (in comparison to an analog signal) is
>> inherently more immune to noise. Why the word "inherent"?
(snip)
> I am not certain your presumption is correct, in theory.
> What we have with the current technology are myriad digital error
> detection/correction schemes (turbo codes, LDPC codes, linear block
> codes, etc.), but few or no analog error detection/correction schemes.
> So with the prevailing technology digital is superior.
> However I don't think there is anything inherent in information theory
> that excludes the possibility of analog schemes.
Maybe not information theory, but physics. You can't get away from
thermal noise above absolute zero. In a large system, or a long cable,
thermal noise adds up along the way. You can reduce its effect by
increasing the amplitude of the signal, but you can't get rid of it.

With digital, you can regenerate the signal every so often, before the
noise gets too big.
--
glen
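The regeneration argument can be sketched in a few lines. The hop count and
per-hop noise figure below are illustrative assumptions; the point is only
the contrast between amplify-and-forward and regenerate-and-forward:

```python
import random

random.seed(1)

HOPS = 20
NOISE_SIGMA = 0.05  # per-hop additive noise (illustrative)

def analog_chain(sample):
    """Amplify-and-forward: each hop's noise rides along to the end,
    so the error grows roughly like sigma * sqrt(HOPS)."""
    for _ in range(HOPS):
        sample += random.gauss(0.0, NOISE_SIGMA)
    return sample

def digital_chain(bit):
    """Regenerate-and-forward: each repeater re-decides the bit, so
    the accumulated noise is discarded at every hop."""
    level = 1.0 if bit else -1.0
    for _ in range(HOPS):
        level += random.gauss(0.0, NOISE_SIGMA)
        level = 1.0 if level > 0.0 else -1.0  # regeneration
    return 1 if level > 0.0 else 0

print(abs(analog_chain(0.5) - 0.5))  # nonzero drift after 20 hops
print(digital_chain(1))              # the bit arrives intact
```

The analog value arrives perturbed by the sum of twenty noise samples,
while the regenerated bit is reset to a clean level at every repeater,
which is exactly glen's "regenerate before the noise gets too big".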
On 5/20/2015 7:22 PM, glen herrmannsfeldt wrote:
> Randy Yates <yates@digitalsignallabs.com> wrote:
>> "a.s." <104988@DSPRelated> writes:
>
>>> The primary advantage of digital transmission over analog transmission is
>>> Noise Immunity. A digital signal (in comparison to an analog signal) is
>>> inherently more immune to noise. Why the word "inherent"?
>
> (snip)
>> I am not certain your presumption is correct, in theory.
>
>> What we have with the current technology are myriad digital error
>> detection/correction schemes (turbo codes, LDPC codes, linear block
>> codes, etc.), but few or no analog error detection/correction schemes.
>> So with the prevailing technology digital is superior.
>
>> However I don't think there is anything inherent in information theory
>> that excludes the possibility of analog schemes.
>
> Maybe not information theory, but physics. You can't get away from
> thermal noise above absolute zero. In a large system, or long cable,
> thermal noise adds up along the way. You can reduce the effect
> by increasing the amplitude of the signal, but you can't get
> rid of it.
>
> With digital, you can regenerate the signal every so often,
> before the noise gets too big.
Isn't the nature of noise such that you will always have some
probability of corruption of the digital symbols? In the end there is a
tradeoff between bandwidth, resolution (noise level) and power level,
regardless of whether the signal is analog or digital?

I think the real difference is just that, with the technology we have,
it is easier to achieve very low error rates with digital techniques,
while analog can accomplish the same result, but with more circuitry and
likely more power.
--
Rick
In fact wideband FM is a form of analog noise rejection.
Mark
rickman wrote:
> On 5/20/2015 7:22 PM, glen herrmannsfeldt wrote:
>> Randy Yates <yates@digitalsignallabs.com> wrote:
>>> "a.s." <104988@DSPRelated> writes:
>>
>>>> The primary advantage of digital transmission over analog
>>>> transmission is Noise Immunity. A digital signal (in comparison
>>>> to an analog signal) is inherently more immune to noise. Why the
>>>> word "inherent"?
>>
>> (snip)
>>> I am not certain your presumption is correct, in theory.
>>
>>> What we have with the current technology are myriad digital error
>>> detection/correction schemes (turbo codes, LDPC codes, linear block
>>> codes, etc.), but few or no analog error detection/correction schemes.
>>> So with the prevailing technology digital is superior.
>>
>>> However I don't think there is anything inherent in information theory
>>> that excludes the possibility of analog schemes.
>>
>> Maybe not information theory, but physics. You can't get away from
>> thermal noise above absolute zero. In a large system, or long cable,
>> thermal noise adds up along the way. You can reduce the effect
>> by increasing the amplitude of the signal, but you can't get
>> rid of it.
>>
>> With digital, you can regenerate the signal every so often,
>> before the noise gets too big.
>
> Isn't the nature of noise such that you will always have some
> probability of corruption of the digital symbols?
Yes. But for something like Ethernet, a BER of 10^-9 is pretty noisy.
Some channels just don't have a lot of noise.
> In the end there is a tradeoff between bandwidth, resolution (noise
> level) and power level regardless of whether it is analog or digital?
There is. It's described by the Shannon channel capacity calculations. What's different is that digital offers error correcting codes and other things to tune the tradeoffs.
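The Shannon channel capacity mentioned here is the Shannon-Hartley limit,
C = B log2(1 + S/N). A one-function sketch, with illustrative numbers:

```python
import math

def channel_capacity(bandwidth_hz, snr_linear):
    """Shannon-Hartley limit: C = B * log2(1 + S/N) bits per second.
    It bounds analog and digital signaling alike; modulation and
    error-correcting codes are ways of approaching it, not beating it."""
    return bandwidth_hz * math.log2(1 + snr_linear)

# e.g. a 3 kHz telephone channel at 30 dB SNR (linear S/N = 1000)
print(channel_capacity(3000, 1000))  # ~29.9 kbit/s
```

That ~29.9 kbit/s figure for a 3 kHz, 30 dB channel is why dial-up modems
topped out near that rate: the tradeoff is the same for analog and digital,
but digital coding gives knobs for tuning where you sit on it.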
> I think the real difference is just that with the technology we have it
> is easier to achieve very low error rates with digital techniques while
> analog can accomplish the same result, but with more circuitry and
> likely more power.
There is the concept of "coding gain".
--
Les Cargill
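The simplest possible illustration of what an error-correcting code buys is
a rate-1/3 repetition code with majority-vote decoding. This toy code is
assumed purely for illustration (real codes like the turbo and LDPC codes
mentioned upthread do far better, and a repetition code actually loses
Eb/N0 once its 3x rate cost is counted), but it shows the BER mechanism:

```python
def repetition_coded_ber(p):
    """Decoded BER after sending each bit 3 times over a channel with
    raw BER p and taking a majority vote: the decoded bit is wrong
    only if 2 or 3 of the copies are flipped."""
    return 3 * p**2 * (1 - p) + p**3

raw = 1e-3
print(repetition_coded_ber(raw))  # ~3e-6, far below the raw BER
```

Errors now require two independent flips, so the decoded BER scales as
p^2 instead of p; that reduction, expressed in dB of equivalent SNR, is
what "coding gain" measures.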
On Wed, 20 May 2015 18:47:56 -0700, makolber wrote:

> In fact wideband FM is a form of analog noise rejection.
> Mark
+1. And it doesn't need an excess of circuitry compared to digital (you
can build an FM receiver with fewer active elements than for any
FEC-corrected digital stream, I think).

It's beastly hard to analyze mathematically, though, and I'm pretty sure
it can't be done symbolically -- you can either get exact answers that
are upper & lower bounds on performance, or you can get approximate
answers or simulation results, but you can't get both.
--
www.wescottdesign.com