
259 points by zdw | 1 comment
dmitrygr No.41832956
This is 100% nonsense. Phase noise exists too, not just amplitude noise.

The answer is actually rather simple. AM stations are limited to 10 kHz of bandwidth; FM gets 200 kHz. More bandwidth allows representing a higher-fidelity signal…

1. kragen No.41833414
It's not 100% nonsense, though it's true that phase noise does exist. FM radio can transmit silence (the carrier stays at full power even when the audio is quiet), which gives it a better dynamic range, which is important for music. If your AM radio signal is 10 dB stronger than the radio noise in the band, you'll get noise in the demodulated signal only 10 dB quieter than the signal. Due to the so-called "capture effect" https://en.wikipedia.org/wiki/Capture_effect the effect on an FM-demodulated radio signal is potentially much less—though it's true that, with narrowband FM, it won't be.
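The AM half of that claim is easy to check numerically: with an ideal envelope detector, additive noise passes straight through, so each dB of carrier-to-noise ratio buys about one dB of demodulated audio SNR. A minimal complex-baseband sketch (the 30% modulation depth, the 1 kHz tone, and the noise levels are arbitrary illustration values, not from the thread):

```python
import numpy as np

rng = np.random.default_rng(1)
fs = 200_000                                  # sample rate, Hz
t = np.arange(0, 0.1, 1 / fs)
audio = 0.3 * np.sin(2 * np.pi * 1000 * t)    # 1 kHz tone, 30% modulation

def am_output_snr(cnr_db):
    """Demodulated SNR of an ideal AM envelope detector at a given carrier-to-noise ratio."""
    tx = 1.0 + audio                          # AM signal at complex baseband
    noise_rms = 10 ** (-cnr_db / 20)
    noise = noise_rms * (rng.standard_normal(t.size)
                         + 1j * rng.standard_normal(t.size)) / np.sqrt(2)
    demod = np.abs(tx + noise)                # ideal envelope detector
    err = demod - demod.mean() - audio        # remove DC, compare to the tone
    return 10 * np.log10(np.mean(audio ** 2) / np.mean(err ** 2))

# Each extra dB of carrier-to-noise ratio buys about one dB of audio SNR:
print(am_output_snr(10), am_output_snr(20))
```

Running it shows the two output SNRs differ by roughly the same 10 dB as the input carrier-to-noise ratios: AM demodulation is linear in the noise, which is exactly the passthrough described above.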

That's why commercial FM broadcasting uses a ±75 kHz deviation even though it was originally only transmitting audio of ≤20 kHz. Adding all this extra bandwidth to an AM station wouldn't actually help, because beyond ±20 kHz, you're only improving your radio station's ability to reproduce ultrasound. But it does help FM; it greatly reduces the amplitude of demodulated noise, because, even without a PLL, the frequency deviation caused by additive white noise increases much more slowly with bandwidth than the frequency deviation you can use for your signal. With a PLL, I think the frequency deviation caused by additive white noise basically doesn't increase at all with bandwidth. (I guess I should simulate this; it should be pretty easy.)
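That simulation is indeed easy to sketch with a quadrature discriminator (instantaneous frequency from one-sample phase increments — a standard technique, not something from the thread; the sample rate, test tone, and noise level below are illustrative assumptions, and there's no post-demodulation lowpass, which penalizes both cases equally). With the noise held fixed, widening the deviation from ±5 kHz to ±75 kHz should buy about 20·log10(75/5) ≈ 23.5 dB of output SNR, because the noise-induced frequency jitter stays put while the signal's frequency swing grows 15×:

```python
import numpy as np

rng = np.random.default_rng(0)
fs = 1_000_000                                # sample rate, Hz
t = np.arange(0, 0.05, 1 / fs)
audio = np.sin(2 * np.pi * 1000 * t)          # 1 kHz test tone

def fm_output_snr(deviation_hz, noise_rms=0.05):
    """Demodulated SNR of an FM discriminator at a given peak deviation."""
    # FM modulate at complex baseband: phase is the integral of frequency
    phase = 2 * np.pi * deviation_hz * np.cumsum(audio) / fs
    tx = np.exp(1j * phase)
    noise = noise_rms * (rng.standard_normal(t.size)
                         + 1j * rng.standard_normal(t.size)) / np.sqrt(2)
    rx = tx + noise
    # quadrature discriminator: instantaneous frequency from phase increments
    inst_freq = np.angle(rx[1:] * np.conj(rx[:-1])) * fs / (2 * np.pi)
    demod = inst_freq / deviation_hz          # rescale back to the audio range
    err = demod - audio[1:]
    return 10 * np.log10(np.mean(audio[1:] ** 2) / np.mean(err ** 2))

# Same noise, same audio: only the deviation changes.
print(fm_output_snr(5_000), fm_output_snr(75_000))
```

A PLL-based demodulator behaves somewhat differently near the FM threshold, but well above threshold (as here, with the carrier far stronger than the noise) even this simple discriminator shows the wideband-deviation gain the comment describes.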

Unfortunately neither Cook's article nor the flashlight analogy explains any of this.