If one were to measure RF carrier-to-noise ratio (CNR) and baseband video signal-to-noise ratio (SNR) for the same analog TV channel, how would the numbers compare? This definitely falls into the "it depends" category.

Much has been written over the years about CNR vs. baseband video SNR, but the definitive reference is the 1974 National Cable Television Association (now National Cable & Telecommunications Association) paper by T.M. Straus, "The Relationship between the NCTA, EIA, and CCIR Definitions of Signal-to-Noise Ratio." This paper also appeared in the IEEE Transactions on Broadcasting, Vol. BC-20, No. 3, September 1974. NCTA Recommended Practices for Measurements on Cable Television Systems includes an appendix titled "The Relationship of Signal-to-Noise in a VHF Cable System to the Signal-to-Noise in the Video System." Both the Straus paper and NCTA appendix thoroughly cover the gnarly math behind the relationship between CNR and baseband video SNR, so I’ll spare you and CT‘s editors the grief of trying to wade through the calculations.

Definitions

NCTA Recommended Practices for Measurements on Cable Television Systems defines visual CNR as "the power in a sinusoidal signal, whose peak is equal to the peak of a visual carrier during the transmission of synchronizing pulses, divided by the associated system noise power in a four megahertz bandwidth. This ratio is expressed in dB." Modern Cable Television Technology, 2nd Ed., states, "Carrier-to-noise ratio (C/N) is defined as follows: C/N(dB) ≡ 10log(c/n), where c and n are scalar power levels of the carrier and noise, respectively."

Translation: CNR is a predetection measurement – that is, one made at RF. It’s the difference, in decibels, between the amplitude – specifically peak envelope power – of an analog TV channel’s visual carrier and the root mean square (RMS) amplitude of system noise in a specified noise power bandwidth. The latter is 4 MHz for NTSC analog TV channels and also is known as modulation bandwidth. It’s called modulation bandwidth because it is roughly equal to the bandwidth of the baseband video that modulates the TV channel’s visual carrier.
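The C/N(dB) ≡ 10log(c/n) definition quoted above can be sketched in a few lines of Python. This is only an illustration; the power values in the example are made up, not taken from the article:

```python
import math

def cnr_db(carrier_power_watts, noise_power_watts):
    """CNR in dB: carrier peak envelope power divided by noise power,
    the noise measured in the 4 MHz modulation bandwidth for NTSC."""
    return 10 * math.log10(carrier_power_watts / noise_power_watts)

# Hypothetical example: 1 mW carrier, 63.1 nW of noise in 4 MHz
print(round(cnr_db(1e-3, 63.1e-9), 1))  # ~42.0 dB
```

Note that both quantities must be scalar powers in the same units; the decibel conversion happens only at the end.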

In contrast to CNR, baseband video SNR is a premodulation (at the transmitter or modulator) or postdetection (at the receiver) measurement – that is, one made at baseband. These days, baseband video SNR = 20log(L_NOMINAL/N_RMS), where L_NOMINAL is the nominal luminance level, which has a value of 714 millivolts peak-to-peak (100 IRE units) for NTSC. N_RMS is the RMS noise voltage, typically weighted by a specified curve. In other words, baseband video SNR is the ratio of the peak-to-peak video signal, excluding sync, to the RMS noise within that signal. The noise is measured in a bandwidth defined by a combination of low-pass, high-pass and weighting filters. More on this in a moment.
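The baseband SNR formula works on voltages rather than powers, hence the factor of 20 instead of 10. A minimal sketch, with the 714 mV NTSC luminance level baked in as a default (the noise voltage in the example is hypothetical):

```python
import math

def baseband_snr_db(noise_rms_volts, l_nominal_volts=0.714):
    """Baseband video SNR in dB: nominal luminance level (714 mV for
    NTSC, sync excluded) over the weighted RMS noise voltage."""
    return 20 * math.log10(l_nominal_volts / noise_rms_volts)

# Hypothetical example: 0.714 mV of weighted RMS noise -> 60 dB SNR
print(round(baseband_snr_db(0.714e-3), 1))
```

Because the ratio is voltage-based, every factor of 10 in noise voltage moves the result by 20 dB.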

The Straus paper details five ways to characterize "SNR": NCTA; Television Allocation Study Organization; Electronic Industries Association (now called the Electronic Industries Alliance); Comité Consultatif International des Radiocommunications (International Radio Consultative Committee); and Bell Telephone Laboratories.

NCTA’s SNR as discussed in the Straus paper is what we call CNR. The TASO definition for SNR is CNR, too, and is nearly the same as the NCTA’s except that the noise power bandwidth is 6 MHz rather than 4 MHz. EIA and CCIR definitions are baseband video SNR, with some notable differences between the two. EIA’s includes sync in the peak-to-peak video (1 volt or 140 IRE units) and specifies a noise weighting network. CCIR excludes sync in the peak-to-peak video (714 millivolts or 100 IRE units) and specifies its own weighting network. Finally, the BTL definition is a hybrid of sorts, in that it includes sync like the EIA definition does, but uses the CCIR weighting network.

Weighting network

What the heck is a weighting network, you ask?

Recall my earlier statement that the noise is measured in a bandwidth defined by a combination of low-pass, high-pass and weighting filters. The low- and high-pass filters establish the noise measurement bandwidth, cutting off low frequencies – typically those below about 10 kHz – and setting an upper frequency limit somewhere around 4 to 4.2 MHz. The weighting filter can be thought of as a bandpass filter that has a sloped frequency response that simulates the human eye’s response to noise in the TV picture. After all, why measure noise that one cannot see?

When measuring baseband video SNR, one first must ensure that the transmitter or modulator is properly set up with a 1 volt peak-to-peak input video signal – generally a reference test signal of some sort – and the visual carrier depth of modulation adjusted to 87.5 percent. The visual carrier is then demodulated at a remote test point using a precision demodulator. The demodulator’s video output is connected to specialized video test equipment and the demodulator output tweaked to ensure that it’s 1 volt peak-to-peak, with sync comprising 40 IRE units and luminance 100 IRE units. In the past, a typical test equipment configuration included outboard filters, a video noise measurement set and a waveform monitor – and the measurement was done manually. Automated test equipment has been available for several years that has selectable filtering built in, can make a variety of standards-based video measurements, and does everything pretty much at the push of a button.

Which numbers?

All right, let’s assume that we’ve done all of this for a TV channel under test. The CNR has been measured, and we’ve also measured baseband video SNR. This brings me back to the question posed earlier: How would the numbers compare? Here’s a summary from the Straus paper:

SNR(TASO) = CNR(NCTA) – 1.8 dB
SNR(EIA) = CNR(NCTA) + 0.1 dB
SNR(CCIR) = CNR(NCTA) – 0.2 dB
SNR(BTL) = CNR(NCTA) + 2.7 dB
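The four relationships from the Straus paper are simple dB offsets from CNR(NCTA), so they can be captured in a small lookup table. A sketch in Python; the 46 dB input in the example is an arbitrary illustrative value, not from the article:

```python
# dB offsets relative to CNR(NCTA), per the Straus paper summary
OFFSETS_DB = {
    "TASO": -1.8,
    "EIA": +0.1,
    "CCIR": -0.2,
    "BTL": +2.7,
}

def snr_from_cnr(cnr_ncta_db, definition):
    """Convert a measured CNR(NCTA) in dB to the named SNR definition."""
    return cnr_ncta_db + OFFSETS_DB[definition]

# Hypothetical channel measured at 46 dB CNR(NCTA)
for name in OFFSETS_DB:
    print(f"SNR({name}) = {snr_from_cnr(46.0, name):.1f} dB")
```

The CCIR conversion, the one the article ultimately settles on, differs from CNR(NCTA) by only 0.2 dB, which is why the two measurements come out essentially identical in practice.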

Hmmm, which one is applicable? Well, the SNR(TASO) definition actually is a CNR measurement and hasn’t been used by the cable industry in decades. I’m not aware of anyone using the SNR(BTL) method, either. Besides, it includes sync in the definition of peak-to-peak video, which doesn’t jibe with the modern SNR formula that has LNOMINAL equal to 714 mV. As well, the SNR(EIA) definition includes sync. (My recollection is that a later EIA definition of SNR was changed to exclude sync, but I could be wrong about this.) What’s left is SNR(CCIR), which, by the way, is the same as baseband video SNR measurements described in NTC-7, another video measurement standard.

So, if we measure CNR on an analog NTSC TV channel and then measure weighted baseband video SNR for that same channel, the numbers should be within 0.2 dB of each other. Does this happen in the real world? You bet. Several years ago, I confirmed this in field measurements. The video source was a Tektronix 1910 NTSC digital test signal generator, the precision demodulator a Tektronix 1450-1, and various Tek video test equipment was used for the baseband SNR measurement of the demodulated test channel. A Hewlett-Packard spectrum analyzer was used for the CNR measurement. The results? The weighted baseband video SNR and the RF CNR were essentially identical.

Key assumptions in the numerical relationship between CNR and baseband SNR are that baseband video SNR measurements must be weighted measurements, and CNR measurements are in accordance with procedures detailed in NCTA Recommended Practices for Measurements on Cable Television Systems. As well, the mathematical relationships discussed here apply only to NTSC television signals. One other important point: The video test signal must be of high quality, with very high baseband SNR. In the case of the Tektronix 1910, measured baseband SNR right out of the signal generator was >70 dB.

Some of you may remember that it used to be said there was a 4 dB difference between CNR and baseband video SNR. Where did that number come from? It goes back to the SNR(BTL) definition, but is based upon an unweighted baseband video SNR measurement rather than a weighted measurement. Here’s the relationship: SNR(BTL, unweighted) = CNR(NCTA) – 4 dB.
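The old 4 dB rule of thumb reduces to a one-line conversion. A sketch, with an arbitrary 46 dB example input:

```python
def snr_btl_unweighted_db(cnr_ncta_db):
    """Unweighted SNR(BTL) per the Straus paper: 4 dB below CNR(NCTA)."""
    return cnr_ncta_db - 4.0

print(snr_btl_unweighted_db(46.0))  # 42.0
```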

Ron Hranac is a technical leader, Broadband Network Engineering, for Cisco Systems and senior technology editor for Communications Technology. Reach him at rhranac@aol.com.
