(Editor’s note: Last month, Ron began the “digital proofs” conversation and this month, he finishes.)
5. CTB: Not worse than -53 dBc referenced to inband carrier levels for analog channels.
6. CSO: Not worse than -53 dBc referenced to inband carrier levels for analog channels.
Measuring composite triple beat (CTB) and composite second order (CSO) in a QAM channel can be done in one of two ways. The first, which is service-disruptive, requires that someone turn off the QAM channel under test back at the headend while a technician in the field measures CTB and CSO beat cluster levels in the now-empty channel slot, the same way as is done to comply with §76.605(a)(8)(i) and (ii) of the FCC rules for CTB/CSO measurements in analog-TV channels. Note that SCTE-40’s spec for CTB/CSO is -53 dBc, compared with the -51 dBc in the FCC rules.
A second method is a non-disruptive measurement using test equipment that can display the noise floor beneath an active QAM signal. At least two test equipment manufacturers’ products support the second method. In either case, SCTE-40’s CTB/CSO ratio measurements are relative to analog-TV channel levels.
In a network with a lot of analog-TV channels, CTB beat clusters will appear on visual carrier frequencies, and CSO will appear ±0.75 MHz and ±1.25 MHz relative to visual carrier frequencies. As the number of QAM signals increases, CTB and CSO take on a noise-like appearance called composite intermodulation noise (CIN). Thermal noise and CIN combine to create composite noise, and the only way to differentiate the two is to turn off all downstream signals to see the remaining thermal noise. When the downstream payload is turned back on, the noise floor likely will go up. The additional “noise” is the CIN.
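The arithmetic behind that carriers-off comparison is simple power subtraction in the linear domain. Here’s a quick Python sketch (the function name and example levels are mine, purely for illustration):

```python
import math

def cin_level(composite_dbmv: float, thermal_dbmv: float) -> float:
    """Estimate CIN by power-subtracting thermal noise from composite noise.

    Both inputs are noise levels in dBmV measured in the same bandwidth;
    the result is the CIN level in dBmV.
    """
    composite_mw = 10 ** (composite_dbmv / 10)  # convert to linear power
    thermal_mw = 10 ** (thermal_dbmv / 10)
    if composite_mw <= thermal_mw:
        raise ValueError("composite noise must exceed thermal noise")
    return 10 * math.log10(composite_mw - thermal_mw)

# Example: composite noise -40 dBmV with the downstream payload on,
# -43 dBmV thermal with it off -> CIN works out to about -43 dBmV
# (a 3 dB difference means the two contributions are roughly equal).
print(round(cin_level(-40.0, -43.0), 1))  # → -43.0
```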
7. Carrier-to-any other discrete interference (ingress): Not worse than -53 dBc
One can measure this SCTE-40 parameter using either of the two previously discussed methods for CTB and CSO. What’s unclear is the reference for the -53 dBc spec, although my assumption is that it’s supposed to be the same reference that’s used for CTB/CSO in items #5 and #6.
8. AM Hum Modulation: Not greater than 3-percent p-p
While AM hum modulation can be measured using a continuous wave (CW) carrier, some QAM analyzers support automatic measurement of hum in an active QAM signal.
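For reference, percent hum is the peak-to-peak variation of the carrier envelope expressed as a percentage of the signal level. The sketch below uses the common (max − min)/max convention; check your instrument’s documentation, since the exact reference level can vary:

```python
def hum_percent(envelope: list[float]) -> float:
    """Percent p-p hum: peak-to-peak envelope variation relative to the
    peak envelope level (one common convention; hedged, not definitive).

    `envelope` holds samples of the detected carrier envelope (linear volts).
    """
    v_max, v_min = max(envelope), min(envelope)
    return 100.0 * (v_max - v_min) / v_max

# A carrier whose envelope swings between 1.00 and 0.97 has 3 percent
# p-p hum -- right at the SCTE-40 limit.
print(round(hum_percent([1.00, 0.985, 0.97, 0.985]), 2))  # → 3.0
```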
9. Group Delay Variation: ≤0.25 µsec/MHz across the 6-MHz channel
A QAM signal’s in-channel group delay is most easily measured using a QAM analyzer that supports this function. The group delay is derived from the test equipment’s QAM receiver adaptive equalizer coefficients. The SCTE-40 spec for this parameter, ≤0.25 microsecond, is the same as ≤250 nanoseconds.
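For the curious, here’s roughly how an instrument can get from equalizer coefficients to group delay: compute the tap vector’s frequency response, unwrap the phase, and differentiate with respect to frequency. The pure-Python sketch below is illustrative only — real analyzers do this internally with better conditioning, and the tap-to-channel mapping depends on the receiver design:

```python
import cmath
import math

def group_delay_variation(taps: list[complex], symbol_rate_hz: float,
                          nfft: int = 64) -> float:
    """Estimate peak-to-peak group delay variation (seconds) from T-spaced
    equalizer taps, assuming the taps approximate the channel response.
    Illustrative sketch only, not an instrument-grade algorithm.
    """
    def dft(k: int) -> complex:
        # Frequency response sample at bin k (DFT is periodic in k).
        return sum(t * cmath.exp(-2j * math.pi * k * n / nfft)
                   for n, t in enumerate(taps))

    # Use only the central half of the band, where the estimate is reliable.
    ks = range(-nfft // 4, nfft // 4 + 1)
    phases = [cmath.phase(dft(k)) for k in ks]

    # Unwrap the phase so the derivative is meaningful.
    unwrapped = [phases[0]]
    for p in phases[1:]:
        d = p - unwrapped[-1]
        d -= 2 * math.pi * round(d / (2 * math.pi))
        unwrapped.append(unwrapped[-1] + d)

    # Group delay = -dphi/domega, evaluated by finite differences.
    df = symbol_rate_hz / nfft
    gd = [-(unwrapped[i + 1] - unwrapped[i]) / (2 * math.pi * df)
          for i in range(len(unwrapped) - 1)]
    return max(gd) - min(gd)
```

A distortion-free channel (a single main tap, or a pure delay) yields zero group delay variation, as expected.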
10. Chroma/Luma Delay: ≤170 ns (AM-VSB analog)
This is a parameter for analog TV channels that is required to be measured in accordance with §76.605(a)(11)(i) as part of the so-called analog video color tests.
11. Phase Noise: <-88 dBc/Hz @ 10 kHz offset (relative to the center of QAM signal spectrum)
As mentioned in Part 1 last month, the FCC in §76.640(b)(1)(i) relaxed this requirement from what is specified in SCTE-40 to -86 dBc/Hz. Phase noise measurement is a service-disruptive measurement that requires removing a QAM signal’s modulation, leaving just a CW carrier on the channel’s center frequency. It also requires a lab-grade spectrum analyzer that has phase noise performance much better than that of the QAM modulator being measured. The typical cable-TV-quality spectrum analyzer won’t work here. A detailed procedure for phase noise measurement can be found in NCTA Recommended Practices for Measurements on Cable Television Systems, 3rd Ed. (“Section 3.7 Phase Noise”), now available from SCTE.
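Once you have the trace, converting a sideband marker reading to dBc/Hz is straightforward arithmetic: normalize the marker to the carrier, then to a 1-Hz bandwidth. A hedged Python sketch (the 2.5 dB log-detector correction applies to classic analog analyzers; modern noise markers apply it automatically, in which case set it to zero):

```python
import math

def phase_noise_dbc_hz(carrier_dbm: float, marker_dbm: float,
                       rbw_hz: float, detector_corr_db: float = 2.5) -> float:
    """Convert a spectrum-analyzer sideband marker (here, at the 10-kHz
    offset) to phase noise in dBc/Hz.

    marker_dbm is the sideband level measured in the analyzer's RBW;
    detector_corr_db is the log-detector noise underread correction
    (about 2.5 dB on classic analog analyzers -- an assumption; check
    your instrument's documentation).
    """
    return (marker_dbm - carrier_dbm) - 10 * math.log10(rbw_hz) \
        + detector_corr_db

# Example: carrier at 0 dBm, sideband marker -75.5 dBm in a 1-kHz RBW:
# (-75.5 - 0) - 30 + 2.5 = -103 dBc/Hz, comfortably better than -88 dBc/Hz.
print(phase_noise_dbc_hz(0.0, -75.5, 1e3))  # → -103.0
```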
12. Maximum amplitude variation across the 6-MHz channel (digital channels): ≤5 dB p-p;
Maximum amplitude variation across the 6-MHz channel (analog channels): ?4 dB p-p
The first part of this requirement applies to downstream QAM signals. The second part applies to analog-TV channels and is largely the same as what is specified in §76.605(a)(6) of the FCC rules. The small box (?) in front of “4 dB p-p” is as it appears in SCTE-40 and is probably a typo or a glitch. While I could be wrong, I suspect the box is supposed to be the graphic symbol for less than or equal to (≤).
One can determine the approximate in-channel flatness of a QAM signal by observing the flatness of the top of the “haystack” on a spectrum analyzer. A more accurate method is to use test equipment that supports an automated in-channel flatness measurement, in which the response typically is derived from the test equipment’s QAM receiver adaptive equalizer coefficients.
13. Micro-reflections bound for dominant echo: -10 dB at ≤0.5 µs; -15 dB at ≤1.0 µs; -20 dB at ≤1.5 µs; -30 dB at ≤4.5 µs. Micro-reflections longer than 4.5 microseconds rarely occur in conventional cable television systems. Moreover, very low-level micro-reflections (e.g., -40 dB) longer than 4.5 microseconds may not be measured reliably. Therefore, micro-reflections longer than 4.5 microseconds shall be considered under item 4 (of this table) as a contributor to C/(N+I).
A QAM analyzer’s adaptive equalizer graph (“equalizer stress”) can be used to characterize micro-reflections. An equalizer graph’s vertical axis usually shows relative amplitude in decibels, and the horizontal axis shows time, typically µs. The adaptive equalizer’s DFE taps (the vertical bars to the right of the tallest, or main, tap) should be below the thresholds listed in SCTE-40.
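Checking a measured echo against those SCTE-40 bounds amounts to a simple table lookup. A Python sketch (the function names are mine, for illustration):

```python
def echo_limit_db(delay_us: float) -> float:
    """Return the SCTE-40 micro-reflection bound (in dBc) for a given
    echo delay in microseconds. Echoes longer than 4.5 microseconds fall
    under the C/(N+I) requirement instead.
    """
    for max_delay_us, limit_db in [(0.5, -10.0), (1.0, -15.0),
                                   (1.5, -20.0), (4.5, -30.0)]:
        if delay_us <= max_delay_us:
            return limit_db
    raise ValueError("delay > 4.5 us: treat as a contributor to C/(N+I)")

def echo_compliant(delay_us: float, level_dbc: float) -> bool:
    """True if a measured echo (level in dBc, delay in us) meets SCTE-40."""
    return level_dbc <= echo_limit_db(delay_us)

# A -25 dBc echo at 1.2 us meets the -20 dBc bound; a -8 dBc echo at
# 0.4 us violates the -10 dBc bound.
print(echo_compliant(1.2, -25.0), echo_compliant(0.4, -8.0))  # → True False
```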
14. Burst noise: Not longer than 25 µs at 10-Hz repetition rate
Section 3.6 of NCTA Recommended Practices for Measurements on Cable Television Systems, 3rd Ed., describes three methods for measuring upstream impulse noise. Methods B or C should be adaptable to downstream measurement of burst (impulse) noise. Note that SCTE-40 does not include an amplitude reference for burst noise, per Table B’s Note 5: “Burst noise is statistical in nature and a reference level should be defined. Studies on this are continuing.”
15. Carrier level at the terminal input: 64-QAM: -15 dBmV to +15 dBmV; 256-QAM: -12 dBmV to +15 dBmV; Analog Visual Carrier (c): 0 dBmV to +15 dBmV; Analog Aural Carrier: -10 dBc to -17 dBc
The first two parameters of this requirement apply to downstream QAM signals, and the latter two apply to analog-TV channels. A properly calibrated SLM, QAM analyzer with SLM functionality, or spectrum analyzer can be used to measure analog-TV channel signal levels.
Most modern SLMs, QAM analyzers and some spectrum analyzers support QAM signal digital channel power measurement, which removes the potential for error when performing a manual measurement that otherwise requires correction factors.
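For the manual approach, the correction arithmetic looks something like the following Python sketch. It is simplified: it ignores the small RBW-to-noise-bandwidth difference, and the 2.5 dB figure assumes a classic analog log-detector analyzer (automated measurements apply all of this internally, which is exactly why they remove the potential for error):

```python
import math

def qam_channel_power_dbmv(marker_dbmv: float, rbw_hz: float,
                           channel_bw_hz: float = 6e6,
                           detector_corr_db: float = 2.5) -> float:
    """Manual digital channel power from a spectrum-analyzer marker placed
    on a flat QAM "haystack."

    Scales the per-RBW reading up to the full channel bandwidth and adds
    the log-detector noise correction (an assumed 2.5 dB for analog
    analyzers; check your instrument's documentation).
    """
    bandwidth_corr_db = 10 * math.log10(channel_bw_hz / rbw_hz)
    return marker_dbmv + bandwidth_corr_db + detector_corr_db

# Example: a -33 dBmV marker in a 300-kHz RBW on a 6-MHz QAM signal:
# -33 + 10*log10(6e6/3e5) + 2.5 = -33 + 13.0 + 2.5, or about -17.5 dBmV.
print(round(qam_channel_power_dbmv(-33.0, 3e5), 1))  # → -17.5
```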
Some Parting Thoughts
§76.640 has been on the books for several years. In 750 MHz-and-greater plants, downstream digital signals must comply with SCTE-40. §76.640 doesn’t specify how often or when to measure QAM signal performance, how many channels to test or how to make specific measurements.
My suggestion: Carefully read all of §76.640 and SCTE-40 and adopt the necessary procedures to ensure that you comply with what’s stated. Note that §76.640 refers to the 2003 version of SCTE-40. The latter has since been updated, although the FCC rules do not reference later versions.
When it comes to using test equipment to measure QAM signal performance, the usual caveats apply. Follow the test equipment manufacturers’ recommendations for appropriate factory calibration requirements, warm-up time and field calibration immediately prior to performing all tests. Follow the test equipment manufacturers’ instructions for specific measurements. Actual setup and measurement procedures will vary among different makes/models of test equipment.
Refer to NCTA Recommended Practices for Measurements on Cable Television Systems, 3rd Ed. for detailed how-to descriptions of several digital measurement procedures.
And be sure to document everything.
Ron Hranac is technical leader/broadband network engineering for Cisco Systems and CT’s senior technology editor. Contact him at email@example.com.