During SCTE’s 2011 Cable-Tec Expo technical workshops, Broadcom’s Bruce Currivan and I teamed for an encore to our 2007 presentation on modulation error ratio (MER). Our 2007 Expo paper, Digital Transmission: Carrier-to-Noise, Signal-to-Noise, and Modulation Error Ratio, did a deep dive into the “what and why” behind MER. That earlier paper also appeared as a two-part article in the June 2007 and July 2007 issues of Communications Technology.
This time around, we took a look at the practical side of MER measurements. We got together with Cisco’s Chris Leghorn to do a variety of lab and field measurements in mid-2011, and then we presented the results during last November’s Cable-Tec Expo. If you couldn’t make it to Atlanta, the following covers some of the highlights of our lab and field tests along with our recommendations. The full technical workshop paper and slides are included on the 2011 SCTE Expo Proceedings CD-ROM.
As the cable industry continues to expand digital signal carriage — in many instances migrating to mostly- or all-digital operation — characterization of those digital signals becomes ever more important. Various metrics are used to determine the health of digital signals and the networks that carry them, such as digital channel power, carrier-to-noise ratio (CNR), carrier-to-noise-plus-interference ratio (CNIR), carrier-to-composite noise ratio (CCNR), in-channel flatness and group delay, adaptive equalizer performance, bit error ratio (BER), correctable versus uncorrectable codeword errors and MER.
The subject of MER measurement has caused a lot of confusion, in part because of a lack of understanding about just what MER is and why different instruments may report different values on the same signal under identical conditions (spoiler alert: we found that to be exactly the case). Most cable operators now mandate minimum end-of-line MER performance on downstream quadrature amplitude modulation (QAM) signals, and upstream MER or “SNR” at the cable modem termination system (CMTS) input, often without specifying whether the measurement is equalized or unequalized, the make/model of test equipment used, or even such basic measurement conditions as the test equipment’s per-channel RF input signal level. We found it critical to specify all of these; more on this later.
MER is digital complex baseband signal-to-noise ratio (SNR) and is the ratio, in decibels, of average symbol power to average error power. MER is, in effect, a measure of the “fuzziness” or spreading of a constellation’s clouds of plotted symbol points. The wider the spread and/or the more outlying points in the clouds, the lower the MER. Higher MER means a cleaner signal.
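That ratio-of-powers definition reduces to just a few lines of code. The following Python sketch (the function name and the QPSK example are ours, for illustration; real analyzers compute this from demodulated symbols) calculates MER from received symbols and their corresponding ideal constellation points:

```python
import math

def mer_db(received, ideal):
    """MER in dB: average ideal-symbol power divided by average
    error-vector power, expressed in decibels."""
    n = len(ideal)
    sym_power = sum(abs(s) ** 2 for s in ideal) / n
    err_power = sum(abs(r - s) ** 2 for r, s in zip(received, ideal)) / n
    return 10 * math.log10(sym_power / err_power)

# Illustrative QPSK example: each received point is offset from its
# ideal point by a fixed error vector of 0.1 + 0j.
ideal = [1 + 1j, -1 + 1j, -1 - 1j, 1 - 1j]
received = [s + 0.1 for s in ideal]
print(round(mer_db(received, ideal), 2))  # 10*log10(2/0.01) ≈ 23.01 dB
```

The larger the error vectors — the “fuzziness” of the constellation clouds — the larger the error power in the denominator and the lower the reported MER.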
MER is affected by pretty much everything in a QAM signal’s transmission path: transmitter and receiver phase noise; CNR; nonlinear distortions (composite triple beat, composite second order, cross modulation, common path distortion); linear distortions (micro-reflections, amplitude ripple, group delay); in-channel ingress; laser clipping; data collisions; and even suboptimal modulation profiles. Some of these can be controlled fairly well but, no matter what one does, a QAM signal is going to be impaired as it makes its way through a cable network. The worse the impairments, the lower the reported value of MER.
Transmission impairments aside, how MER measurement functionality is implemented in a QAM receiver (test equipment, set-top, modem, etc.), and the conditions under which MER is measured also have a major impact on the reported value.
To sort all of this out, we started with a series of lab tests in which several different instruments were used to measure MER on a given signal under a variety of operating conditions. The intent was to determine the extent of differences in reported MER values that might exist among the instruments when measuring the same signal — Channel 71 (507 MHz center frequency), 256-QAM in the case of our tests — for each measurement condition.
The lab reference equipment was a Rohde & Schwarz EFA. Several makes/models of cable-TV QAM analyzers also were used; the latter included a first-generation QAM analyzer and a number of newer instruments in widespread use by the cable industry. In most of the lab test configurations, MER was measured at three RF input levels: -10 dBmV, 0 dBmV and +10 dBmV.
We intentionally did not identify the field instruments in our paper or presentation so that the testing wouldn’t come across as a product comparison in an attempt to identify a so-called “best performing” QAM analyzer. Defining what is meant by “best” is something for test-equipment marketing personnel and cable operators to determine. In general, however, it was found that later analyzer models outperform earlier models, regardless of manufacturer.
In the lab, we set up a 154-channel all-digital headend, optical fiber link and three-amplifier cascade with taps, then proceeded to measure MER on the test channel at various points in the lashup. Here are the various lab configurations:
• Test configuration 1: QAM modulator direct output, single channel
• Test configuration 2: Headend combiner output, 154 active QAM channels
• Test configuration 3: Headend combiner output, 152 active QAM channels (lower and upper adjacent channels turned off)
• Test configuration 4: Optical receiver output, 154 active QAM channels
• Test configuration 5: Optical receiver output, 152 active QAM channels (lower and upper adjacent channels turned off)
• Test configuration 6: Optical receiver + 3 amplifiers, 154 active QAM channels
• Test configuration 7: Optical receiver + 3 amplifiers, 152 active QAM channels (lower and upper adjacent channels turned off), injected broadband noise for nominal 35 dB CNR (measured ACP with R&S EFA: -35.2 dBc)
• Test configuration 8: Optical receiver + 3 amplifiers, micro-reflection, measured on Channel 2 (57 MHz center frequency), approximately 35.7 dB CNR
The Real World
Lab tests are one thing, but what about a real-world cable network with connected subscribers and the usual gremlins that exist in day-to-day operation?
The field tests were performed in a cable system in a major metropolitan area. The architecture comprises a two-way, 750 MHz HFC network carrying a mix of analog TV channels and QAM signals. A 256-QAM signal on Channel 71 was used for all field measurements. To help ensure consistency, a variable attenuator was connected between each test point and the test equipment to achieve a nominal 0 dBmV input signal level on the channel being measured. Here’s a summary of the field measurement locations:
• Test point 1: Headend/hub site, downstream laser input
• Test point 2: Node downstream output
• Test point 3: Node + 6 amplifiers, 11 dB 4-port end-of-line tap
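The level normalization used at each of these test points is simple arithmetic; this Python sketch (function names are ours, for illustration) shows the attenuator setting needed to hit the 0 dBmV target, along with the dBmV-to-millivolt conversion for reference (0 dBmV is defined as 1 mV RMS across 75 ohms):

```python
def pad_for_target(measured_dbmv, target_dbmv=0.0):
    # Attenuation (dB) that brings the measured per-channel level
    # down to the desired analyzer input level.
    return measured_dbmv - target_dbmv

def dbmv_to_mv(level_dbmv):
    # 0 dBmV is 1 millivolt RMS across 75 ohms; dBmV is 20*log10(mV/1 mV).
    return 10 ** (level_dbmv / 20)

print(pad_for_target(15.0))  # a +15 dBmV tap level needs 15 dB of padding
print(dbmv_to_mv(20.0))      # +20 dBmV corresponds to 10 mV
```

Normalizing the analyzer input this way removes per-channel input level as a variable, which, as the lab tests showed, can itself shift the reported MER.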
Next month, I’ll highlight the interesting and sometimes surprising lab and field test results; share some guidelines to achieve consistent, meaningful MER measurements; and offer a few tips on how to improve MER in an operating cable network.
Ron Hranac is technical leader, broadband network engineering at Cisco Systems and senior technology editor for Communications Technology. Contact him at firstname.lastname@example.org.