Broadband: Making MER Better: Part 1
Modulation error ratio (MER) has been discussed at length in this column over the last few years.
It’s an important, though frequently misunderstood, performance metric for the digitally modulated signals carried in our networks. MER is often reported by cable modems, embedded multimedia terminal adapters (eMTAs), set-top boxes (STBs), and even the cable modem termination system (CMTS) as "SNR," or signal-to-noise ratio, but these are in fact all MER.
Ensuring a high MER
Good engineering practice suggests that operational MER be 3 to 6 decibels (dB) or more above the MER failure threshold for the modulation type in use. Good operational minimums are ~18 dB for quadrature phase shift keying (QPSK); ~24 dB for 16-QAM (quadrature amplitude modulation); ~27 dB for 64-QAM; and ~31 dB for 256-QAM. These values are all a few dB above the respective failure thresholds, but the higher the better.
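The minimums above are easy to encode as a quick screening check. The sketch below, a hypothetical helper, uses the approximate operational minimums quoted in the previous paragraph (which already include a few dB of margin over the failure thresholds), not exact failure points from any specification:

```python
# Suggested operational MER minimums from the text above (~ values).
OPERATIONAL_MIN_MER_DB = {
    "QPSK": 18.0,
    "16-QAM": 24.0,
    "64-QAM": 27.0,
    "256-QAM": 31.0,
}

def mer_ok(modulation: str, measured_mer_db: float) -> bool:
    """True if the measured MER meets the suggested operational minimum."""
    return measured_mer_db >= OPERATIONAL_MIN_MER_DB[modulation]
```

A 256-QAM channel reporting 34 dB passes; one reporting 29.5 dB is running too close to the edge and warrants attention even though it may still be error-free.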
"An interesting gotcha…CTB and CSO don’t go away, they just take on a different appearance."
What can be done to ensure that MER is as high as practical for most situations?
To answer that question we first need to understand the major factors that can affect MER. Some of the more common ones include transmitted phase noise; carrier-to-noise ratio (CNR); nonlinear distortions such as composite triple beat (CTB), composite second order (CSO), cross modulation (XMOD), and common path distortion (CPD); linear distortions such as micro-reflections, amplitude ripple and tilt (basically poor frequency response), and group delay; in-channel ingress; laser clipping; data collisions; and even modulation profile settings.
Assume that transmitted phase noise isn’t a problem, modulation profiles are under control, and nothing is out of the ordinary with respect to data collisions. Let’s look at what we can do about the other factors to maximize MER, both in the forward and return path.
Carrier-to-noise ratio
First and foremost is to ensure that the headend, fiber links, and coax plant are properly aligned. As you’ll see shortly, proper alignment of all of these parts of the network impacts more than just CNR.
Back in the headend, carefully check the RF levels on each channel, verifying that analog TV channel visual and aural carriers are spot on, and all QAM channels have been set to the correct digital channel power (average power) relative to analog TV channel levels. Most cable operators carry 64-QAM channels 10 dB below what an analog TV channel’s visual carrier would be on the same frequency, and 5 or 6 dB down for 256-QAM.
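Those offsets can be applied mechanically. This hypothetical helper derives a QAM channel's target digital channel power from the analog visual carrier level on the same plant, using the common offsets cited above (note that -5 dB is also used for 256-QAM in some systems):

```python
# Common digital channel power offsets relative to an analog visual
# carrier on the same frequency; -5 dB is also seen for 256-QAM.
QAM_OFFSET_DB = {"64-QAM": -10.0, "256-QAM": -6.0}

def target_qam_level_dbmv(analog_visual_dbmv: float, modulation: str) -> float:
    """Target digital channel power (average power) in dBmV."""
    return analog_visual_dbmv + QAM_OFFSET_DB[modulation]
```

For example, with analog visual carriers at +15 dBmV, 64-QAM channels would be set to +5 dBmV.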
Don’t rely on what a spectrum analyzer display shows. The apparent height of the QAM channel haystacks will change as the analyzer’s resolution bandwidth (RBW) is varied. Instead, actually measure the digital channel power of each QAM channel, and adjust levels as needed.
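The RBW dependence is why the haystack height on screen can't be read directly. To a first approximation, a flat, noise-like QAM haystack's total power is the displayed marker level plus the bandwidth ratio in dB; the sketch below shows that integration, ignoring the detector and log-averaging correction factors a real analyzer's channel power function applies:

```python
import math

def approx_channel_power_dbmv(marker_dbmv: float,
                              channel_bw_hz: float,
                              rbw_hz: float) -> float:
    """Rough channel power of a flat, noise-like signal: displayed level
    plus 10*log10(channel BW / RBW). Real analyzers also apply detector
    and log-averaging corrections, which are omitted here."""
    return marker_dbmv + 10.0 * math.log10(channel_bw_hz / rbw_hz)
```

A 6 MHz QAM haystack displaying -30 dBmV in a 300 kHz RBW actually carries about -17 dBmV of digital channel power, which is why narrowing the RBW makes the haystack appear to sink even though the channel power hasn't changed.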
Next, verify that per-channel levels are correct at the inputs to downstream lasers (check with the optical transmitter manufacturer if you’re uncertain about where to set levels), and that nodes have been set up properly. From there the coax plant’s amplifiers should be tweaked as necessary to ensure that RF input and output levels are where they should be.
The final point is the subscriber drop, where levels at the input to STBs, modems, and eMTAs should be optimized for best performance. Most cable operators like to target customer premises equipment (CPE) input levels in the -10 to +5 dBmV range on digital channels, and 0 dBmV to +10 dBmV or so on analog TV channels.
Don’t go too high, or you run the risk of overdriving the CPE! Too low, and the CNR suffers. Also beware of excessive reverse tilt — the scenario that exists when low frequency channels are hitting the CPE input at levels a lot higher than high frequency channels. Excessive reverse tilt can cause all sorts of problems with CPE performance, including degraded bit error rate (BER) and MER. Subscriber drop equalizers can fix this if it’s a problem.
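A drop-level audit like the one described above can be scripted. The sketch below checks digital channel levels against the -10 to +5 dBmV window and flags reverse tilt; the 3 dB tilt threshold is an assumed example value for illustration, not a standard:

```python
def check_cpe_inputs(levels_by_freq_mhz: dict) -> dict:
    """levels_by_freq_mhz maps digital channel frequency (MHz) -> level (dBmV).
    Checks the -10 to +5 dBmV window and estimates reverse tilt from the
    lowest- and highest-frequency channels. 3 dB is an assumed flag point."""
    freqs = sorted(levels_by_freq_mhz)
    low_band = levels_by_freq_mhz[freqs[0]]
    high_band = levels_by_freq_mhz[freqs[-1]]
    tilt = low_band - high_band          # positive = low band hotter
    return {
        "in_window": all(-10.0 <= v <= 5.0 for v in levels_by_freq_mhz.values()),
        "reverse_tilt_db": tilt,
        "excessive_reverse_tilt": tilt > 3.0,
    }
```

A drop measuring +2 dBmV at 105 MHz but -6 dBmV at 747 MHz sits inside the window on every channel yet carries 8 dB of reverse tilt, exactly the case where a drop equalizer helps.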
Correct alignment of return path amplifiers and especially the fiber links will help keep upstream CNR in check, as will paying attention to upstream combining/splitting in the headend. For more on this, see my July 2009 column.
Nonlinear distortions
The severity of nonlinear distortions such as CTB and CSO is usually related to active device output levels. When active device outputs are too high, distortions get worse. A cable network’s operating levels are typically chosen in the design phase to keep end-of-line performance within an acceptable window that is a tradeoff between low CNR and excessive distortions. As with CNR, proper plant alignment is critical to ensure acceptable carrier-to-distortion ratios.
An interesting gotcha when a network carries a lot of digital signals: CTB and CSO don’t go away, they just take on a different appearance.
Instead of the familiar CTB clusters that fall under the visual carriers, or CSO beat clusters that can appear ±0.75 MHz and ±1.25 MHz from the visual carrier frequencies, the "digital" distortions, or composite intermodulation distortions, look like wideband noise.
One might think that cranking up RF levels would help CNR — and it does to a point — but as levels get higher a strange thing happens. The noise floor starts to increase! This elevated noise floor is composite intermodulation noise, a combination of thermal noise and composite intermodulation distortions from the digital channels. All the more reason to pay attention to correct levels!
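The existence of an optimum operating level falls out of a simple power-sum model. In the sketch below, thermal CNR improves 1 dB for every dB of drive, while the carrier-to-intermod-noise ratio degrades about 2 dB per dB (third-order-like behavior); the starting ratios and slope are illustrative assumptions, not measured values:

```python
import math

def _db_to_lin(db: float) -> float:
    return 10.0 ** (db / 10.0)

def composite_cnr_db(level_db: float,
                     cnr0: float = 43.0,
                     cin0: float = 60.0,
                     cin_slope: float = 2.0) -> float:
    """Carrier-to-(thermal noise + composite intermod noise) versus RF
    drive, relative to nominal. Assumed model: thermal CNR rises 1 dB/dB;
    carrier-to-intermod-noise falls cin_slope dB/dB (third-order-like)."""
    cnr = cnr0 + level_db              # thermal noise is fixed, carrier rises
    cin = cin0 - cin_slope * level_db  # intermod noise rises faster than carrier
    total_noise = _db_to_lin(-cnr) + _db_to_lin(-cin)  # add noise powers
    return -10.0 * math.log10(total_noise)
```

Sweeping `level_db` shows the composite ratio improving for the first few dB of extra drive, then rolling over and degrading as composite intermodulation noise takes over, which is the "strange thing" described above.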
And then there is CPD, but we’ll save that for next month.
Linear distortions
Micro-reflections, which come from impedance mismatches, and the resulting amplitude and group delay ripple, mess up digital signals by causing inter-symbol interference and degraded MER. Amplitude tilt is often related to active device misalignment, and also can cause inter-symbol interference and degraded MER. Operation near band edges, particularly near diplex filter crossover and rolloff frequencies, usually means group delay and amplitude response rolloff.
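The relationship between an echo and the ripple it produces is worth keeping at hand when troubleshooting: a single micro-reflection delayed by time t superimposes on the direct signal and creates amplitude ripple with peaks spaced 1/t apart in frequency. The sketch below also estimates distance to the mismatch, assuming a typical hardline coax velocity of propagation of 87%:

```python
C_M_PER_S = 299_792_458.0  # speed of light in vacuum

def ripple_spacing_hz(echo_delay_s: float) -> float:
    """Frequency spacing of amplitude ripple peaks caused by one echo."""
    return 1.0 / echo_delay_s

def distance_to_mismatch_m(echo_delay_s: float, vop: float = 0.87) -> float:
    """Distance to the impedance mismatch; the echo travels there and
    back, hence the divide-by-two. 87% velocity of propagation is an
    assumed typical value for hardline coax."""
    return echo_delay_s * vop * C_M_PER_S / 2.0
```

For instance, a 1 microsecond echo produces ripple peaks every 1 MHz on a sweep display and points to a mismatch roughly 130 meters away.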
What to do?
First, wherever possible, try to keep digital signals away from parts of the spectrum — say, the upper end of the return path — where group delay and response rolloff are likely to occur. When you must carry digital signals there, it probably will be necessary to turn on upstream adaptive pre-equalization. Cable modems, eMTAs, and STBs already have full-time operational adaptive equalizers in their downstream QAM receiver circuits, which help to compensate for forward path in-channel response problems.
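The adaptive equalizers mentioned above converge on tap coefficients that undo the channel's echo. As a toy illustration only — real DOCSIS pre-equalizers use complex-valued feed-forward taps, not this real-valued least-mean-squares (LMS) sketch — here is how taps adapt to cancel a single echo:

```python
# Minimal real-valued LMS adaptive equalizer sketch (stdlib only).
# Illustrative, not a DOCSIS implementation: actual modem equalizers
# use complex taps and different adaptation details.
def lms_equalize(received, desired, num_taps=8, mu=0.01):
    """Train an FIR equalizer so its output tracks the desired symbols."""
    taps = [0.0] * num_taps
    taps[0] = 1.0                       # start as a pass-through filter
    buf = [0.0] * num_taps              # delay line of recent samples
    for x, d in zip(received, desired):
        buf = [x] + buf[:-1]            # shift newest sample in
        y = sum(t * b for t, b in zip(taps, buf))
        err = d - y                     # error against the clean symbol
        taps = [t + mu * err * b for t, b in zip(taps, buf)]
    return taps
```

Feeding it symbols distorted by a 30% echo one symbol late, the taps converge toward the inverse of that echo and the residual error shrinks, which is the same job the modem's equalizer does continuously against in-channel response problems.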
Second, sweep alignment of the forward and return is your friend when it comes to identifying and fixing most response-related impairments, and keeping the response as flat as possible. In order to see closely spaced amplitude ripple ("standing waves"), set the sweep gear to the finest resolution supported by the equipment.
Finally, use specialized test equipment such as QAM analyzers to help troubleshoot linear distortions and related problems. For more on this, see "Linear Distortions, Part 1" (www.cable360.net/ct/operations/testing/15131.html), "Linear Distortions, Part 2" (www.cable360.net/ct/operations/testing/15170.html), "Troubleshooting Digitally Modulated Signals, Part 1" (www.cable360.net/ct/operations/testing/15092.html), and "Troubleshooting Digitally Modulated Signals, Part 2" (www.cable360.net/ct/operations/testing/18539.html).
Ron Hranac is a technical leader, broadband network engineering, for Cisco Systems and senior technology editor for Communications Technology.