In the business world, performance metrics are the de facto thermometer used to gauge a company’s fitness. As such, cable technicians keep a sharp eye on such things as spectrum health, signal health and data health. If one of these categories shows signs of the flu, the others will suffer.
A recent Communications Technology Webcast sponsored by JDSU addressed advanced upstream troubleshooting, spawning a spate of queries from listeners during the course of that hour. The three panelists – Ron Hranac (technical leader/CMTS Business Unit at Cisco and senior tech editor here at CT), David Haigh (lead engineer, Midcontinent Communications) and Jim Walsh (product manager, JDSU) – tackled some of the most-asked-about issues surrounding how to keep the upstream in top condition.
What causes code-word errors (CWE)? How can field network technicians troubleshoot them?
Pretty much anything that impairs upstream signal transmission can cause CWE if those impairments are severe enough: low carrier-to-noise ratio, ingress, impulse noise, laser clipping, nonlinear distortions, linear distortions, etc. If you don’t see visible ingress, the problem may well be related to impulse noise (a form of ingress), laser clipping or linear distortions.
Impulse noise frequently shows up quite well in the constellation using a QAM analyzer. A packet impacted by a single fast impulse noise burst generally will have just a few symbols significantly moved from their desired locations. If there are no other impairments present, it is even easier to see: You will have a very tight constellation with just a few outlier symbols. You can see example constellation screen shots in the app note published at www.jdsu.com/product-literature/QAMTrak_Analyzer_Application_Note.pdf.
To check for impulse noise, connect a spectrum analyzer to an upstream test point in the headend or hub. Set the spectrum analyzer span control to display 1 MHz-50 MHz, and turn on peak hold. Let the analyzer sit in peak hold mode for several minutes and see if the noise floor builds up significantly higher than when the analyzer is not in peak hold (do this when the CMTS is reporting codeword errors).
You can also try placing the analyzer in zero-span mode, which displays amplitude versus time; impulse noise often can be seen more easily this way.
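To make the outlier idea concrete, here is a minimal Python sketch of how an analyzer might flag impulse-hit symbols: count received constellation points that land far from every ideal grid point. The 16-QAM grid, noise level, and impulse magnitude are all assumed values for illustration, not any particular instrument's algorithm.

```python
import numpy as np

def count_constellation_outliers(symbols, ideal_points, threshold):
    """Count received symbols farther than `threshold` from every ideal point.

    A tight constellation with just a few such outliers is the impulse
    noise signature described above.
    """
    # Distance from each received symbol to its nearest ideal constellation point
    dists = np.abs(symbols[:, None] - ideal_points[None, :]).min(axis=1)
    return int(np.sum(dists > threshold))

# Ideal 16-QAM grid (unnormalized, I and Q levels at +/-1 and +/-3)
levels = np.array([-3.0, -1.0, 1.0, 3.0])
ideal = np.array([complex(i, q) for i in levels for q in levels])

rng = np.random.default_rng(0)
noise = 0.05 * (rng.standard_normal(1000) + 1j * rng.standard_normal(1000))
rx = rng.choice(ideal, size=1000) + noise   # otherwise-clean packet
rx[:5] += 5 + 5j                            # a fast impulse burst hits 5 symbols

print(count_constellation_outliers(rx, ideal, threshold=0.5))  # → 5
```

With no other impairments present, the clean symbols stay well inside the threshold and only the impulse-hit symbols are counted, matching the "tight constellation with a few outliers" picture above.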
Can an individual modem cause code-word errors?
If that modem were transmitting at or above its rated maximum output level, the transmitted signal might become distorted, which could cause uncorrectable errors. A rogue modem at maximum output also might clip the upstream laser, affecting all traffic on that return path.
What type of impairments will affect the entire return segment and what type of impairment is only localized to a specific household?
Impairments that "funnel" back to the node will affect the entire return segment. Examples include thermal noise, such nonlinear distortions as common point distortion (CPD), impulse noise and ingress, and laser clipping. Individual drop problems that affect the upstream signal level from a given modem (example: a backwards splitter or other drop defects) generally will affect only that modem. Likewise, linear distortions (micro-reflections, amplitude ripple, group delay) through which certain modems' signals pass will not affect the entire return segment unless the impairment is very close to the node.
Will moving to an all-digital format make CPD go away?
No, CPD does not "go away" in an all-digital or mostly digital network. The second- and third-order distortions become noise-like rather than the discrete beat clusters one sees in a network with a large number of analog TV channels. When CPD exists in a network with a lot of digital signals, the effect is an elevated noise floor. Indeed, going all digital may make CPD a lot more difficult to identify: Analog CPD is very distinctive, while digital CPD just looks like noise and could be confused with other impairments.
Under what circumstances would a technician find severe upstream data collisions, and how would you discover/troubleshoot that?
Data collisions typically occur during contention slots for modem transmissions (this is normal and expected behavior). Excessive retransmissions may be an indication of data collisions.
Regarding transmissions, is it better to have the transmit level as high as possible and close to the minimum threshold to have a better upstream CNR?
It is a good idea for modem transmit levels to be fairly high for a better carrier-to-junk ratio, but a given modem’s transmit level should have around 6 dB of headroom with respect to that modem’s maximum transmit power capability. The latter helps ensure sufficient output power dynamic range to accommodate temperature-related signal level changes in the coax plant.
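As a quick sanity check, the headroom guideline above can be expressed in a few lines of Python. The transmit levels shown are hypothetical examples, not values from the Webcast:

```python
def transmit_headroom_db(reported_tx_dbmv, max_tx_dbmv):
    """Headroom between a modem's current transmit level and its maximum capability."""
    return max_tx_dbmv - reported_tx_dbmv

# Hypothetical modem: transmitting at 52 dBmV with a 58 dBmV maximum capability
headroom = transmit_headroom_db(52.0, 58.0)
print(headroom)          # → 6.0
print(headroom >= 6.0)   # → True: meets the ~6 dB guideline discussed above
```

A modem with less headroom than this has little dynamic range left to ride out temperature-related level changes in the coax plant.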
What causes laser clipping, and how can it be resolved?
The most common cause of laser clipping is RF levels that are too high at the laser input.
How do I know if I have laser clipping?
To check for laser clipping, observe the spectrum from 42 MHz to about 200 MHz with your analyzer. Clipping produces distortions that are visible in this frequency range.
At exactly what frequencies does laser clipping occur?
When laser clipping occurs, it affects all frequencies in the signal path. This is known as cross-compression. A good place to look for the presence of clipping distortion (upstream) is in the 42 MHz-200 MHz band.
Which lasers are affected: those in the node or those in the hub?
Any return laser through which upstream signals are transmitted is susceptible to clipping (downstream lasers can clip, too).
From the CMTS perspective, is there any best practice that can be applied for upstream issues?
Consult with your CMTS manufacturer for recommended configurations and modulation profiles.
What is meant by unequalized vs. equalized modulation error ratio (MER)?
Equalized MER is computed in a QAM receiver (cable modem, set-top, CMTS upstream receiver, QAM analyzer, etc.) after the adaptive equalizer compensates for channel response impairments. Unequalized MER is computed before the adaptive equalizer. For more on this, see "Equalized or Unequalized? That is the Question".
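The calculation itself is the same in both cases; what differs is where in the receive chain it is applied. Here is a minimal Python sketch of the MER computation (the 16-QAM constellation and noise level are assumptions for illustration): run on post-equalizer symbols it yields equalized MER, and on raw pre-equalizer symbols it yields unequalized MER.

```python
import numpy as np

def mer_db(received, ideal_points):
    """MER in dB: average ideal-symbol power over average error-vector power."""
    # Hard-decide each received symbol to its nearest ideal constellation point
    decided = ideal_points[np.abs(received[:, None] - ideal_points[None, :]).argmin(axis=1)]
    err_power = np.mean(np.abs(received - decided) ** 2)
    sig_power = np.mean(np.abs(decided) ** 2)
    return 10 * np.log10(sig_power / err_power)

# Ideal 16-QAM grid (unnormalized, I and Q levels at +/-1 and +/-3)
levels = np.array([-3.0, -1.0, 1.0, 3.0])
ideal = np.array([complex(i, q) for i in levels for q in levels])

rng = np.random.default_rng(1)
tx = rng.choice(ideal, size=5000)
rx = tx + 0.05 * (rng.standard_normal(5000) + 1j * rng.standard_normal(5000))
print(round(float(mer_db(rx, ideal)), 1))   # ~33 dB for this noise level
```

Because the adaptive equalizer removes linear channel response impairments before the error vectors are measured, equalized MER on the same signal will typically read higher than unequalized MER whenever micro-reflections, amplitude ripple, or group delay are present.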
What effect does combining upstream returns on a DWDM link have on upstream signal-to-noise ratio (SNR) and CWEs?
This depends first on whether the return is analog or digital. Digital returns are usually unaffected by DWDM technology unless "normal" optical impairments are present (e.g., low light levels due to muxing). With analog returns, if the optics are not aligned correctly, the number of upstream signals combined can affect upstream SNR (MER) or CWEs.
Most CWE or MER problems that have been encountered with upstream optics are due to such other normal optical impairments as low or excessive light levels, low or excessive RF levels, overdriving an upstream erbium-doped fiber amplifier (EDFA), or laser clipping. One of the major contributors to low upstream SNR (MER) is misaligned optics on the return path.
How does TI’s SNR differ from Broadcom’s SNR reporting?
Texas Instruments burst receivers report unequalized MER, while Broadcom burst receivers report equalized MER.
Given that MER is extremely accurate, is there a limit to a field meter's (specifically a DSAM 6000) accuracy with regard to MER? Are the rumors true that the DSAM is inaccurate when reading MER at higher (or lower) error ratio levels?
The article "Is MER Overrated?" may answer your question from a general perspective as to why one often sees differences in reported MER among various makes/models of test equipment. All of JDSU's DSAM meters have an MER accuracy specification of 35 dB +/- 2 dB (typical), with an input level between -5 dBmV and +50 dBmV. Although the DSAM meters can display MER readings up to 40 dB (or slightly higher), MER accuracy above the specified range is not guaranteed. But that doesn't necessarily mean that such readings are "inaccurate."
How accurate is the Broadcom SNR measure compared to a spectrum analyzer?
A CMTS burst receiver chip’s reported "upstream SNR" actually is MER. It is NOT the same thing as CNR that you would measure with a spectrum analyzer. CNR does affect the reported MER, but many other factors do, too.
Which equipment can help technicians see linear distortions?
An upstream-capable QAM analyzer or similar digital signal analyzer is useful for determining the presence of linear distortions. The screenshots used during the CT/JDSU Webcast were from JDSU’s PathTrak WebView 2.5, scheduled for release later this summer.
Is measuring the micro-reflection on a CMTS one way of locating linear distortion?
When "upstream SNR" (MER) is degraded but CNR is good, that may be an indication of the presence of linear distortions. Specialized test equipment is the best way to characterize the severity of linear distortions.
What is the most common outside plant cause of group delay? Is it diplex filters in actives?
Group delay in the outside plant is most problematic at the return path band edges (5 MHz-10 MHz, caused by AC chokes/filters, and above about 35 MHz, caused by diplex filters). When a micro-reflection creates amplitude ripple ("standing waves"), one is likely to see group delay ripple, too. The testing Midcontinent has performed on live plant pointed to a diplexer each time. When it tested a cable modem following each successive amplifier's diplex filter, the group delay increased until the test reached the fourth and fifth amplifiers, where the cable modem struggled to operate.
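To illustrate the micro-reflection connection mentioned above, here is a short Python sketch showing that a single echo produces both amplitude ripple and group delay ripple across the return band, with ripple period 1/tau. The echo amplitude and delay are assumed values for illustration, not Midcontinent's measurements:

```python
import numpy as np

# Channel with a single micro-reflection (echo): H(f) = 1 + a*exp(-j*2*pi*f*tau)
a = 0.1          # echo amplitude (-20 dBc), assumed for illustration
tau = 1.0e-6     # echo delay: 1 microsecond
f = np.linspace(5e6, 42e6, 2000)   # upstream band, 5 MHz-42 MHz

H = 1 + a * np.exp(-1j * 2 * np.pi * f * tau)
phase = np.unwrap(np.angle(H))

# Group delay is -d(phase)/d(omega), estimated numerically with np.gradient
group_delay = -np.gradient(phase, 2 * np.pi * f)

# Both |H| and group delay ripple with period 1/tau = 1 MHz across the band
ripple_ns = (group_delay.max() - group_delay.min()) * 1e9
print(f"group delay ripple: {ripple_ns:.0f} ns peak-to-peak")
```

A stronger or longer-delayed echo makes both ripples worse, which is why amplitude ripple seen on a sweep is a useful clue that group delay ripple is present, too.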
What are some options for performing reverse sweep in a fully loaded upstream?
Your sweep equipment manufacturer can provide recommendations about sweep equipment configuration for use in a congested upstream spectrum. Many operators leave small gaps between upstream carriers to enable a sweep point to be inserted. You NEVER want to insert sweep pulses and/or sweep telemetry carriers anywhere within the occupied bandwidth of any upstream carrier! One vendor recommends that cable operators allow a 500 kHz guard band between carriers for sweep insertion points, if possible, in order to avoid interfering with upstream carriers.
In the worst case, the guard band can be a minimum of 100 kHz; however, when inserting sweep pulses within 100 kHz guard bands, it is recommended that the sweep pulse and telemetry insertion levels be reduced by approximately 6 dB to 10 dB to avoid interfering with upstream carriers.
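One way to codify the guard-band guidance above is sketched below. The function and its treatment of gaps between 100 kHz and 500 kHz are an illustrative interpretation of the vendor guidance quoted in the text, not an actual vendor algorithm:

```python
def sweep_insertion_plan(gap_khz, min_gap_khz=100.0, preferred_gap_khz=500.0,
                         backoff_db=(6.0, 10.0)):
    """Return (ok_to_insert, recommended_level_backoff) for a given carrier gap.

    Thresholds follow the guidance in the text: no insertion below 100 kHz,
    normal level at 500 kHz or more, and a 6-10 dB backoff for narrow gaps.
    """
    if gap_khz < min_gap_khz:
        return (False, None)       # too narrow: would interfere with carriers
    if gap_khz >= preferred_gap_khz:
        return (True, 0.0)         # comfortable guard band, normal insertion level
    return (True, backoff_db)      # narrow gap: reduce insertion level 6-10 dB

print(sweep_insertion_plan(600))   # → (True, 0.0)
print(sweep_insertion_plan(120))   # → (True, (6.0, 10.0))
print(sweep_insertion_plan(80))    # → (False, None)
```

The key invariant is the one stated above: sweep energy never lands inside the occupied bandwidth of an upstream carrier, and the tighter the gap, the lower the insertion level.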
Listen to the entire “Advanced Upstream Troubleshooting” Webcast.