March 1, 2009
Broadband: Report Card: Making the HDTV Plunge
By Ron Hranac
After putting it off for several years, I finally joined the slightly more than one-third of Americans who have already made the plunge, and got a high definition TV (HDTV) set for home use.
I had all sorts of excuses for putting this off, among them cost, a 15+ year-old conventional TV set that still works fine, a wife who said the 15+ year-old TV set works just fine (see a trend here?), picture quality with standard definition (SD) sources, rear projection vs. liquid crystal display (LCD) vs. plasma, and even picture quality with HD sources.
During the past few years, the cost of HD sets has come down dramatically. Technology has improved to the point where both SD and HD picture quality are quite good on many of the new sets, though there are still some HD sets that do an abysmal job displaying SD material. The latter is important because, quite frankly, a lot of video content is SD and will be for the foreseeable future.
Then and now
My first experience with HDTV goes back to a 1980s National Association of Broadcasters convention. The concept of high-def had been introduced as a way to provide widescreen TV viewing rivaling projected 35 mm motion pictures, accompanied by CD-quality audio.
One of the hopes was that a global HDTV standard could be developed, avoiding the NTSC/PAL/SECAM format battles that have long plagued broadcast TV - OK, maybe that was wishful thinking.
In those days, the available HD technology was analog, and a method of actually transmitting the 30+ MHz-bandwidth baseband HD video signal to viewers hadn't yet been sorted out. I remember seeing a projected 100-inch or thereabouts HD demonstration at that NAB convention and was blown away by the theater-like quality.
Fast-forward to today, where over-the-air broadcasters, cable companies, satellite providers, and some telcos include a variety of HD content in their programming lineups - but it's digital, not the '80s analog technology. HDTV's 16:9 aspect ratio gives us the widescreen experience, in contrast to standard def's 4:3 aspect ratio. The aspect ratio describes a display's or picture's shape - for instance, 16 units wide by 9 units high. High-def supports increased picture resolution, too. A theater-like experience has indeed been brought into the family rooms of millions of households.
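As an aside, the practical difference between the two shapes is easy to quantify: given a diagonal screen size and an aspect ratio, basic geometry yields the width and height. The short Python sketch below is my own illustration (the 42-inch diagonal is a hypothetical example, not a figure from this column) comparing a 16:9 display with a 4:3 display of the same diagonal.

```python
import math

def screen_dimensions(diagonal, aspect_w, aspect_h):
    """Return (width, height) for a display with the given diagonal
    and aspect ratio, via the Pythagorean theorem: the diagonal of an
    aspect_w x aspect_h rectangle is hypot(aspect_w, aspect_h) units."""
    unit = diagonal / math.hypot(aspect_w, aspect_h)
    return aspect_w * unit, aspect_h * unit

# Hypothetical 42-inch diagonal in both aspect ratios
w_hd, h_hd = screen_dimensions(42, 16, 9)  # widescreen HD shape
w_sd, h_sd = screen_dimensions(42, 4, 3)   # conventional SD shape
print(f"16:9 -> {w_hd:.1f} x {h_hd:.1f} inches")  # 36.6 x 20.6
print(f"4:3  -> {w_sd:.1f} x {h_sd:.1f} inches")  # 33.6 x 25.2
```

Note that at equal diagonals the widescreen set is wider but shorter; this is why "same size" 4:3 and 16:9 displays look so different side by side.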
I had been following HDTV equipment reviews in the consumer press and online for a few years, lamenting the often less-than-stellar performance of many sets on the market. As the technology got better and the cost dropped, I took another look. After convincing the "household finance department" that it might be time to get a new HDTV and move the old set to the basement - and conceding to her, um, suggestions that there would be no way one of those "big, ugly rear projection" HD displays would be allowed on the property - the choice was narrowed down to a handful of flatscreen LCD and plasma sets.
We ended up with a 1080p plasma set, and I sprang for an optional professional calibration. The latter was done by an Imaging Science Foundation-certified technician, who spent the better part of two hours tweaking the set's hidden factory setup and alignment menus. He used some fairly specialized test equipment to measure the display's color temperature, luminance levels, and so forth, while making careful adjustments.
The result was an improvement in picture quality and color accuracy. These professional calibrations have a side benefit: Energy consumption is reduced by anywhere from 10 to 40 percent, depending on display type and original settings.
Oh, I also got a Blu-ray player. Was that a good idea? Yes and no. The "yes" has to do with the absolutely stunning picture quality from Blu-ray discs, which are in 1080p format. The "no" is more general in nature: we as an industry now have a new picture quality standard against which our subscribers can compare the HD programming they're getting via our networks. And I'm afraid that I'm guilty of making those comparisons, too. More on that in a moment.
I picked up an HDTV set-top box from Comcast's local office, a relatively simple task that involved little more than taking in my old digital set-top and swapping it for a new HD box. The hardest part of installing the new set-top was waiting for its initialization, and even that happened fairly quickly.
As of late 2008, Comcast's Denver system carried about 30 HD channels, including the local over-the-air digital lineup, a couple of HD video on demand (VOD) channels, some premium channels such as HBO, and several of the more popular cable channels. Channels I'd like to see in HD that aren't yet available locally include some of my personal favorites: Discovery, History Channel, History International, Speed and Fox News Channel.
Was it worth it? I'd have to say yes. True HD programming on an HD display very much enhances the viewing experience. Even my wife, who is not at all technical, has noticed the difference. This is perhaps best illustrated by her comment about one of Denver's longtime local TV news anchors, after first seeing that anchor in HD: "She looks like she aged 10 years!" Let's just say that HD can show every wrinkle, out-of-place hair and blemish.
From what I've seen, the best picture quality in Comcast's Denver HD lineup is on National Geographic Channel, Science Channel, most of the local broadcast stations (when they're transmitting HD content), and, even though I'm not a sports fan, ESPN and Golf Channel. Most of the other HD channels such as HDNet, Universal HD, Disney, and so on also are quite good. I don't subscribe to the premium HD channels, so I can't comment on them. The worst HD picture quality is a three-way tie: Food Network, HGTV and TNT.
All three of these channels appear to be upconverting SD content - at least that's my guess. The pictures remind me of looking through a wide-angle lens. The sides of the image appear to be stretched, much like what one sees when watching a 4:3 SD picture that has been expanded to fill a 16:9 HD display's wider screen. However, the middle of the picture appears more or less normal.
I asked an East Coast colleague who gets HD programming from a different cable operator to check these channels, and he confirmed what I'm seeing. So this isn't something happening in the local cable system's headend, but is likely the way encoding is being done at the source.
In my opinion, it looks awful.
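The "stretched sides, normal middle" look is characteristic of a nonlinear horizontal stretch, in which a 4:3 source fills a 16:9 frame but the extra width is pushed toward the edges. The Python sketch below is a toy model of my own devising (the cubic curve and its constant are illustrative assumptions, not any broadcaster's or encoder's actual algorithm), showing how such a mapping leaves the center of the picture undistorted while magnifying the sides.

```python
# Normalize both the 16:9 output frame and the 4:3 source frame so
# that -1..1 spans the width of each. A cubic mapping from output x
# to source x concentrates the stretch at the edges.

K = 1 / 3  # chosen so the exact center of the picture is unstretched

def source_x(x_out, k=K):
    """Normalized 4:3 source x coordinate for a 16:9 output x.
    Satisfies source_x(-1) == -1 and source_x(1) == 1, so the full
    source width still fills the full output width."""
    return (1 + k) * x_out - k * x_out ** 3

def magnification(x_out, k=K):
    """Physical horizontal stretch at x_out (1.0 = undistorted).
    Fitting 4:3 into 16:9 contributes a uniform (16/9)/(4/3) = 4/3
    factor; dividing by the mapping's derivative redistributes that
    stretch toward the edges."""
    return (4 / 3) / ((1 + k) - 3 * k * x_out ** 2)

for x in (0.0, 0.5, 1.0):
    print(f"x = {x:+.1f}: {magnification(x):.2f}x stretch")
# x = +0.0: 1.00x stretch
# x = +0.5: 1.23x stretch
# x = +1.0: 4.00x stretch
```

With this (deliberately exaggerated) curve, objects in the middle of the screen keep their proportions while anything near the left or right edge is pulled wide, which matches the wide-angle-lens effect described above.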
What about technical issues? So far, so good. I can count on one hand the number of times minor picture glitches such as tiling have shown up on HD channels during the past few months, and it's difficult to say whether those originated at the source or in the cable network. From time to time, I measure various drop metrics with a quadrature amplitude modulation (QAM) analyzer and have seen no problems there.
With respect to the whole HDTV experience to date, mark me down as very satisfied. Remember the "I want my MTV" slogan and song lyrics of years past? Perhaps it's time to update it to "I want my HDTV."
Ron Hranac is a technical leader, broadband network engineering, for Cisco Systems and senior technology editor for Communications Technology.