Quality has always been a tough metric for cable to define. One of the pioneer companies in television had a slogan that pledged, "Quality goes in before the name goes on," but didn’t say exactly what that meant. Prior to two-way broadband, quality was measured, in part at least, by the lack of pins on a map showing customer outage reports. With the advent of digital delivery of high-speed data content, network parameters such as delay and jitter became indicators of quality of service (QoS). Now, as digital video on demand (VOD) and switched digital video (SDV) emerge as the latest versions of consumer TV, quality of experience (QoE) is the newest watchword for the combination of technology and service that delivers customer satisfaction in a multimedia experience.
   
Cable industry veteran and Symmetricom board member Jim Chiddix describes why cable operators should measure customer experience: "Degraded video quality and other performance issues are real-world obstacles that can quickly increase cable operators’ technical support costs, create customer churn, and dramatically impact revenue streams. A comprehensive solution is needed to accurately measure video quality from content origination through the network all the way to the set-top box." His company and others have approached the problem by analyzing what goes into a customer’s experience of a multimedia event, where degradations can occur, and how to apply monitoring to avoid losing quality. (See Figure 1.)

QoS foundation

The problem is not a simple one to solve. As was the case when our industry began using QoS as a term to define grades of delivery for data and telephony services, there is no universally accepted definition for QoE. The best way to understand what it means is to look at its components, as suggested by companies involved with quality metrics and measurement. As might be expected, all of these companies agree that because QoE is a metric for digital applications, it must include factors that determine QoS.
   
QoS is an indicator of how well an underlying network transports information represented by digitally encoded packets. In a perfect world, all the packets in a data session would arrive at a receiving endpoint in the same order as they were transmitted, with all the information and overhead bits preserved as sent. In the real world, imperfections in transmission media cause pulse degradation or loss, resulting in errored bits. Similarly, switch buffer capacity limitations and routing variations can cause entire packets to be lost, or to arrive out of sequence because they travel across different routes. Even when packets and bits arrive intact, long transport times can degrade content quality. QoS is ensured by tactics such as packet routing prioritization by type, buffering, and placing strict tolerances on maximum end-to-end delay and jitter (variation in travel time) for individual packets.

Telephony building blocks

Although QoS parameters provide an indication of how well packet integrity is preserved, they do not indicate how well the delivered content mimics a real-world experience. Loss of packets representing speech or video is more or less noticeable depending upon where it occurs in a conversation or picture. This perceptual fine-tuning has driven the creation of application-specific experiential models that begin to address QoE. The idea is to provide a correlation between what a consumer expects to see and hear and what that consumer actually sees and hears. The first such measurement of QoE was the Mean Opinion Score (MOS), originally applied to voice telephony.
   
To arrive at a MOS, a tester assembles a panel of typical human beings, who rate the quality of speech and/or video samples that have been processed by the system under test. Each panelist scores the samples from 1 to 5, with 1 indicating the worst quality and 5 the best. The individual scores are then averaged to produce the MOS. The obvious drawback of this method is the need for human evaluators for each test. To overcome it, mathematical models have been built that link subjective MOSs to measurable parameters, first for speech and then for video.
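The arithmetic itself is simple. The short Python sketch below uses hypothetical panel ratings (the function name and scores are inventions for illustration, not any vendor's tool) to show how a handful of 1-to-5 opinions collapse into a single MOS:

```python
# Minimal sketch: average subjective panel ratings into a Mean Opinion Score.
# The ratings are hypothetical; a real test uses a controlled panel,
# standardized source material, and many more evaluators.

def mean_opinion_score(ratings):
    """Return the MOS for a list of ratings on the 1-5 opinion scale."""
    if not ratings:
        raise ValueError("at least one rating is required")
    if any(r < 1 or r > 5 for r in ratings):
        raise ValueError("ratings must fall between 1 and 5")
    return sum(ratings) / len(ratings)

panel = [4, 5, 3, 4, 4, 5, 3, 4]  # hypothetical scores from eight evaluators
print(f"MOS = {mean_opinion_score(panel):.2f}")  # MOS = 4.00
```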
   
The ITU E-model, for example, uses a set of impairment factors derived from predicted network jitter, delay, packet loss and codec performance to rate a network design on a scale of 0-100. The sum of the impairment factors is subtracted from an ideal score of 100 to arrive at an end-to-end quality index. The model was originally designed for evaluating designs rather than networks in service, but has since been applied to samples of live conversations.

Other models have been created to test voice quality in live networks. Perceptual speech quality measurement (PSQM), perceptual analysis measurement system (PAMS), and perceptual evaluation of speech quality (PESQ) produce scores that can also be correlated to MOS. Table 1 provides the correlation typically used by test equipment vendors. Over time, this correlation has gained sufficient credence to be used as the basis for MOSs generated by algorithms in embedded multimedia terminal adapter (EMTA) chipsets.

Data and video

Correlation between measurable parameters and quantitative ratings is being extended to data and video applications. ARRIS uses methodology developed by its Stargus acquisition in the ServAssure Advanced platform to arrive at a set of QoE metrics related to degradation over time. Test probes inserted at various points in the network measure parameters such as vendor-specific identification, multiple DOCSIS and proprietary performance metrics, and DOCSIS configuration settings. Derived data from these measurements are analyzed against a set of thresholds to determine one of three states that correlate to user experience, as shown in Table 2.

Measuring QoE for video is more demanding than measuring audio or even data QoE. Video packet loss or delay produces a range of degradations, depending not only upon what is represented by the lost bits, but also upon where they reside in compressed video frame sequences. Loss of the high-frequency components of a rocky landscape carried in an MPEG P frame, for example, may be barely discernible, while lost packets in an I frame may cause video to freeze.
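That frame-type dependence can be made concrete with a small, purely illustrative sketch. The weights and function below are hypothetical, not any vendor's scoring algorithm; the point is that two streams with identical raw loss counts can have very different viewer impact depending on which frame types absorb the losses:

```python
# Illustrative only: hypothetical severity weights for packet loss by MPEG
# frame type. Losing part of an I frame corrupts every frame predicted from it
# until the next I frame, so it is weighted far more heavily than a loss
# confined to a single P or B frame.
FRAME_LOSS_WEIGHT = {"I": 10.0, "P": 3.0, "B": 1.0}

def weighted_loss_impact(lost_packets):
    """lost_packets: list of (frame_type, packets_lost) pairs seen by a probe."""
    return sum(FRAME_LOSS_WEIGHT[ftype] * count for ftype, count in lost_packets)

# Same raw loss count (six packets each), very different perceptual impact:
mostly_b_losses = [("B", 5), ("P", 1)]
i_frame_losses = [("I", 5), ("P", 1)]
print(weighted_loss_impact(mostly_b_losses))  # 8.0  -> likely barely visible
print(weighted_loss_impact(i_frame_losses))   # 53.0 -> likely freezing or blocking
```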
   
Using multiple probes that measure different conditions at various points in the network can therefore provide better correlation between parameter measurement and viewer-perceived QoE. Symmetricom applies this method in its V-FACTOR platform for cable. The Q-1000 headend video analyzer component of that platform looks at baseband source video in real time to detect transcoding or other encoding impairments. V-FACTOR combines the Q-1000 output with data from network probes, software agents, and network operations center (NOC) software to correlate quality of the baseband signal with quality of compressed video in the network and decoded video at the set-top.
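One way to picture what multi-probe correlation buys is the conceptual sketch below. It is not the V-FACTOR implementation; the probe points, scores and threshold are assumptions. Given quality scores taken at successive points in the delivery chain, it simply reports the first hop where quality falls off, which tells engineers where to start looking:

```python
# Conceptual sketch, not Symmetricom's V-FACTOR implementation. Probe names,
# scores and the threshold are hypothetical. Given quality scores from probes
# at successive points in the delivery chain, report the first hop where the
# score drops sharply.

def localize_degradation(probe_scores, drop_threshold=5.0):
    """probe_scores: ordered list of (probe_point, quality score 0-100) pairs."""
    for (prev_point, prev_score), (point, score) in zip(probe_scores, probe_scores[1:]):
        if prev_score - score > drop_threshold:
            return f"quality drop introduced between {prev_point} and {point}"
    return "no significant degradation detected along the chain"

chain = [
    ("baseband source", 96.0),
    ("encoder output", 94.0),
    ("network edge", 93.0),
    ("set-top decode", 71.0),  # large drop points at the access network or STB
]
print(localize_degradation(chain))
```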
   
Stefan Winkler, a principal technologist in Symmetricom’s QoE Assurance Division, explained how V-FACTOR correlation works. He said: "Impairments come from various sources, including source content, encoders, transcoders, video transmission/distribution and set-top boxes. Based on the Moving Picture Quality Metric (MPQM) model adapted specifically for end-to-end video quality scoring, our solution analyzes the actual video content and combines content impairments with network performance data in order to precisely identify subscriber quality issues. By filtering the results through a sophisticated human vision system (HVS) model, we enable cable operators to understand what quality issues matter to consumers by providing a simple, yet highly accurate video quality score."
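The general shape of that idea, stripped of the MPQM and HVS modeling Winkler describes, can be sketched as follows. Everything here is hypothetical, including the coefficients; it only illustrates folding network-measured impairments into a content-based score to produce one reportable number:

```python
# Toy illustration only. This is not the MPQM/HVS model: a content-based
# quality score (0-100) is reduced by penalties driven by network
# measurements, then clamped so it can be reported as a single bounded
# number. All coefficients are hypothetical.

def combined_video_score(content_score, packet_loss_pct, jitter_ms):
    """content_score comes from analyzing the video itself; the other two
    arguments come from network probes."""
    penalty = 8.0 * packet_loss_pct + 0.2 * jitter_ms  # assumed weights
    return max(0.0, min(100.0, content_score - penalty))

print(combined_video_score(92.0, packet_loss_pct=0.5, jitter_ms=15.0))  # 85.0
```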
   
Correlating measurements taken at different points in the network requires that the samples be synchronized in time, which may explain Symmetricom’s new interest in QoE. The company has built its business around precision timing and was a primary author of the DOCSIS 3.0 timing specification.

In the field

QoE measurement is just emerging as a metric, but it is finding acceptance in a wide range of applications. Psytechnics, the holder of several of the patents on early voice quality measurement, has expanded its early work into the video space, but Benjamin Ellis, Psytechnics VP of marketing, notes that users of his company’s Experience Manager are typically enterprise customers. "Most of the use of quality of experience monitoring is for enterprise video networks in applications such as video conferencing," he said. "In these applications, avoiding digital artifacts and maintaining lip synchronization between audio and video is important to providing an experience that closely resembles a face-to-face meeting." Interestingly, this early acceptance by enterprise users parallels the way voice quality measurement gained its foothold as a tool.
   
Cable video equipment vendors are beginning to include video quality measurement in the products used for video distribution. Imagine Communications, which offers systems of broadcast processors and multiplexers, bundles its proprietary ICE-Q quality software with its product.
   
Cable operators are also beginning to address video quality measurement as a working tool. Charter VP Tom Gorman said: "We’re at the early stages now. I don’t believe we are widely using video quality measurement now, but it is a concern." Symmetricom is discussing its solution with two major operators, and ARRIS ServAssure Advanced is deployed in systems operated by Time Warner, Bresnan, Charter, and several international cable operators.

Justin J. Junkus is president of KnowledgeLink and telephony editor for Communications Technology. Reach him at [email protected].
