Artificial Intelligence: From Self-Improving Search to Voice Recognition
Artificial intelligence is often associated with self-driving cars or robots. It's also feared by some who believe humans will eventually be replaced by machines. But artificial intelligence isn't robotics—it's a technology used to help society achieve goals, according to Google Public Policy and Government Relations Counsel Tim Hwang. From better matching patients with resources in the health care system to improving Google searches, AI permeates all aspects of our lives. The Technology Policy Institute's "Artificial Intelligence: The Economic and Policy Implications" event on September 12 at the National Press Club in Washington, D.C. further clarified and discussed the impact of artificial intelligence.
One aspect of AI feeds data through algorithms to detect patterns. It is improving search engines by making them more personalized for users. For example, the Google search "cats playing the piano" presents results by relevance. As more and more people search for cats playing the piano, the results become more accurate. "Cat" begins to prompt "piano." "Piano" begins to prompt "cat." This self-improving cycle uses data to help achieve the user's goal of watching a video of a cat playing the piano. Artificial intelligence is "enabling the next generation of search," said Hwang on a panel. That next generation will arrive faster than we think. "The pace of change is really remarkable," Hwang said.
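The self-reinforcing cycle described above can be sketched in miniature. The snippet below is a toy illustration (not Google's actual system): it simply counts which terms appear together in logged queries, so that after repeated searches, "piano" begins to prompt "cat" and vice versa.

```python
from collections import Counter, defaultdict

class QuerySuggester:
    """Toy model of the self-improving suggestion loop: every logged
    query strengthens the association between its terms, so frequent
    pairings like 'cats' and 'piano' begin to prompt each other."""

    def __init__(self):
        # For each term, count the other terms it has co-occurred with.
        self.cooccurrence = defaultdict(Counter)

    def log_query(self, query):
        terms = query.lower().split()
        for term in terms:
            for other in terms:
                if other != term:
                    self.cooccurrence[term][other] += 1

    def suggest(self, term, n=3):
        # Return the terms most often searched alongside `term`.
        return [t for t, _ in self.cooccurrence[term.lower()].most_common(n)]

suggester = QuerySuggester()
for _ in range(5):
    suggester.log_query("cats playing the piano")
suggester.log_query("cats sleeping")

print(suggester.suggest("piano"))  # 'cats' ranks among the top suggestions
```

Each new query makes the counts, and therefore the suggestions, a little more accurate; this is the data-driven feedback loop in its simplest form.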
Meanwhile, the pay-TV industry has already embraced voice search capabilities, and their prevalence is rising. Comcast launched a voice remote in May 2015, and DISH followed this past July. As voice-enabled search continues to spread, the customer experience will improve—and in today's competitive landscape, that is crucial. According to Digitalsmiths' Q2 2016 Video Trends Report, 65.9% of respondents get frustrated "always" or "sometimes" when trying to find something to watch on TV, an increase of 2.9% from the previous quarter. Additionally, 35.9% of respondents use a third-party service, such as Rotten Tomatoes, IMDb, or Moviefone, to help them find something to watch. Another area for pay-TV providers to focus on is content recommendations. Among Digitalsmiths respondents who do not have access to content recommendations, 47.2% would like to receive the feature, an 11.5% increase over three years.
And on Tuesday, Food Network announced it has launched a skill, or voice-enabled search capability, on Amazon devices. Users of the Amazon Echo, Echo Dot, Amazon Tap, and Amazon Fire TV can enable Food Network's skill to gain access to show times and information, recipes, and more. Consumers can ask questions like "Alexa, what's on Food Network right now?" or "Alexa, send me the recipes from the show I'm watching right now." "Voice is emerging as a way for consumers to connect and interact with brands in a way that provides value in their lives, and this launch is a first step toward more utility-based services down the road as we integrate with Alexa," said Liesel Kipp, vice president of product management for Scripps Networks Interactive, parent company of Food Network. Third parties have added a growing number of features through Alexa skills, which number more than 3,000 to date. Food Network's involvement is another example of AI supporting improvements in search capabilities to enhance the user experience. Expect to see plenty more in our AI-enabled future.
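At its core, a voice skill maps spoken utterances to named intents and routes each intent to a handler. The sketch below is a simplified, hypothetical dispatcher for illustration only; a production skill would be built with Amazon's Alexa Skills Kit, and the patterns and response strings here are invented for this example.

```python
import re

# Hypothetical handlers: in a real skill these would call backend services
# (schedule lookups, recipe delivery), which are out of scope here.
def whats_on_now(_match):
    return "Here's what's on Food Network right now."

def send_recipes(_match):
    return "Okay, sending you the recipes from the current show."

# Each intent pairs an utterance pattern with its handler.
INTENTS = [
    (re.compile(r"what'?s on food network", re.I), whats_on_now),
    (re.compile(r"send me the recipes", re.I), send_recipes),
]

def handle_utterance(utterance):
    """Route a spoken request to the first matching intent handler."""
    for pattern, handler in INTENTS:
        match = pattern.search(utterance)
        if match:
            return handler(match)
    return "Sorry, I don't know how to help with that."

print(handle_utterance("Alexa, what's on Food Network right now?"))
```

The design mirrors how skills generally work: speech recognition produces text, the text is matched against intent patterns, and the matched intent triggers a response—the "utility-based services" Kipp describes would live inside those handlers.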