
YouTube sharpens how it recommends videos despite fears of isolating users

REUTERS
Published : Nov 29, 2017, 5:44 pm IST
Updated : Nov 29, 2017, 5:44 pm IST

The goal is to prevent the negative sentiments that can arise when people watch hours and hours of uninspired programs.


Google’s YouTube has updated its recommendation feature to spotlight videos users are likely to find the most gratifying, brushing aside concerns that such an approach can trap people in bubbles of misinformation and like-minded opinions. The new feature, which arrived in January but has not previously been reported, uses a measure of satisfaction derived from a massive and ongoing user survey to predict and promote videos that people would rank as among the best they have watched recently.

The goal is to prevent the negative sentiments that can arise when people watch hours and hours of uninspired programs, said Jim McFadden and Cristos Goodrow, who work on recommendation technology at YouTube, which is part of Alphabet Inc. But the change comes at a time when YouTube and other social media firms are facing heavy criticism from advertisers, regulators and advocacy groups for failing to police content and account for the way their services shape public opinion.

Russian agents exploited the recommendation systems of Facebook Inc, Twitter Inc and YouTube to popularize propaganda and fake news during the 2016 US presidential election. The companies responded with increased user verification and fact-checking tools, but their recommendations remain focused on winning the attention and boosting the enjoyment of users.

“The risk is that we’re not just siloing ourselves, but we’re able to also reinforce pre-existing, flawed viewpoints,” said Jacob Groshek, a Boston University associate professor who researches the influence of social media and ‘filter bubbles.’

YouTube automatically recommends videos through a machine learning algorithm that analyzes the characteristics of videos and the behaviour of its 1.5 billion users to generate personalized viewing recommendations. These recommendations, which appear on YouTube’s homepage and alongside clips, have become a centrepiece of the service, encouraging people to watch videos that are similar to ones they have spent significant time viewing in the past. Recommendations now drive 70 percent of overall 'watch time' on YouTube, compared with 40 percent in early 2014, the company said.
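The article does not describe the algorithm itself, but the mechanism it points to, a model that ranks candidate videos by how closely they match what a user already spends time watching, can be sketched in rough outline. The data structures, names and scoring rule below are hypothetical illustrations, not YouTube's implementation.

```python
# Illustrative sketch only -- not YouTube's code. It shows, in miniature, how
# ranking by similarity to past watch time keeps serving more of the same.
from dataclasses import dataclass, field


@dataclass
class Video:
    video_id: str
    topic_weights: dict[str, float]  # hypothetical content features, e.g. {"sports": 0.8}


@dataclass
class UserProfile:
    # Accumulated watch time per topic: a stand-in for the "behaviour" signals above.
    topic_watch_time: dict[str, float] = field(default_factory=dict)

    def record_watch(self, video: Video, seconds_watched: float) -> None:
        for topic, weight in video.topic_weights.items():
            self.topic_watch_time[topic] = (
                self.topic_watch_time.get(topic, 0.0) + weight * seconds_watched
            )


def score(user: UserProfile, video: Video) -> float:
    """Higher when the video's topics overlap with what the user already watches most."""
    return sum(
        weight * user.topic_watch_time.get(topic, 0.0)
        for topic, weight in video.topic_weights.items()
    )


def recommend(user: UserProfile, candidates: list[Video], k: int = 3) -> list[Video]:
    """Return the k candidates the user is predicted to spend the most time on."""
    return sorted(candidates, key=lambda v: score(user, v), reverse=True)[:k]


# Example: a user who has watched a lot of sports gets more sports recommended.
user = UserProfile()
user.record_watch(Video("v1", {"sports": 1.0}), seconds_watched=600)
picks = recommend(user, [Video("v2", {"sports": 0.9}), Video("v3", {"cooking": 1.0})])
print([v.video_id for v in picks])  # ["v2", "v3"]
```

In a production system the score would come from a learned model over far more signals, but the basic dynamic is the same: heavier past watch time on a topic pushes similar videos up the ranking, which is how the feedback loop that critics call a filter bubble can form.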

The more time people spend watching, the more ad slots YouTube can sell. Sales of YouTube commercials are among Google’s top growth areas. By last year, YouTube’s prediction tool had matured, said McFadden, a software engineer at YouTube since 2011. He said the idea of pinpointing “satisfaction” came after he had watched 'particularly good' videos, including a commencement speech by the late author David Foster Wallace.

“You listen to it and say this was really good,” McFadden said. But “there’s nothing really in our data about how much I like this.”

He worried that too many people felt their hours each night watching sports highlights, comedy clips and makeup tutorials were a waste. Now YouTube is gauging satisfaction by surveying nearly 10 percent of users about which videos they enjoy. One version of the survey asks whether a video watched in the last week was ‘one of the best,’ ‘great,’ ‘about average,’ ‘poor’ or ‘one of the worst.’

The feedback is a fresh data point in the recommendation algorithm. Less emphasis is now placed on actions that may be a proxy for enjoyment but are used with varying intent, such as “thumbs up” and “thumbs down” ratings.
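As a rough illustration of how survey feedback might be folded into a ranking score, the sketch below maps the five survey responses quoted above to numbers and blends them with other signals, giving satisfaction more weight than thumbs votes. The numeric mapping, the weights and the function name are assumptions for illustration only, not YouTube's formula.

```python
# Hypothetical illustration: the five-point survey scale comes from the
# article, but the numeric mapping and weights below are assumptions.
from typing import Optional

SURVEY_SCORES = {
    "one of the best": 1.0,
    "great": 0.75,
    "about average": 0.5,
    "poor": 0.25,
    "one of the worst": 0.0,
}


def blended_score(predicted_watch_time: float,
                  thumbs_signal: float,
                  survey_response: Optional[str] = None) -> float:
    """Blend signals into one ranking score (all inputs normalised to 0..1),
    weighting survey-derived satisfaction more heavily than thumbs votes."""
    satisfaction = SURVEY_SCORES.get(survey_response, 0.5) if survey_response else 0.5
    return 0.5 * predicted_watch_time + 0.4 * satisfaction + 0.1 * thumbs_signal


# A video reported as "one of the best" outranks one with only strong thumbs-up votes.
print(blended_score(0.6, 0.2, "one of the best"))  # ~0.72
print(blended_score(0.6, 0.9))                      # ~0.59
```

Under a blend of this kind, a video users report as “one of the best” gets promoted regardless of its accuracy, which is the concern the executives acknowledge next.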

YouTube executives acknowledge that the approach can help misinformation spread. A user who says a video describing the moon landing as a hoax was among the best they watched in the last week would cause that video, and similar ones, to be recommended more widely.

“We would love it if the satisfaction mechanism pushed down videos about ‘we never landed on the moon’ but people will report satisfaction on quite a variety of things,” Goodrow, vice president of engineering at YouTube, said in an interview.

The company releases neither recommendation nor satisfaction data about individual videos. Johanna Wright, vice president of product management at YouTube, said in an interview that the company is taking steps to combat misinformation, including giving greater prominence to well-known media organizations in search results on trending topics.

Next year, YouTube is planning a similar initiative around science videos to surface "the established belief on the topic," she said.

Still, YouTube’s chief goal is to maximize viewing time. Alphabet Executive Chairman Eric Schmidt said recently that there was little the company could do absent a bigger societal change. The problem of filter bubbles will persist, Schmidt told an international security conference on November 18, 'until we decide collectively' that users should see content from 'someone not like you.' Critics reject the notion that YouTube is powerless.

Guillaume Chaslot, a former member of YouTube’s recommendations engineering team who left Google in 2013 and is working to launch a nonprofit group to investigate social media algorithms, said YouTube could experiment more or release data about recommendations to researchers. He worries, though, that YouTube will not act until public outcry grows severe or existing tactics impair watch time. “Users are not asking YouTube to optimize for truth,” Chaslot said.

Tags: youtube, social media, google, videos