Anonymous ID: 78cb63 Jan. 26, 2019, 7:29 a.m. No.4915691   >>5712

YouTube is making changes to its recommendation algorithm, which serves up new videos for users to watch, in an effort to crack down on the spread of conspiracy theories on its platform.

 

In a blog post on Friday, the Google-owned company said it would start reducing its recommendations of “borderline content” and videos that may misinform users in “harmful ways.”

 

“Borderline content” includes videos promoting fake miracle cures for serious diseases, claiming the Earth is flat, or making blatantly false claims about historic events such as 9/11, according to the company, which did not provide further examples. Such content doesn’t violate YouTube’s community guidelines, but the company says it comes close.
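
The post doesn't say how YouTube actually implements this, and the company hasn't published details. As a rough illustration of the general idea, here is a minimal Python sketch assuming a hypothetical policy classifier that scores each video and a fixed demotion factor; every name, threshold and score below is invented for illustration, not YouTube's implementation:

# Hypothetical sketch: demote, rather than remove, videos a policy
# classifier flags as "borderline". All values here are assumptions.
from dataclasses import dataclass

@dataclass
class Video:
    video_id: str
    relevance: float         # base relevance score from the recommender
    borderline_score: float  # 0.0-1.0 output of a hypothetical policy classifier

BORDERLINE_THRESHOLD = 0.8   # assumed cutoff for treating a video as borderline
PENALTY = 0.1                # assumed multiplier applied to flagged videos

def rank(candidates: list[Video]) -> list[Video]:
    """Order candidates, down-weighting videos flagged as borderline."""
    def adjusted(v: Video) -> float:
        if v.borderline_score >= BORDERLINE_THRESHOLD:
            return v.relevance * PENALTY  # demoted, but still on the platform
        return v.relevance
    return sorted(candidates, key=adjusted, reverse=True)

if __name__ == "__main__":
    pool = [
        Video("cat-tips", relevance=0.9, borderline_score=0.05),
        Video("flat-earth-proof", relevance=0.95, borderline_score=0.97),
    ]
    for v in rank(pool):
        print(v.video_id)
    # "cat-tips" ranks first: the flagged video is not taken down,
    # it is simply no longer surfaced ahead of ordinary content.

The point of the sketch is the distinction the blog post draws: borderline videos stay on the platform (they don't violate the guidelines) but stop being recommended as aggressively.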

 

YouTube has long faced criticism for allowing misinformation, conspiracy theories and extremist views to spread on its platform, and for recommending such content to users. People who came to the site to watch videos on innocuous subjects or to see mainstream news have been pushed toward increasingly fringe and conspiratorial content.

 

This month, a Washington Post investigation found that a YouTube search for Supreme Court Justice Ruth Bader Ginsburg’s initials, “RBG,” turned up far-right videos, some of which falsely alleged that doctors are keeping her alive with illegal drugs, and that these results outnumbered those from reliable news sources. A BuzzFeed News investigation on Thursday found that after news-related searches, YouTube suggested conspiracy videos and content from hate groups. After the mass shooting at a high school in Parkland, Florida, last year, a top trending video on YouTube suggested survivor David Hogg was an actor.

 

YouTube has also faced backlash for running ads on extremist content. A CNN investigation last year found ads from over 300 companies on YouTube channels promoting white nationalists, Nazis, pedophilia, conspiracy theories and North Korean propaganda.