This morning’s Observer column:
Zeynep Tufekci is one of the shrewdest writers on technology around. A while back, when researching an article on why (and how) Donald Trump appealed to those who supported him, she needed some direct quotes from the man himself and so turned to YouTube, which has a useful archive of videos of his campaign rallies. She then noticed something interesting. “YouTube started to recommend and ‘autoplay’ videos for me,” she wrote, “that featured white supremacist rants, Holocaust denials and other disturbing content.”
Since Tufekci was not in the habit of watching far-right fare on YouTube, she wondered if this was an exclusively rightwing phenomenon. So she created another YouTube account and started watching Hillary Clinton’s and Bernie Sanders’s campaign videos, following the accompanying links suggested by YouTube’s “recommender” algorithm. “Before long,” she reported, “I was being directed to videos of a leftish conspiratorial cast, including arguments about the existence of secret government agencies and allegations that the United States government was behind the attacks of 11 September. As with the Trump videos, YouTube was recommending content that was more and more extreme.”