Focussing on the difficulty of ‘moderating’ vile content obscures the real problem

Good op-ed piece by Charlie Warzel:

Focusing only on moderation means that Facebook, YouTube and other platforms, such as Reddit, don’t have to answer for the ways in which their platforms are meticulously engineered to encourage the creation of incendiary content, rewarding it with eyeballs, likes and, in some cases, ad dollars. Or how that reward system creates a feedback loop that slowly pushes unsuspecting users further down a rabbit hole toward extremist ideas and communities.

On Facebook or Reddit this might mean the ways in which people are encouraged to share propaganda, divisive misinformation or violent images in order to amass likes and shares. It might mean the creation of private communities in which toxic ideologies are allowed to foment, unchecked. On YouTube, the same incentives have created cottage industries of shock jocks and livestreaming communities dedicated to bigotry cloaked in amateur philosophy.

The YouTube personalities and the communities that spring up around the videos become important recruiting tools for the far-right fringes. In some cases, new features like “Super Chat,” which allows viewers to donate to YouTube personalities during livestreams, have become major fund-raising tools for the platform’s worst users — essentially acting as online telethons for white nationalists.