People who document evidence of violent extremism are being shut down in Youtube’s crackdown on violent extremism

Yesterday, Youtube announced that it would shut down, demonetize, and otherwise punish channels that promote violent extremism, “supremacy,” and other forms of hateful expression; predictably enough, this crackdown has swept up some of the world’s leading human rights campaigners, who publish Youtube channels full of examples of human rights abuses in order to document them and prompt the public and governments to take action.


One prominent dolphin caught in Youtube’s tuna net is Ford Fischer, editor in chief of News2share, whose work is widely used by governments, prosecutors, documentarians, and mainstream news outlets. Fischer’s videos capture some of the best-documented instances of Holocaust denial, incitements to violence, and other extremist acts by far-right figures.

Youtube’s letters to Fischer say that his videos were “carefully reviewed” by a “team of policy specialists” and found to be in violation of Youtube’s policies. All his videos have been demonetized and two have been taken down: one showed antifa protesters arguing with a Holocaust denier; the other was a speech by the American Nazi Michael “Enoch” Peinovich, which has been used in documentaries from PBS and the New York Times.


Youtube isn’t the only outlet that has taken down Fischer’s videos: Facebook has also removed his footage in its sweep of “extremist” content.


Fischer links his woes to Youtube’s bungled handling of the racist homophobe Steven Crowder, which he says has prompted the company to play a game of both-sidesism, setting up a “false equivalence” between racists and the people who oppose racists and document their racism.


Some timely reading: Caught in the Net: The Impact of “Extremist” Speech Regulations on Human Rights Content, a report by the Electronic Frontier Foundation’s Jillian C York: “The examples highlighted in this document show that casting a wide net into the Internet with faulty automated moderation technology not only captures content deemed extremist, but also inadvertently captures useful content like human rights documentation, thus shrinking the democratic sphere. No proponent of automated content moderation has provided a satisfactory solution to this problem.”


(via Naked Capitalism)

from Boing Boing https://boingboing.net/2019/06/06/dolphins-in-tuna-nets.html