Tough questions are being asked about the role of social media in the wake of the horrific shooting that took the lives of at least 49 people at two New Zealand mosques. Sadly, tough questions with no easy answers.
The 28-year-old alleged white supremacist gunman livestreamed the rampage via helmet-cam on Facebook and Twitter. Footage of the massacre continued to circulate for hours after the shooting, despite frantic efforts by Facebook, YouTube, Twitter and Reddit to take it down as quickly as possible. Each company issued the requisite statement condemning the terror, and each has a code of conduct that is nonetheless sometimes violated.
Ahead of the attack, the shooter posted a hateful, since-removed 74-page manifesto on Twitter.
And during the killing, he apparently referenced divisive YouTube star PewDiePie, who for the record subsequently tweeted, “I feel absolutely sickened having my name uttered by this person.”
“The attack on New Zealand Muslims today is a shocking and disgraceful act of terror,” said David Ibsen, executive director of the Counter Extremism Project (CEP), a non-profit, non-partisan global policy organization. “Once again, it has been committed by an extremist aided, abetted and coaxed into action by content on social media. This poses once more the question of online radicalization.”
Mia Garlick from Facebook New Zealand issued a statement Friday, saying that, “since the attack happened, teams from across Facebook have been working around the clock to respond to reports and block content, proactively identify content which violates our standards and to support first responders and law enforcement. We are adding each video we find to an internal database which enables us to detect and automatically remove copies of the videos when uploaded again. We urge people to report all instances to us so our systems can block the video from being shared again.”