The Internet, and social networks especially, were designed for sharing. A week ago, Zuckerberg promised changes that could reduce the virality of certain posts.
When, in September 2017, Mark Zuckerberg had to answer for his company's role in Russia's attempts to manipulate the U.S. presidential election, Facebook's CEO wrote in a post on his social network: "We don't verify what people say before they say it, and frankly, I don't think society wants us to. Freedom means that you don't have to ask for permission first, and you can say whatever you want," Infobae reported.
This Friday, after an armed man used Facebook's streaming service to broadcast live the massacre of dozens of people at a mosque in New Zealand, Zuckerberg's words take on new meaning.
The point is that, regardless of the statements and responses that Twitter, YouTube, Reddit and other social networks give about the role they played in disseminating images and videos of the Christchurch shooting, what should be kept in mind is that "the networks worked exactly as they are designed to: to allow humans to share what they want, when they want, with as many people as they want," wrote journalist Peter Kafka in an article published by Recode.
“Of course, Facebook doesn’t want murderers to broadcast their crimes around the world. But social networks are a tool that allows you to do exactly that. A tool that is on a platform that is fundamentally built to allow people to say what they want, without asking permission first,” Kafka added.
And that's exactly the key to Facebook's fabulous success as a company: users supply the content and the software designed by Facebook distributes it around the world, instantly. More than a billion people – users and advertisers – upload what they want to the social network, without human intervention. The fact that Facebook doesn't check comments, ads or (almost) anything else before it's published is also what gives it strong legal protection, especially in the United States: if there's something offensive or illegal on Facebook, it's not because Facebook put it there, but because someone put it on Facebook.
Indeed, this is the model of all the platforms built by the giants that have come out of Silicon Valley in the last decade: YouTube and Twitter don't review comments or videos before they're uploaded, and Airbnb doesn't vet listings beforehand.
In his 2017 post, Zuckerberg explained that questionable content would be removed after being uploaded, as happened with the shooter's account shortly after the live broadcast. The company says it will invest billions of dollars in a combination of software and human reviewers to combat abuses in the future.
Last week, also in a post on his social network, Zuckerberg announced that he will shift Facebook's focus toward more personal, encrypted communication. The future of the network, the CEO explained, will be less "public" and more "privacy-focused." However, even after the changes, Facebook would still allow someone like the New Zealand shooter to do what he did yesterday.
It is possible that the changes Facebook implements will reduce the virality of such images, but they would not prevent them from being uploaded to the platform. Moderation may also become much harder for Facebook, because in last week's announcement Zuckerberg explained that the company plans to fully encrypt messages ("end-to-end encryption prevents anyone, including us, from seeing what people share in our services"). If Facebook can't "look" at the content of messages, it can hardly control it.
In his 2017 post, after explaining that Facebook does not censor messages in advance, Zuckerberg said: "If you break the rules of our community or the law, then you will face the consequences." Journalist Peter Kafka added: "It's hard to imagine what consequences Facebook can impose on a person who killed dozens of people today. And it's hard to imagine that this won't happen again."