Social networks such as YouTube and Facebook have the power to make content go "viral," spreading it at an unprecedented and uncontrollable pace. That seems harmless enough when it's a cat video, but when it's a murder, for example, the difficulty of stopping the virus becomes glaring.
After the New Zealand mosque shootings were streamed live in March, attempts to remove the video from online platforms proved hopeless. Facebook took it down 1.5 million times, and it still reappeared. A new copy was uploaded to YouTube every second for 24 hours, according to New Zealand's Prime Minister Jacinda Ardern, who is teaming up with French President Emmanuel Macron to try to tackle the plague of harmful online content.
The two leaders' initiative, known as the "Christchurch Call," is the start of a long process that will need to balance all kinds of issues – from freedom of speech to privacy. But for it to have any effect at all, policy makers must get under the hood of the social media companies' software and understand how content gets fed to viewers in the first place. While Macron has taken the encouraging first step of embedding government officials in Facebook's offices to monitor how it polices content, it remains to be seen how well these officials will grasp the technology, and how much access they will have to the company's real "secret sauce": its algorithms.
This is definitely an interesting change of direction, though. Much of the debate has been about how to get the tech giants to better regulate themselves. That has included deploying more human moderators to police content, beefing up artificial intelligence tools to stop bad stuff spreading, and "de-platforming" the worst online offenders. But it's been like fighting a forest fire with a water gun. The heart of the problem lies with those dopamine-feeding algorithms.
Alphabet Inc.'s Google and Facebook Inc. are in the business of advertising, and it's their ability to grab the attention of users that makes them so powerful and "sticky." For all the negative headlines about data breaches and toxic content over the past year, Facebook and YouTube both still reach about 2 billion users a month. That's because they're very good at three things: knowing what users want, serving it automatically, and encouraging a feedback loop of engagement. This is all thanks to the constant fine-tuning and updating of the algorithms that dictate content filtering, promotion and recommendations. The aim is to keep people on the website for as long as possible with minimal effort. It works: The human mind is no match for a supercomputer.
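The feedback loop described above can be illustrated with a toy sketch. Everything here is hypothetical – the function names, the scoring rule and the data are invented for illustration, and real platform ranking systems are vastly more complex and proprietary – but it shows the basic dynamic: rank by predicted engagement, then fold what the user watched back into their profile, so the next ranking leans even harder in the same direction.

```python
# Toy sketch of an engagement-driven recommendation loop.
# All names, weights and data are hypothetical illustrations.

def predicted_engagement(user_interests, item):
    """Score an item by how many of its topics overlap with past interests."""
    return sum(1.0 for topic in item["topics"] if topic in user_interests)

def recommend(user_interests, candidates, k=3):
    """Rank candidates by predicted engagement and return the top k."""
    return sorted(
        candidates,
        key=lambda item: predicted_engagement(user_interests, item),
        reverse=True,
    )[:k]

# The loop: watching reinforces the profile, which biases the next ranking.
history = {"cats"}
videos = [
    {"id": "a", "topics": ["cats", "pets"]},
    {"id": "b", "topics": ["news"]},
    {"id": "c", "topics": ["cats", "outrage"]},
]
top = recommend(history, videos, k=2)
for video in top:
    history.update(video["topics"])  # engagement feeds back into the profile
```

In this sketch the two cat-tagged videos outrank the news item, and after one round the profile has absorbed their topics – including "outrage" – which is the self-reinforcing dynamic critics worry about.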
Where it all gets complicated is the potential business conflict between maximizing engagement, which is what advertisers (and shareholders) want, and minimizing extreme content, which is what politicians are demanding. For instance, if the network's goal is to increase time spent watching videos, the problem is that its algorithms will tend to promote content that is most likely to trigger an emotional reaction. That's catnip for advertisers, but also for purveyors of noxious content.
And just how much control do these companies have over what their black-box filters pump out? This is a question being asked by Ardern. The Christchurch killer's video was designed to go viral, with all the emotional force that works so well with the algorithms, and the social networks were unable to stop it.
Facebook's founder Mark Zuckerberg has made much of his desire to be regulated to try to fix the problem of extreme content and fake news, but it's too early to say whether initiatives like Macron's will really be given the keys to the kingdom. Bruno Patino, dean of Sciences Po's journalism school, makes some useful suggestions in his new book, "The Goldfish Civilization": greater transparency on how the algorithms function, a more ethical approach to how they are designed in the first place, and a much clearer divide between advertising and content. If that leaves new media looking more like old media – and with a potential valuation discount to boot – it seems a worthwhile price to pay.
To contact the author of this story: Lionel Laurent at firstname.lastname@example.org
To contact the editor responsible for this story: James Boxell at email@example.com
This column does not necessarily reflect the opinion of the editorial board or Bloomberg LP and its owners.
Lionel Laurent is a Bloomberg Opinion columnist covering Brussels. He previously worked at Reuters and Forbes.
© 2019 Bloomberg L.P.