For 48 hours, leading Jewish and non-Jewish celebrities, politicians and others left Twitter. The walkout followed the failure of both Instagram and Twitter to act against a litany of anti-Jewish, racist and hateful posts by the grime artist Richard Cowie, better known as Wiley. Cowie’s messages were widely shared and likely absorbed by his fans, many of them impressionable young people.
Jewish individuals, organisations and allies are beginning to return to the digital platforms after the walkout. The key question is: what now? The answer is action, and the focus must be the Government’s Online Harms Bill.
For those who have not followed the story to date, there has been work for more than ten years, not just in the UK but across the world, to encourage governments to introduce appropriate regulation for social media and other online platforms. Lord John Mann, then an MP in the Commons, worked to establish an international parliamentary coalition which brought together social media companies’ heads of safety for the first time. Despite some progress, and much urging from ministers since then, voluntary efforts have been insufficient.
Eventually Australia introduced a licensing scheme, and Germany introduced a substantial fining structure and corporate responsibility model to tackle online harms. In Britain, our Government declared an intention to be a world leader, promising an Internet Safety Strategy that would ensure both that businesses could thrive and that users could be safe online. First there was a Green Paper in 2017-18, then the Online Harms White Paper. More than 15 months have now passed since the Government began consultation on the latter, but a full response has yet to be issued. The Government promised a full Bill in July, but the House of Commons is now in recess until September.
So, if the question is what can be done now, bringing the Bill forward could be the answer. However, that Bill must not be weak if we are to be the world leaders the Government wishes us to be.

First, it must establish the right regulator. That means Ofcom, the UK’s communications regulator, with additional expertise drawn in from bodies such as the British Board of Film Classification (BBFC).

Second, we need a statutory Duty of Care on the books. That should mean not only that social media and internet companies must have Terms of Service or Community Standards in place, but that those Terms or Standards should meet minimum thresholds. Platforms should be accountable for their Terms and must face sanction when they fail to enforce them.

Third, similar to the liability held by Financial Services or Health and Safety executives in the UK, the leaders of these platforms should be made personally liable for major breaches of the Duty of Care. This would force social media executives to act. It should not fall to minority communities always to have to prove the case for action.

Fourth, we need to see action on legal but harmful material. Cases like the ‘Pizzagate’ affair, in which a gunman in America attacked a family pizzeria after imbibing legal but harmful material online, underline why companies must have plans in place to address such content.

Fifth, there should be statutory codes of practice which guide start-ups and others on how to address harms that others have already uncovered and dealt with.

Finally, and perhaps most controversially, I believe introducing a form of publisher’s liability should be on the Government’s mind. Defining content service providers as separate from Internet Service Providers may help.
This is going to be the battleground over the coming months. Many others, not least the companies themselves, will push back, but for over a decade they have facilitated the broadcasting of hate and, in doing so, have helped to toxify our public life and discourse. The question is: what now? The answer is action.