Does the Online Safety Bill really keep us safe?
It was a very big day for the UK’s digital world last Thursday, as the Government’s Online Safety Bill was finally published.
Green Paper, White Paper, interim and full responses, pre-legislative scrutiny: we are finally at the dawn of a new era in which technology companies will no longer make the rules for themselves.
I have followed the whole process, and so there was a degree of catharsis for me in finally seeing the text which will now form the parliamentary battleground for the effort to hold social media and other digital platforms accountable for the hatred spread online every day.
In general, it is of course welcome to see the Bill, which at over 200 pages is quite the read.
We know that Ofcom, familiar to those who have ever complained about a TV programme, will take on the mantle of digital regulator. It will produce codes, assess compliance, and will have (now improved and clearer) powers both to fine companies and to hold senior managers to account for failures. So far, so good.
The Bill has also sought to draw the Government’s line on the balance between freedom of expression and tackling harms.
It is a shame that, on this, they haven’t yet got it right. From my perspective, it may not go far enough in some areas, and in others it leaves loopholes that hardened racist provocateurs can potentially jump through.
To give specific examples, at present, certain platforms will be responsible for preserving ‘journalistic’ content or that of ‘democratic importance’.
Unfortunately, as both Hope Not Hate and I pointed out repeatedly to those scrutinising the Bill, Stephen Yaxley-Lennon (Tommy Robinson) defines himself as a journalist. The Bill might well enable his media company to have special privileges in relation to contesting platform decisions about its content.
So too, any racist candidate signing up to stand in an election, or a misogynist seeking to oppose the Women’s Equality Party, could argue their speeches are ‘content of democratic importance’.
This is a significant gap, and leaving the platforms to decide what the definitions are undermines the purpose of the Bill.
Of equal gravity is the failure to properly understand the dangers of harmful content both on high-risk, high-harm but small platforms and on search engines.
There are two categories of service the Bill will provide for. Category 1 platforms have additional duties and are defined by ‘size and functionality’. The definition of functionality in the legislation does not include risk; in fact, risk is entirely absent as a determinant of category. That is a serious problem for the Bill.
The joint committee which reviewed the draft Bill recommended that risk, not size or functionality, be the key factor in applying categories; Parliament’s Petitions Committee did too. The Government maintains that size is key.
This fundamental failure of understanding is hugely worrying. Read the Community Security Trust’s Hate Fuel report and you will learn that smaller, niche platforms contain some of the most violent, horrific and toxic antisemitic and other racist content, material which can inspire real-world harm yet, in some cases, still be legal.
No extra duties for these platforms? No responsibility to address the harms parliament will say Facebook or Twitter must address? No. If we are not careful, we simply let neo-Nazis, Holocaust deniers and racists enjoy their choice of niche platform.
This is a structural failure of the current drafting and one which I will be seeking to address, alongside many other stakeholders.
There is also a significant blind spot in relation to search engines. Facebook and Twitter will have duties to address legal but harmful content. Google? No. This despite its search bar having directed people to the phrase ‘Jews are…. evil’ for a time, and despite its algorithms drawing pictures of ovens into a main viewing carousel when people searched for Jewish bunk beds.
In a meeting with Ministers and officials after the draft Bill had finished the scrutiny process, I was told that concerns about search and small platforms had been heard and understood.
Disappointingly, it is now clear that they were not. The public will rightly be concerned, and I hope will ensure that MPs’ and Ministers’ postbags include complaints about this unsatisfactory arrangement in the Bill.
There will be fierce debates in parliament as the Bill makes its way through the House.
Some have raised concerns that it will be a Christmas Tree (Chanukah bush, if you will) Bill, in that people will hang their own amendments on it, so that it becomes unwieldy or unworkable.
However, my preference is that parliament gets its say, and that critical issues are not left to secondary legislation, in which the Government simply pronounces its will.
The Bill is, however, large, so it is important that the focus remains on the key elements.
Platform systems must be targeted. Illegal hate material must not be spread, and hate should not be promoted.
My focus will be on ensuring those systemic measures apply not only to the big platforms but to all the areas of our digital world to which hate can spread.
This is a once-in-a-generation opportunity to change the way we do business, and I do hope readers will play their part in calling on their MPs to make this legislation the world-leading law it could and must be. We have a fight ahead.