John Humphrys - Online safety: Who should make the rules?

May 10, 2024, 2:29 PM GMT+0

The big wide world is a dangerous place for children. Always has been. Always will be. The return of whooping cough is the big health story of the week. I had it myself when I was a baby. A very nasty dose apparently. Bizarrely, it’s one of the reasons I am known as John, rather than the name on my birth certificate, which is ‘Desmond’. My mother worried that I’d be known as ‘Dismal Desmond’ and ordered that I be called John instead. And it stuck.

But when I was in school in the early fifties the disease we feared the most was polio. Every kid in my poor area of Cardiff knew at least one youngster condemned to hobble around with nasty callipers attached to a leg deformed by the disease. And, far worse, every so often our parents would whisper about a child condemned to the ‘iron lung’ because polio had struck the chest. The ‘iron lung’ was a giant, airtight metal cylinder connected to a bellows which would suck the chest open, forcing air to rush in to fill the lungs. If the child was lucky they might be in the hideous contraption for a few weeks or months, but some had their breathing muscles permanently paralysed.

Then, in 1955, salvation arrived in the shape of a sugar cube coated with the newly discovered polio vaccine. I remember to this day standing in line with every other kid in the school waiting to receive our little sugar treat. Lifesaver might have been a better description. And I reflected on those dark post-war days this week when it was announced that action has been taken to save children from a threat that many believe is – in very different ways – every bit as dangerous as any faced by youngsters like me all those years ago.

That threat is, of course, social media and the danger it can pose to the mental health of some children. In far too many cases it has resulted in them committing suicide.

For several years now it has never been far from the news headlines, but the big story this week was the announcement that Britain has taken the lead in attempts to keep children safe online. New rules have been published by Ofcom, the communications industry watchdog, which may well have implications not just for social media giants operating in this country but for those around the world. Many governments and regulators are watching to see whether the new regime works. Others are deeply sceptical because of the threat it may pose to free speech.

Where do you stand?

The new rules are the result of the Online Safety Act passed last year. Social media apps including TikTok and Instagram will be told to “tame aggressive algorithms” pushing harmful content to children.

It was back in 2018 that the government first promised a tighter regulatory system but since then we have seen a steady stream of horrifying cases in which young people – mostly teenage girls – have committed suicide because of the malign influence of sites which feed on insecurities about their appearance or lack of friends or might even offer subtle encouragement to use harmful drugs.

The Times claimed that TikTok, the wildly popular Chinese video feed, has ‘leapfrogged American sites such as Instagram and Twitter/X by making even more effective use of algorithms designed to hook users. Its competitors have responded by tweaking their algorithms to emulate TikTok’s. Social media sites, as a result, have become more addictive and more dangerous.’

According to the Centre for Countering Digital Hate it was TikTok that pioneered the use of algorithms that learn from users’ behaviour to push content on to their home feeds and thus ‘create a highly compulsive experience. The platform sent teenagers suicide, self-harm and eating disorder content minutes after they had joined… As TikTok has achieved success, its rivals have copied the algorithmic feed.’

The Molly Rose Foundation was set up by Ian Russell, whose 14-year-old daughter Molly killed herself after watching online content that encouraged self-harm and suicide. Research by the foundation found that almost all the material recommended was harmful.

The Online Safety Act is widely considered to be one of the toughest regimes in the world for tackling harmful content. Ministers claim it will make Britain “the safest place in the world to be online”. US efforts in this area have stalled. Australia and Europe have brought in new laws, although they are not as comprehensive.

In the UK, messaging services such as WhatsApp and Snapchat will have to make sure children are not added to group chats without their consent. Under-18s will be given more control to block and mute other social media accounts and disable comments under their posts, which can become a source of abuse. The platforms will also be forced to invest in content moderation “to ensure swift action is taken against content harmful to children”.

If tech companies do not co-operate, the regulator will be able to fine them up to £18 million or 10 per cent of global revenue, block their services and start criminal proceedings against senior managers.

At the centre of the new provisions is a requirement for digital platforms to know whether their users are children or adults. The problem is that it is all too easy for children to lie about their age. Research suggests that a third of children as young as eight do so. Under the new regulations companies will have to use facial age verification or even require users to provide ID. Different content rules will apply to children and adults. Platforms will be required to ensure children see no pornography, or material promoting suicide, self-harm or eating disorders.

But many believe the rules don’t go nearly far enough. There will not, for instance, be an outright ban on material that can be seen to promote substance abuse, violence or pornography. Instead, companies such as TikTok will be required to protect children from such harmful material. And that, of course, assumes that children can be kept out of group chats. Critics ask: ‘How can you stop a ten-year-old joining in a chat with a big brother or sister?’

It has taken six years for the government and Ofcom to produce these new rules and Dame Melanie Dawes, Ofcom’s chief executive, denies that they are too timid. She says: ‘Our proposed codes firmly place the responsibility for keeping children safer on tech firms. They will need to tame aggressive algorithms that push harmful content to children in their personalised feeds and introduce age-checks so children get an experience that’s right for their age.’

The proposals have been mostly welcomed but there are reservations. Mr Russell said: ‘The regulator has proposed some important and welcome measures, but its overall set of proposals need to be more ambitious to prevent children encountering harmful content that cost Molly’s life.’ He is particularly concerned that ‘depressive’ content such as his daughter saw will not be treated severely enough.

Michelle Donelan, the technology secretary, said: ‘The regulator has been clear — platforms must introduce the kinds of age-checks young people experience in the real world and address algorithms that too readily mean they come across harmful material online.’

The question is: who should control that material?

‘Article 19’ is one of the many organisations that regard the new law as posing an unprecedented threat to the privacy and, indeed, the security of every citizen of this country. It wants a ‘world in which people can freely express themselves and actively engage in public life without fear of discrimination’. Its two ‘interlocking freedoms’ are, it proclaims, the freedom to speak and the freedom to know. Both, it claims, will be seriously endangered by the new regulations.

It also claims the bill gives far too much power to the government over the ‘supposedly independent’ regulator Ofcom. It claims that the Secretary of State’s powers to interfere with Ofcom’s regulatory oversight ‘completely undermines the idea that Ofcom can be independent in the performance of its duties’.

The fear expressed by companies such as TikTok and, indeed, WhatsApp is that the new law could break end-to-end encryption and open the door to routine surveillance of personal messages.

The Electronic Frontier Foundation, a digital rights group, warns that the bill ‘won’t just affect the U.K.—it will be a blueprint for repression around the world’. And tech companies, in an open letter opposing the measures, pointed out that the United Nations ‘has warned that the U.K. Government’s efforts to impose backdoor requirements constitute “a paradigm shift that raises a host of serious problems with potentially dire consequences”’.

So whose side are you on? Let’s accept that all decent people worry about the threat to children from uncensored material that might cause them to harm themselves or even commit suicide and ask the question: is this new law the right way to go about protecting them?

Let us know what you think.
