It’s easy to talk tough on tech, as Michelle Donelan, the secretary of state for science, innovation and technology, has shown this week. In an interview with the Telegraph, Donelan warned that social media platforms could be on the hook for “humongous” fines if they allowed under-13s to remain on their platforms. “If that means deactivating the accounts of nine-year-olds or eight-year-olds, then they’re going to have to do that,” the cabinet minister said.
The approach sounds all well and good in theory. It’s red meat to the pearl-clutching, law-and-order Tories who believe the world is full of danger and that tech companies are to blame. Don’t get me wrong – there is plenty to lay at the feet of social media companies for the harm they have caused. But the tough talk is part of a wider tendency in our politics that ignores the reality of how we interact with the internet – and demands a degree of censorship that is not only unworkable but counterproductive.
Certainly, the internet can be a cruel place, and what happens online can have real-world ramifications. Disclosures made following the campaigning of the parents of Molly Russell, the teenager who took her life after being bombarded by online content related to suicide, self-harm and depression, have been chastening. And documents leaked by former Facebook employee Frances Haugen laid bare the access many users had to distressing content and misinformation.
Nevertheless, Donelan’s demands that tech companies cut off access to their tools for under-13s are the equivalent of the misguided practice of abstinence-only teaching in sex education classes at school: you can pretend kids aren’t doing it, but they will anyway. Ducking the conversation entirely will leave them ill-equipped to handle issues as and when they arise online.
Take it from me: I’m in my mid-30s, and grew up on the internet in its wildest days. Between the ages of 10 and 18 I saw things I shouldn’t have, and interacted online with people far beyond my age. Luckily, nothing bad happened – though for some it undoubtedly does – but it was a formative experience. I learned how to keep myself safe through trial and error, and conversations with my peers.
In hindsight, I wish I had been open enough to have similar conversations with members of my family – but I had the sense they would take a similar approach to Donelan and simply ban access to such platforms.
Just as we’re starting to recognise the issues caused by “helicopter parenting” – realising that letting children roam with less strict oversight helps them grow up to be independent, rather than dependent, adults – we need to allow them a little more slack for online exploration.
Some platforms already develop child-safe versions of their apps, or allow parental controls to be implemented on accounts, acting as a child monitor of sorts. While far from perfect, these are more practical solutions than simply saying children aren’t allowed to access some of the most attractive areas of the internet.
Perversely, Donelan’s draconian demand that tech companies bar younger users from their platforms will probably have the opposite effect to the one intended. If you make it punitively expensive for companies to acknowledge that underage users might be on their platform, they will do their level best to prevent children from accessing it. But kids will still slip the net – and easily so, given that most online age checks consist of little more than asking users to declare their date of birth. Any child able to take away 13 from 2023 will be able to add a few years to their age in an instant. Video age verification, in which AI tries to discern someone’s age from a video of their face, can be fooled by the judicious use of makeup.
And in this imagined future, tech platforms will pretend that children don’t inhabit their digital halls, and turn a blind eye to their existence. The online safety bill’s provisions are such that the companies affected will still largely be self-policing. The communications regulator Ofcom will be empowered to fine companies for not following the rules – but it’ll be up to the companies themselves to disclose where things go wrong. If the punishments are too significant, it’s easier for firms to pretend the problem doesn’t exist than to risk losing income.
Instead, it’s worth being open and honest. Yes, children will want to access social media platforms. Yes, they will do so whether you want them to or not. Yes, parents should accept that. But they should also have grown-up conversations with their children about being online, and about the possibility for fun – as well as the potential for danger – that comes with exploring the vastness of the internet.
Unfortunately, the tack Donelan is taking promises a bad outcome for everyone: the platforms will see no evil, children will speak no evil, and so parents will hear no evil. Instead, we need to understand that it’s impossible to put the genie back in the bottle, and take a more realistic approach. Let children experiment online. But make sure everyone responsible is watching.
-
In the UK, the youth suicide charity Papyrus can be contacted on 0800 068 4141 or by email at pat@papyrus-uk.org, and in the UK and Ireland Samaritans can be contacted on freephone 116 123, or by email at jo@samaritans.org or jo@samaritans.ie. In the US, the National Suicide Prevention Lifeline is at 800-273-8255. You can also text HOME to 741741 to connect with a crisis text line counsellor. In Australia, the crisis support service Lifeline is 13 11 14. Other international helplines can be found at befrienders.org