Q&A: Social media apps and minimum age requirements
Campaigners have labelled Meta’s decision to lower the minimum age limit to use WhatsApp in the EU and the UK from 16 to 13 as “tone deaf”, saying it puts profit before child safety.
The social media giant said the change in Europe would “ensure a consistent minimum age requirement for WhatsApp globally”, but it comes at a time when many are raising concerns about the negative impact smartphones can have on young people, with some calling for under-16s to be banned from social media or from having smartphones altogether.
Here is a closer look at the issue.
– What has changed on WhatsApp?
As part of a wider update to its terms of service and privacy policies, intended to bring the app in line with new EU online safety laws, the messaging platform said it was lowering its minimum age requirement in Europe from 16 to 13.
The Meta-owned service said this would make it “consistent” with the minimum age requirement it already had in place elsewhere around the world.
– What has been the response?
Campaign group Smartphone Free Childhood urged the platform to reverse the change, saying it was a cynical attempt to gain more users in Europe.
Daisy Greenwell, the group’s co-founder, said WhatsApp was putting “shareholder profits first” and called the decision “tone deaf” given the “increasingly loud alarm bells being rung by scientists, doctors, teachers, child safety experts, parents and mental health experts alike” around social media and closed messaging apps.
– What have experts and others said?
A number of recent studies have shown parents have concerns over social media and smartphone use.
A recent poll from charity Parentkind found that more than four in five (83%) parents said they felt smartphones were “harmful” to children and young people, and 58% of parents believe the Government should introduce a ban on smartphones for under-16s.
Several other studies have suggested young people are being exposed to a range of harmful content online, while teachers and young people themselves have reported incidents of cyberbullying.
In February, the Department for Education (DfE) published non-statutory guidance instructing headteachers on how to ban the use of phones in schools, not only during lessons but during break and lunch periods as well.
– Is WhatsApp an outlier by setting its minimum age to 13?
No. Most other social media and messaging platforms, including Facebook, Messenger, Instagram, Snapchat, TikTok, and X – formerly Twitter – list 13 as their minimum age requirement in their terms of service or community guidelines.
– So why has the WhatsApp change sparked such a reaction now?
The WhatsApp change has come at a time of heightened debate about age restrictions for young people using phones and social media.
Esther Ghey, the mother of murdered teenager Brianna Ghey, is campaigning for an age limit for smartphone usage and stricter controls on access to social media apps.
Daniel Kebede, the head of the National Education Union (NEU), the largest education union in the UK, has also called on the Government to hold an inquiry into “dangerous” content which young people are able to access on their smartphones.
And there have been reports that the Government is considering bringing forward proposals to ban the sale of mobile phones to under-16s.
– What about existing online safety laws?
The Online Safety Act passed into law last year, and online safety regulator Ofcom is already working on codes of practice to help it enforce the rules, but these will not begin to take effect until next year.
It requires social media companies to curb the spread of illegal content on their platforms and protect children from seeing potentially harmful material, with large fines among the potential penalties for those who breach the new rules.
But Ms Ghey has said she believes it does not go far enough.
She told BBC Radio 4’s Today programme in February that “the internet and social media is so vast, I think it will be so hard to police”.
“The second point is as well, that comments and free speech – and don’t get me wrong, I am all for free speech – but some of the comments I have seen on social media posts and some of the articles that I’ve done, they are just hateful comments,” she said.
“The Online Safety Bill is not going to protect children and young people from seeing that kind of horrible content because I don’t think that will be deemed as harmful.”
But Prime Minister Rishi Sunak has backed the legislation, telling the BBC that the Act would give the regulator powers to ensure social media companies are protecting children from harmful material.
He said: “They shouldn’t be seeing it, particularly things like self-harm, and if they don’t comply with the guidelines that the regulator puts down they will be in for very significant fines, because like any parent we want our kids to be growing up safely, out playing in fields or online.”