Social media bosses need to do a lot more to check the age of their sites' youngest users

NationalWorld reporter Rochelle Barrand finds it baffling that strict checks are carried out on elderly people who want social media accounts, but not on children who may not actually be old enough to have them

Every day there is a new social media trend, and sadly many of them seem to be harmful to users, or may even prove fatal. My heart is heavy every time I have to write about yet another challenge or trend which has claimed the life of at least one person, usually a teenager or young adult.

Names include 14-year-old Leon Brown, who lost his life last year after attempting the blackout challenge, and Jacob Stevens, age 13, who died after taking part in the Benadryl challenge. Then there’s 16-year-old Christy Sibali Dominique Gloire Gassaille who passed away after taking part in the scarf challenge. Esra Haynes, age 13, died after taking part in the chroming challenge. It was thought 12-year-old Archie Battersbee had also taken part in the blackout challenge, before a coroner ruled this out. 

It’s not just young people who lose their lives; a 35-year-old woman died in Portugal after taking part in the Blue Whale suicide game and a 34-year-old man called Wang Moufeng died after drinking an excessive amount of high percentage alcohol in another dangerous challenge.

Then there are the people, again often young people, who survive these online challenges but are left with life-altering consequences. Mason Dark, aged 16, suffered third-degree burns after taking part in the aerosol challenge. Other teens are also said to have almost lost their lives after participating in the knockout challenge and the tap out challenge. Teenagers aged 15 to 17 were also left needing treatment after taking part in the paracetamol challenge.

The list of dangerous challenges goes on and on, and I’m sure so does the list of victims. I’m certain there will be many names or young, innocent faces which are not known to the media or the wider public because their families have chosen to remain quiet in their grief. 

All of this has only reinforced my view that only adults aged 18 and above should be allowed on social media sites, a topic I wrote about earlier this year. But I accept that, right now, that's not the reality of the world we are living in. Right now, anyone aged 13 or over can sign up for a profile on the biggest social media platforms: TikTok, Facebook, X and Instagram. Or, at least, they can here in the UK. There are some slight variations around the world, but they are all open to teenagers aged 13 to 17. So that got me thinking about the responsibility that social media sites have to protect their youngest users.

NationalWorld reporter Rochelle Barrand thinks social media bosses need to do more to stop underage people from creating profiles on social media sites. Composite image by NationalWorld/Mark Hall.

'It’s a couple of clicks or scrolls of a mouse and they’re in'

The Online Safety Bill, which is set to become law and will see a crackdown on harmful content, will help, as it will make social media bosses accountable if rules are not followed - but it won't stop people who are simply too young to have an online presence from signing up in the first place.

These platforms all state that they only allow UK users aged 13 and above to sign up. They also claim that any account found to be operated by someone younger than this will be suspended, and that the person will then be prevented from creating another account. But does this happen in practice? Anecdotally, I think we all know it doesn't. Kids who are desperate to join simply have to add a few years to their actual birthdate. It's a couple of clicks or scrolls of a mouse and they're in. There aren't any proper checks. Children that young, of course, don't really have any official form of photo ID; they're too young for driving licences and they may not yet have a passport. At best, they may have a travel pass of some kind, but that doesn't count as official identification. So it is very hard for these platforms to perform any proper checks.

That in itself is another reason, to my mind, that these sites should be kept for adult use only. In that case, passports or driving licences could be checked - or, where those are not available, because it is wrong to assume that every adult has a passport or a driving licence, national insurance numbers could be checked instead.

But, I digress. Well, slightly. Four years ago, my grandma decided that she wanted an Instagram account. Yes, my grandma, who will turn 94 in October, is pretty tech savvy. She's been a Facebook user since 2011, when she asked my twin brother and me to make her an account so she could see any photos we posted as we started university. She only has family members as friends, but she knows the basics. Put it this way: I hope I'm as good with technology as she is when I'm the same age!

Anyway, for Christmas 2019 my mum, brother and I gave my grandma an iPad, as her last tablet had broken. While my brother and I were helping her set it up, she commented that she didn't see anything from my brother's Facebook account anymore. He told her that he posted mainly on Instagram - and she then requested an Instagram account of her own.

I followed the sign-up process using my grandma's details and was surprised, and bemused, to find that her request for an account was blocked. I admit I cannot remember the exact phrasing, but the message said something along the lines of needing to verify that my grandma truly did want an account. In essence, they did not believe that a then 90-year-old lady would actually want to be on Instagram.

They asked my grandma to submit a photograph of herself holding up a handwritten sign confirming her name, her date of birth and that she really did want a profile. We laughed about it and intended to do it later - only later never came during the busy Christmas period. Then my brother started posting on Facebook once more, so I guess she lost interest in having an Instagram account and we never submitted a photo. I do wonder what would have happened if we had done so, though. Would the account have been allowed, or would Instagram have put another roadblock in the way and continued to question my grandma's age?

'Perhaps we do need officially recognised photo ID for every child aged 13 and over'

What I find bizarre about this, however, is that there seemed to be very rigorous - and, if I'm honest, baffling - checks in place to verify the age of an elderly user, but there are no attempts at checks of any kind when it comes to young users. Now, I realise that asking a pre-teen or teenager to hold up a sign confirming their age and intention to have a social media account would be unhelpful. After all, in a lot of cases children can look older than their years, and it can be impossible to tell if a child is 12 or 13 on the basis of how they look alone. And just as they can, and do, lie when inputting their date of birth into the sites now, it would be all too easy for them to lie when writing their date of birth on a piece of paper for a photo too.

I'm aware that nothing would be foolproof. But at least there could be an attempt by these social media bosses to ensure only people who are actually old enough to be users on their platforms are on them, while underage people are kept out. Perhaps young people's accounts need to be verified by an adult, who would have to confirm the youngster's age and then also submit their own passport, driving licence or national insurance number to confirm their identity. Although I know it would still be possible for young people to submit such details without the knowledge of the adult in question, so maybe there needs to be another layer. Perhaps the child and the adult need to sit together and take a photo that can only be captured in the moment, during the application process. Perhaps the child's account needs to be linked to the adult's account, with the adult, obviously, notified of the link.

Or, perhaps we do need some form of officially recognised photo ID for every child aged 13 or over. I don’t claim to have the perfect answer to how we solve this issue, but something has to be better than nothing.

'Bosses need to part with some of their billions and put proper checks in place'

TikTok is supposedly valued at $66 billion (more than £58 billion), Facebook is valued at $770 billion (more than £630 billion), Instagram is valued at around $34 billion (more than £27 billion) and X is now valued at around $40 billion (£32 billion). These figures are estimates, of course, but one thing is clear - these companies can spare a bit of cash to invest in systems, or employ people, to perform proper checks on the age of their users.

Mark Zuckerberg founded Facebook and is now a billionaire and the CEO of its parent company, Meta, whose other products include WhatsApp, Instagram and Threads. It's also been nearly a year since billionaire Elon Musk bought X, which was then called Twitter. TikTok was founded by Chinese technology firm ByteDance Ltd, but today roughly 60% of the company is beneficially owned by global institutional investors. Some of the details around TikTok's ownership remain unclear - but I'm certain that those in charge of all of these sites could afford to part with some of their own money too, to ensure the platforms they own have proper checks in place.

It's not just the potentially fatal challenges we have to be aware of on social media. There seems to be a growing trend for these platforms being used as a place to encourage crime. In August, plans for anti-social behaviour on London's Oxford Street were organised via TikTok. Several people were arrested after the incident, and the actions of those who took part were condemned by Prime Minister Rishi Sunak and by Donna Jones, the chairwoman of the Association of Police and Crime Commissioners, who said the event was “incredibly worrying”. Then, earlier this month, people began using TikTok to share tips on how to shoplift successfully, and even to post videos of their stolen goods, in a trend known as 'borrowing hauls'.

TikTok say they have over 40,000 safety professionals dedicated to keeping the platform safe, and that they review content flagged to them and remove anything found to violate their community guidelines. They also admit that they won't catch every instance of violative content, but say it is wrong to categorise such content as a TikTok-specific issue, as it can be found on other platforms. Perhaps some of these safety professionals could also spend their time verifying the age of young users, maybe using some of the methods I have suggested above. Or, if their time is already taken up with other safety tasks, then TikTok simply needs to hire another few thousand people specifically for this purpose. And other social media platforms need to follow suit quickly.
