Online Safety Bill: children will be blocked from parts of the Metaverse by stricter age verification tools
The National Crime Agency has warned that the sexual exploitation of children in online spaces “is increasing in scale, severity and complexity”.
Concerns have been raised in the past that certain apps or sites within the virtual reality world allow under 18s to access unsuitable, inappropriate or dangerous spaces - such as virtual strip clubs, as previously reported by the BBC. But the government told NationalWorld that its upcoming legislation will prevent children from entering “harmful” spaces through the use of “age assurance technology, such as age verification.”
A spokesperson said: “The Online Safety Bill will capture all services where users can interact online, from websites to virtual reality apps on the Metaverse, and require them to use tough measures such as age verification to stop under 18s from accessing this content. If companies fail to tackle harmful material effectively, they will face huge fines and potentially criminal sanctions against their senior managers."
The Metaverse is a single, shared virtual reality space that allows users to interact with people all around the world in the form of avatars. It has been rolled out by Meta, the owner of Facebook and Instagram; however, other platforms and games, such as Fortnite, also refer to their spaces as the “metaverse”. Mark Zuckerberg has previously said it is the future of the internet, but it has not taken off in popularity in the way social media platforms have. The virtual reality space has also been criticised for allowing paedophiles to abuse it.
The National Crime Agency has warned that the sexual exploitation of children in online spaces “is increasing in scale, severity and complexity”, with the industry “detecting and reporting an increasing number of illegal images” each year. It said it believes there are currently 550,000 - 850,000 paedophiles in the UK who pose a danger to children, as it argued social media companies are “not doing enough” to reduce the threats to young people.
However, under the upcoming Online Safety Bill - which the government says aims to keep children safer while using the Internet - these companies will be forced to take action or else face fines or potentially up to two years in prison. Regulator Ofcom will monitor and examine the steps being taken, with the government saying it expects it to take “a robust approach to sites that pose the highest risk of harm to children” - once again mentioning the use of “a high confidence age assurance technology”.
But the legislation has proved highly controversial, facing backlash from critics who have expressed concern about over-censorship - and from social media companies who have argued it threatens privacy. WhatsApp head Will Cathcart for instance previously warned that he would not comply with the bill’s requirements to weaken end-to-end encryption, meaning the app could end up banned in the UK.
Mr Cathcart argued that altering WhatsApp’s encryption systems would undermine the security of users’ messages, leaving them vulnerable to threats such as hacking. He also raised concerns about people’s freedom online, remarking: “I don’t know that people want to live in a world where to communicate privately to someone it has to be illegal.”
Meanwhile, the NCA, which told NationalWorld it is yet to see the “commitment”, “action”, and “big shift” needed from the social media industry when it comes to tackling online threats to children, insisted it is “possible for privacy and child safety to co-exist”.
A spokesperson continued: “Yet, platforms continue to push forward with plans to implement privacy enhancing measures such as end-to-end encryption, which do not yet have safety features designed into them.”
The agency also said there are further threats within the Metaverse, as while the risks are largely the same, the safety measures here are “less secure”.
Outside of the debate surrounding encryption, one of the key aspects of the Online Safety Bill will be tougher rules and laws on age verification. Julie Dawson, chief policy and regulatory officer at Yoti, a platform which “makes it safer for people to prove who they are”, told NationalWorld that the market is ready to offer this service.
“There’s finally a healthy ecosystem of age verification providers,” she said. “So these tools can be used to protect children from accessing sites for over 18s, and they can actually improve the user experience on certain sites which may cater to a wide variety of demographics.”
How does online age verification work in practice?
There are a range of ways people can prove their age online, with traditional options including uploading physical ID, such as a driving licence or passport, or using a credit card. However, Ms Dawson said that a newer option Yoti offers has actually proved significantly more popular with users. This is the site’s ‘face estimation tool’.
“How it works,” Ms Dawson explained, “is by giving users the option to have their webcam take a snapshot of their face - and then the tool estimates their age. The image is then instantly deleted, and crucially, no one is recognised or identified. So it’s not facial recognition, but facial detection - for the purposes of age estimation and verification.”
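The flow Ms Dawson describes - take a snapshot, estimate an age band, discard the image, and return no identity - can be sketched roughly as below. This is an illustrative outline only: the function and field names are hypothetical, and `estimate_age_from_image` stands in for a vendor's estimation model, not any real Yoti API.

```python
# Hypothetical sketch of a facial age-estimation check, as described above.
# No real vendor API is used; the estimator is passed in as a stand-in function.

from dataclasses import dataclass

@dataclass
class AgeCheckResult:
    # Only an age-band decision is returned: no identity, no stored image.
    is_over_18: bool

def check_age(snapshot: bytes, estimate_age_from_image) -> AgeCheckResult:
    """Estimate a user's age from a webcam snapshot, then discard the image."""
    estimated_age = estimate_age_from_image(snapshot)  # facial detection, not recognition
    result = AgeCheckResult(is_over_18=estimated_age >= 18)
    del snapshot  # drop our reference to the image so nothing is retained
    return result

# Example with a stand-in estimator that always returns 25:
print(check_age(b"...", lambda img: 25))  # AgeCheckResult(is_over_18=True)
```

The key design point in the description is that the image never persists and no individual is identified: only a yes/no age signal leaves the check.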
The AI tool was created by feeding more than 500 million photos into its system, accompanied by the date of birth of the people in the images. It then learns what to look out for in order to detect someone’s age, with stats showing that 99% of 6 - 11-year-olds are correctly estimated as under 13, and 99% of 14 - 17-year-olds correctly estimated as under 21.
This can be implemented on websites and devices, e.g. headsets, and, according to Ms Dawson, is already being used by global companies such as Instagram, Facebook Dating, and OnlyFans.
Something else that is being looked at is the notion of creating ‘age spaces’ within particular apps or websites. Ms Dawson explained: “Some sites, such as OnlyFans, are categorical. They only want over 18s on their platform - so they just need a strict age verification tool for people to prove they’re adults.
“However, other places on the Internet, such as Minecraft or Roblox, have more mixed audiences. So they may want to keep both over and under 18s on their platform, but they may want to separate the places in which they exist.”
This, she said, could be done by allowing a younger user who is particularly skilled to play with older users on a gaming app, but then perhaps preventing them from engaging in chat functions with unknown adults. Alternatively, a website could create ‘spaces’ so that inappropriate ads are not being served to someone of a certain age.
It is thought this is how the Metaverse will have to limit children from accessing certain virtual reality spaces. “In these instances, the age estimation tool would direct an under 18 user into an age-appropriate space, and also prevent someone over 18 from accessing the 13 to 17-year-old section, for example,” Ms Dawson said.
“This is where age assurance can actually provide a better, more positive experience - one that is differentiated based on the user and tailored to their needs.”
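The ‘age spaces’ idea Ms Dawson outlines - routing under-18s into an age-appropriate section and keeping adults out of the 13-17 section - amounts to a simple mapping from age band to space. The sketch below is purely illustrative; the space names and routing rules are invented for the example and do not reflect any platform's actual design.

```python
# Hypothetical sketch of 'age space' routing. The bands and space names
# are illustrative assumptions, not any platform's real configuration.

def route_to_space(estimated_age: int) -> str:
    """Direct a user to the virtual space matching their estimated age band."""
    if estimated_age < 13:
        return "under-13 space"
    if estimated_age < 18:
        return "13-17 space"   # adults are never routed here
    return "over-18 space"

def may_enter(estimated_age: int, space: str) -> bool:
    """A user may only enter the space their own age band maps to."""
    return route_to_space(estimated_age) == space

# A 16-year-old is routed to the teen space; a 30-year-old cannot enter it.
print(route_to_space(16))              # 13-17 space
print(may_enter(30, "13-17 space"))    # False
```

The restriction works in both directions, matching the quote: children are kept out of adult spaces, and adults are kept out of children's spaces.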
Meta did not respond to repeated requests for comment. The Online Safety Bill on Wednesday (19 April) entered the Committee Stage, where it will be scrutinised by members of the House of Lords.