Online Safety Bill: peers demand changes to social media so algorithms cannot promote ‘harmful’ content
Peers have demanded changes to social media algorithms so that “harmful” content such as “Andrew Tate videos” cannot be promoted to young users.
The House of Lords voted by 240 to 168 in favour of introducing a series of changes to the government’s Online Safety Bill - legislation that is set to make sweeping reforms in the hope of keeping children safer while they use the internet.
Crossbench peer Baroness Kidron, who pushed for the amendments, argued that the upcoming legislation should prevent social media companies from pushing children and young people towards “harmful” content via algorithms - the mechanisms websites use to decide which videos, images, and other content their users are most likely to interact with.
Lady Kidron, who is also founder of the 5Rights Foundation, gave the example of controversial social media influencer Andrew Tate - claiming that teenage boys had been directed towards his videos because of a “content-neutral friend recommendation” mechanism. She said that these algorithms were encouraging youngsters to view Tate’s content “simply on the basis that other 13-year-old boys are like each other and one of them has already been on that site”.
Tate was recently charged with rape, human trafficking, and forming a crime gang to sexually exploit women. He has vehemently denied all allegations.
Over the past few years, Tate has become notorious for the views he has shared in his online videos. These include his belief that women belong in the home, that rape victims must “bear some responsibility” for their attacks, and that 18-year-olds are more attractive than women over 25 because they have “been through less dick.”
Continuing her speech, Lady Kidron said: “To push hundreds of thousands of children towards Andrew Tate for no other reason than you benefit commercially from the network effect is a travesty for children and it undermines parents.”
The peer also said she “could not accept” the government’s argument that “all harm comes from content”. Instead, she claimed that harm can also come from the way in which companies are set up, remarking: “In a world of AI, immersive tech, and augmented reality, is it not dangerous and indeed foolish to exclude harm that might come from another source other than content?”
Lady Kidron’s proposals were backed by Conservative Party peer Baroness Harding, who gave another example of how “harm” can derive from things other than content.
She told the House of Lords how she had used “brilliant” technology to keep track of her teenage daughter during a school trip to the USA, but added that each time she uses the tool, “a shiver runs down the back of my spine thinking how easy it would be for a predator to do the same thing” - a risk, she warned, that will remain if ministers do not recognise “that non-content harm is a real and present danger.”
Lady Harding, who led the NHS Test and Trace programme during the pandemic, went on: “We have all got ourselves tangled up in the structure of it and if it is not on the face of the Online Safety Bill that non-content harms are indeed real harms, the risk of it not being clear in the future is very, very great indeed.”
The government’s culture minister Lord Parkinson, who had urged peers to vote down the amendments, claimed the changes could “weaken” the Online Safety Bill.
Commenting on the vote, he said: “The bill’s online safety objectives include that regulated services should be designed and operated so as to protect people in the United Kingdom who are users of the service from harm, including with regard to algorithms used by the service, functionalities of the service, and other features relating to the operation of the service.”