The Online Safety Act, which aims to make the internet safer for children, became law just under a year ago in October 2023
Social media companies will face punishments for failing to keep children safe on their platforms, communications watchdog Ofcom has warned.
Services like Facebook, Instagram and WhatsApp could face fines from the regulator if they do not comply with the new Online Safety Act, which comes into force early next year, Ofcom chief executive Dame Melanie Dawes told the BBC.
Dame Melanie said it was the responsibility of the firms – not parents or children – to make sure people were safe online.
Companies will have three months from when the guidance is finalised to carry out risk assessments and make relevant changes to safeguard users.
Dame Melanie's comments came on the same day that Instagram added features to help stop sextortion.
Ofcom has been putting together codes of practice since the Online Safety Act became law.
The Act requires social media firms to protect children from harmful material such as self-harm content, pornography and violence.
However, the pace of change is not quick enough for some.
Ellen Roome said her son Jools was "a happy, normal child"
Ellen Roome's 14-year-old son Jools Sweeney died in unclear circumstances after he was found unconscious in his room in April 2022. She believes he may have taken part in an online challenge that went wrong.
Mrs Roome is now part of the Bereaved Parents for Online Safety group.
She told the Today programme: "I don’t think anything has changed. They [the technology companies] are all waiting to see what Ofcom are going to do to enforce it, and Ofcom don’t seem to be quick enough to enforce those new powers to stop social media harming children.
"From us as a group of parents, we are sitting there thinking ‘when are they going to start enforcing this?’ They don’t seem to be doing enough.
"Platforms are supposed to remove illegal content like promoting or facilitating suicide, self-harm, and child sexual abuse. But you can still easily find content online that children shouldn’t be seeing."
Dame Melanie said that technology companies needed to be "honest and transparent" about what their "services are actually exposing their users to".
"If we don't think they've done that job well enough, we can take enforcement action, simply against that failure."
Chief executive of Ofcom Dame Melanie Dawes said firms have a responsibility to keep people safe online
Ofcom has already been in close contact with social networking services, and Dame Melanie said that when the new legal safeguards became enforceable, the regulator would be "ready to go".
She added: "We know that some of them are preparing but we are expecting very significant changes."
Dame Melanie said changes could also include allowing people to take themselves out of group chats, without anyone else being able to see they had left.
The Online Safety Act aims to force tech firms to take more responsibility for the content on their platforms.
Ofcom has the power to fine companies which break the rules up to 10% of their global revenue. It can also block access to their businesses in the UK.
Dr Lucie Moore is the chief executive of Cease, the Centre to End All Sexual Exploitation. She welcomed Dame Melanie’s comments about putting the onus of keeping children safe on the tech companies.
However, she was disappointed by "the lack of clear definition in the plans that Ofcom has drawn up to regulate online harms", specifically on age verification methods regarding pornographic material.
Additional reporting by Graham Fraser