New rules aimed at keeping children away from harmful online material have taken effect in the United Kingdom. The measures apply to websites and apps that host pornography or content relating to violence, suicide, self-harm, or eating disorders. Companies operating these services are now required to verify users' ages through approved methods such as credit card checks or facial age estimation.
The law assigns enforcement to the UK's media regulator, Ofcom. Platforms that fail to comply face fines of up to £18 million or 10% of global revenue, whichever is higher, and senior managers who ignore official information requests may face legal consequences.
The requirement stems from the Online Safety Act 2023, which set out duties for digital platforms to reduce harm to both children and adults. After a preparation period, the enforcement phase has now begun. The regulator has confirmed that thousands of adult websites have introduced age checks, and social media platforms are being monitored for compliance with the same standards.
Recent findings from the regulator indicate that roughly half a million children aged eight to fourteen viewed online pornography within a single month. The figures have drawn concern from child protection groups and public officials, and the new checks are intended to reduce the likelihood of such exposure going forward.
While some gaps in enforcement remain, the introduction of mandatory checks is seen as a shift toward a more controlled online environment for minors. The aim is to create fewer pathways for children to reach dangerous or inappropriate content.
Additional measures are being considered. Officials have mentioned the possibility of setting time limits for how long children can spend on social apps each day. Any future changes will be introduced through separate decisions or legislative updates.
Digital platforms are now expected to meet technical and procedural requirements to show they are protecting young users. Oversight will continue as the regulator reviews how well the new rules are being followed.
Notes: This post was edited/created using GenAI tools. Image: DIW-Aigen.
