European Union member states have reached a common position on draft legislation aimed at strengthening online child protection, stopping short of requiring global technology companies to identify or remove child sexual abuse material. According to Reuters, the European Council announced the agreement on Wednesday.
The new Council text differs from a 2023 European Parliament proposal, which would have mandated that messaging services, app stores, and internet providers report and remove known and newly detected abusive content, including grooming materials. Under the Council’s draft, providers would instead be required to assess the risk of their services being used to disseminate such material and to implement preventive measures where necessary. Enforcement would be delegated to national authorities rather than handled at the EU level.
Companies would still be able to voluntarily check for abusive content beyond April next year, when current online privacy exemptions are due to expire. The legislation would also establish an EU Centre on Child Sexual Abuse, designed to support member states with compliance and provide assistance to victims.
The Council’s approach has been described as less prescriptive than earlier proposals, focusing on risk assessment rather than compulsory monitoring or scanning. Some critics have raised concerns that allowing companies to self-assess could have implications for privacy and encrypted communications.
The European Parliament has separately called for minimum ages for children accessing social media, but no binding legislation on this issue currently exists.
EU member states must now finalize details with the European Parliament before the regulation can become law.
Notes: This post was drafted with the assistance of AI tools and reviewed, edited, and published by humans. Image: DIW-AIgen
