EU Lawmakers Pressured To Be More Transparent About Child Tech Policymaking

EU officials are currently in the line of fire for not being transparent enough about controversial tech policies designed to safeguard children.

The reports put added pressure on lawmakers to give regulatory bodies greater access to details about the surveillance technology and digital message scanning used to flag child sexual abuse material, otherwise known as CSAM.

The news comes after the ombudsman revealed details of an incident that took place in December. It concerned poor governance by the EU executive, which chose not to disclose full details of its communications with a company known for designing child safety technology.

While the EC did release some documents on the subject, it withheld access to others, raising eyebrows over the limited transparency.

The recommendation comes after a journalist raised concerns with the ombudsman, having been denied access to documents sent over by Thorn. The latter is an American firm that sells AI technology for the detection and removal of CSAM.

This is why the commission received an urgent request to reconsider its decision on limited viewership and make the documents publicly accessible, given how concerning the issue was.

Meanwhile, the commission had rolled out its own proposal on this front, arguing that legal frameworks were needed to obligate all digital companies to use automated technology to detect explicit child content, both new and old. Similarly, the technology would flag and report illegal acts such as the online grooming of children.

But there has been little progress on this front since May 2022. The ombudsman is therefore marking it as a serious area of concern, one that must be handled with greater transparency so the public can follow the EU's lawmaking endeavors.

Only then can the public participate effectively in decision-making processes that would affect each user's day-to-day life by limiting privacy. Transparency, in turn, would enable the public to better gauge what led to the legislative proposals in question. After all, critics feel such laws cannot and should not be drafted behind closed doors.

Meanwhile, many critics feel that something may be holding lawmakers back, and that hindrance might have to do with lobbyists marketing child safety tech, who stand to benefit from rules mandating automated CSAM checks.

Last year, an assessment was also rolled out by the DPS in Europe, detailing a long list of ways the proposal could be ineffective at fighting such child abuse issues while posing a serious risk to fundamental freedoms in today's democratic society.

After such pressure, and following the Ombudsman's recommendations, the Commission is expected to grant access to many of the flagged documents. For now, a detailed response on the matter is expected as early as March of this year.

Photo: Digital Information World - AIgen
