The European Union must show leadership in tackling the spiralling problem of online child sexual exploitation

The 2021 theme for Safer Internet Day is “Together for a better internet,” and it calls for “all stakeholders to join together to make the internet a safer and better place for all.” This is a crucial message for both the European Union (EU) and private tech companies: right now, millions of children globally are at increased risk of online child sexual exploitation and abuse because of a lack of clarity about the new ePrivacy Directive introduced by the EU in December 2020.

The aim of the Directive is to protect private online communications from being monitored by internet firms. However, it has caused widespread ambiguity over voluntary detection by making it “illegal” in Europe for big tech companies such as Facebook and Microsoft to use the automatic detection tools commonly utilized to identify and remove images of child abuse, and to detect online grooming. 

These types of tools, such as Microsoft’s PhotoDNA and Google’s CSAI Match, use image hashing, classifiers, and anti-grooming applications, and the vast majority of the world’s child sexual abuse material is identified and reported in this way.
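In broad strokes, hash matching works by computing a compact fingerprint of each uploaded file and comparing it against a database of fingerprints of known abuse material, so that the images themselves never need to be redistributed. The sketch below is a deliberately simplified illustration of that idea: it uses a cryptographic hash and invented placeholder data, whereas production tools such as PhotoDNA rely on proprietary perceptual hashes that still match after resizing or re-encoding.

```python
import hashlib

# Hypothetical database of fingerprints of known material (placeholder bytes,
# purely illustrative). Real systems use perceptual hashes, not SHA-256, so
# that altered copies of an image still match.
KNOWN_HASHES = {
    hashlib.sha256(b"known-image-bytes").hexdigest(),
}

def should_flag(image_bytes: bytes) -> bool:
    """Return True if the file's fingerprint matches a known entry."""
    return hashlib.sha256(image_bytes).hexdigest() in KNOWN_HASHES

print(should_flag(b"known-image-bytes"))   # True: fingerprint is in the database
print(should_flag(b"unseen-image-bytes"))  # False: no match
```

Because only fingerprints are compared, such systems can operate without a human ever viewing lawful private content, which is central to the privacy debate the Directive has reopened.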

The harm arising from the Directive is two-fold. Perpetrators may now find it easier to groom and sexually exploit children without being detected. In parallel, it has become harder for law enforcement to investigate crimes and provide protection to victims and those who are vulnerable.

The negative impact is already being felt. Eurochild and Missing Children Europe reported an alarming 46% drop in reports of child sexual abuse content since the new ePrivacy Directive came into force in the EU on December 21, 2020.

The growing problem of online child sexual exploitation and abuse

Online child sexual exploitation and abuse is a borderless crime, with sex offenders in Europe able to use social media platforms to contact children around the world. Troublingly, Europe has become the epicenter for harmful content, with the Internet Watch Foundation’s 2019 Annual Report finding that 89% of known URLs containing child sexual abuse were hosted in European countries, up from 79% in 2018.

On Europol’s database, there are over 51 million unique images or videos containing child sexual abuse. Adding to this repository relies on tech companies using technologies that identify harmful content, as well as on screening for and blocking offenders who upload and share abuse material. Furthermore, analysis by Europol of emerging trends in internet-enabled criminality expressed concern about the growing sophistication and “organized criminalization” of child sexual abusers, and the burgeoning of for-profit production.

Worryingly, the impact of COVID-19 has fuelled a sharp increase in sexual exploitation and abuse online, facilitated by both children and adults spending an unprecedented amount of time logged on. Compounding the problem is an exponential growth in the volume of digital content being produced, making it even harder to locate illegal activity.

In September, INTERPOL stated that since the start of the pandemic there has been both an increase in the sharing of child exploitation material through peer-to-peer networks and a fall in reporting of child sexual abuse. And in a 2020 survey by tech company NetClean, 64% of around 500 police officers from 39 countries reported an increase in online child sexual abuse crimes.

Meanwhile, as a consequence of quarantine measures and homeschooling, referrals about abuse that would normally come from a variety of sources – such as teachers, health visitors, and school nurses – dropped in 2020.

EU regulations are failing to protect children

Over the past year, voices across Europe have raised concerns about this spiraling problem. In June, the EU’s Commissioner for Home Affairs, Ylva Johansson, spoke of how Europe has become “the global blackspot in hosting child sexual abuse images,” and the scale of the problem was highlighted in the European Commission’s child sexual abuse strategy, which noted that demand for child sexual abuse material has increased by as much as 25% in some member states.

To protect children and bring perpetrators to account, the EU has proposed to adopt measures that ensure stronger cross-border cooperation between law enforcement in different countries. The European Parliament and the Council of Europe have called for more concrete action, and in July 2020 announced the EU strategy for a more effective fight against child sexual abuse. This laid out minimum rules for governments on defining criminal offenses and sanctions for child sexual abuse, both online and in person, and introduced provisions to strengthen the prevention of crimes and the protection of victims.

Privacy rights have also been at the forefront of concerns amongst European leaders, and to protect the digital rights of online users the EU has been developing regulations to clarify the role and liability of technology companies and platforms.

Privacy activists and some European lawmakers have argued that automatic scanning of digital content, including for chat and messaging apps, is a major infringement of people’s fundamental right to privacy and violates privacy and data protection rights.

In 2020, the EU released a draft Digital Services Act – legislation that will be legally binding for member states – providing rules on dealing with disinformation, transparent advertising, and illegal content. In December, the EU passed the Electronic Communications Code Directive, which aims to harmonize EU legal frameworks for electronic communications and regulate the telecommunications sector throughout the EU. Objectives include providing an improved level of consumer protection and maintaining the security of networks and services.

The Code is complemented by other directives and regulations, including the e-Privacy Directive, which applies to email, internet phone calls, instant messaging apps, and personal messaging provided through social media, as well as to traditional telecom providers.

Both the Code and e-Privacy Directive expose the difficulties in juggling competing needs. Fundamental questions arise about how to balance safeguarding an open internet and the protection of digital rights – which incorporates basic human rights regarding privacy and freedom of expression – against the need to have limitations that protect online users, including children, from abuse.

The reaction from tech companies to the new regulations is mixed. Facebook handles a staggering volume of private messages via its own messenger service as well as from WhatsApp and Instagram, which it owns. Between July and September 2019 alone, Facebook removed 11.6 million images of child sexual abuse, including 754,000 from Instagram. As soon as the e-Privacy Directive was passed, the company switched off some of its child abuse detection tools in Europe, stating it had no choice as automatic scanning of private messages is now banned.

Other tech firms such as Microsoft, Google, LinkedIn, and Roblox have not made such changes, arguing that the most responsible approach is to keep the technology functioning while EU policymakers work to address the situation and develop a harmonized regulatory approach. However, despite their continued action, the impact of the Code could still be devastating, setting back the EU’s own efforts to address online sexual abuse and exploitation.

International cooperation is needed

The effects will be felt far beyond Europe, as the tools are used to identify child sexual abuse materials located in countries across the world. Tech companies also report to the US-based National Center for Missing and Exploited Children (NCMEC), and its CyberTipline received a colossal 69 million files in 2019.

Whilst the reliance on voluntary actions by tech companies is making a substantial impact in detecting and removing illegal and harmful content, tech companies cannot be left to make their own rules. Clear laws and policies are needed to place obligations on companies to moderate, detect, and remove illegal content. Laws that are both responsive and predictive in nature can help ensure that online service providers can be trusted with the data that they collect, process, and store.

The global ramifications of EU law demonstrate the need for a multi-jurisdictional approach. Governments and law enforcement agencies have to cooperate to bring offenders to account and there is a strong case for developing international laws and standards that provide uniformity in laws, legal clarity, and agreed penalties for the online exploitation and abuse of vulnerable people.

The urgency for the EU to take decisive action cannot be overemphasized. There is some hope that the situation will be resolved, as the EU has resumed discussions about solutions to enable tech companies to continue protecting children online.

Negotiations on the final text of the Temporary Derogation of the ePrivacy Directive – which would provide a legal basis for the use of online tools to detect child sexual exploitation and abuse – were scheduled to finish on 26 January, but the EU has yet to comment publicly about the outcome. 

There is a pressing need for the EU to put an end to the regulatory limbo and confusion and for national and regional efforts to progress at a quick pace. Ultimately, the goal is for all governments to formulate, agree, and enact international standards and implementation that provide protection and redress for children globally, and hold perpetrators to account wherever they are in the world.