As online sexual exploitation of children has increased throughout the pandemic, the EU must navigate a delicate path between privacy and child protection in an increasingly sophisticated digital environment.
London (Brussels Morning) From the US and UK to East Asia and Australasia, authorities have detected a rise in online sexual exploitation of children and young people. In navigating the narrow line between privacy and protection, however, all eyes have been on Europe.
After a reported surge in online sexual exploitation of children, which many have attributed to the pandemic, an e-Privacy directive that came into force on 21 December has received increased scrutiny. A temporary derogation, to be decided next year, could allow big tech companies to continue monitoring for abusive material in an environment where offenders' modus operandi has evolved.
On 7 December, MEPs agreed to explore continuing the voluntary detection of child sexual abuse online that tech platforms perform, at least temporarily, until 2022.
Doing so means web-based messaging apps, including those on Facebook and Instagram, Voice-over-Internet-Protocol (VoIP) services, and chat and web-based email programmes, could continue using special tools that detect abusive activity and remove it.
The temporary derogation was proposed before the European Electronic Communications Code was due to come into force on 21 December 2020 in all member states. The Code extends the e-Privacy directive to the ‘over the top’ interpersonal communication services described above, which would have stopped tech platforms from being able to perform this type of detection.
Decisions have been stalled until the New Year, however, while law enforcement agencies, including in the US and UK, are concerned the rebooted privacy rules could hinder investigations and cause the number of reports to drop.
Incidents of online child sexual exploitation, which include grooming, live streaming, viewing child sexual abuse material, and coercing and blackmailing children into sexual activity, are growing worldwide, according to children’s rights organisation ECPAT.
Between January and September this year, the US National Center for Missing & Exploited Children, a global clearinghouse for online child sexual abuse reports, received more than 52 million reports and around 2.3 million involved actors or victims in the EU.
Experts have widely said that the pandemic has increased incidents of online abuse. In September, the Internet Watch Foundation (IWF), the UK charity responsible for finding and removing images and videos of children suffering sexual abuse from the internet, received a record number of public reports of suspected child sexual abuse material to its hotline.
IWF says its analysts processed 15,258 reports from members of the public in September alone, a 45% uptick on the same month the previous year.
Darknet and mainstream tools
Dalia Leinarte, a member and former chair of the UN Committee on the Elimination of Discrimination Against Women (CEDAW) and professor at Vytautas Magnus University in Lithuania, notes, however, that this type of sexual abuse has always been online and of concern, buoyed by the use of the darknet, where it is “much easier for traffickers and clients to meet” and clients can make specific and illicit requests, including watching children with children. Consuming material is also easier to get away with, as everything disappears after it is live-streamed.
Live streaming of abuse has been an emerging phenomenon in the last few years, especially in certain countries, with the Philippines a notable example.
“During the pandemic, there was an anticipation that as the travelling sex offender — people who would typically travel to countries where child exploitation and commercial exploitation is taking place on the ground — could no longer travel, that exploitation is being pushed online. There is more online live streaming of child abuse and seeking out of contact with children online”, says Amy Crocker, a programme adviser on online child sexual exploitation at ECPAT.
Offenders and people who want to hide their tracks take advantage of various tools, ranging from encryption and anonymisation to virtual private networks and the darknet. With the trend towards end-to-end encryption in regular apps like WhatsApp (and, as planned for the future, Facebook Messenger), offenders don’t require specialised tools, says Crocker, because mainstream tools allow downloading and sharing of child sexual abuse material in encrypted environments.
Mobile phones give children access to a landscape of apps for going online, whether that’s Instagram, Facebook, or TikTok. In some regions, such as parts of Africa or Asia, children may be going online and being contacted through internet cafes, explains Crocker.
The IWF now warns that self-generated material is on the rise, constituting 44% of all the child sexual abuse content reported to the organisation in the first six months of 2020, 15% more than in 2019.
Self-generated content is recorded by the child themselves, often in their own room, using a webcam, and then shared online, although the child may have been groomed or blackmailed into doing so.
Susie Hargreaves OBE, CEO of IWF, said parents need to have frank discussions with their children about the internet and the potential dangers of being online.
“If a child is unsupervised, and has a device with a camera and the internet, there is a possibility that, very quickly, they could be groomed and coerced by these online predators and criminals”, she said.
“You may think your child is safe in their bedroom, but even there, they may have been approached by a predator. From there, they can be blackmailed, coerced, or bullied into making videos of themselves for these criminals”.
Enforcement online
A lack of consistent data means a comprehensive global picture of where abuse takes place or where people access this material does not exist. Typically, there is better data relating to Europe and North America than other regions.
“Essentially, wherever kids are going online, we’re seeing the problem. The behaviour of seeking out child sexual abuse material is being seen all over the world, but it is dependent on the local environment.
“The Netherlands is hosting one of the largest amounts of child sexual abuse material, a function of two things. One being they have a huge hosting infrastructure, and two being the challenges with enforcing ISP takedown of content. That’s now changing”.
According to IWF’s 2019 annual report, the Netherlands hosted 71% of the child sexual abuse content found by the charity — 93,962 URLs — increasing from 47% the year before.
Leinarte says the EU and the European Parliament have tried to put the onus on big tech to take more responsibility for how its platforms are being used, but this has not worked entirely.
Many claim the e-Privacy directive will prevent algorithms and programmes like PhotoDNA from being deployed to automatically scan for abusive material. Facebook has already announced it has stopped scanning for child sexual abuse on its messaging platforms in the EU and UK because of the law.
The deciding factor in the current debate, and in how jurisdictions move forward on this issue, is the balance between privacy and protection.
“Our position and the position of organisations we’re working with is that this is a rights-based issue”, says Crocker. “Everyone including children has the right to privacy, absolutely. But children also have the right to protection from exploitation and from violence. We can’t polarise this debate by saying it has to be one or the other, we have to look for solutions that allow for both.
“The EU can and should be leading the way on this”.
MEPs have discussed what an allowance for tech companies would look like, so they can continue using scanning, hashing and AI to analyse text and traffic data and detect child sexual abuse material and grooming. European lawmakers have suggested, for instance, that all data be erased immediately, and that only in confirmed cases should data be stored for use by law enforcement, for up to three months.
They have also suggested the technology should not understand the substance of the content but simply detect patterns. Confidential and professional interactions would not, however, be subject to interference.
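The hash-matching approach lawmakers describe can be illustrated with a minimal sketch. This is purely illustrative: the hash list, function name and retention logic below are hypothetical stand-ins, using simple cryptographic hashes rather than the robust perceptual hashing (such as PhotoDNA) that real platforms deploy.

```python
import hashlib

# Hypothetical blocklist of hashes of known abusive images.
# Real systems use perceptual hashes that survive resizing and
# re-encoding; SHA-256 stands in here for simplicity.
KNOWN_HASHES = {
    # SHA-256 of the bytes b"test", used as a dummy entry
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def scan_attachment(data: bytes) -> dict:
    """Compare an attachment's hash against the known-hash list.

    Mirrors the safeguards MEPs suggested: the content itself is
    never inspected, only a pattern (its hash) is checked, and
    nothing is retained unless the hash matches a confirmed case.
    """
    digest = hashlib.sha256(data).hexdigest()
    if digest in KNOWN_HASHES:
        # Confirmed match: the digest could be retained (per the
        # proposal, for up to three months) and flagged for review.
        return {"match": True, "digest": digest}
    # No match: erase immediately; nothing about the content is kept.
    return {"match": False, "digest": None}
```

The point of the design is that unmatched communications leave no trace at all, while matches are identified without the system ever “reading” the message.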
“This legislation should not be interpreted as prohibiting or weakening end-to-end encryption, MEPs underline, and this derogation should not be extended to include audio communications”, read a statement by the Parliament after it agreed to open negotiations on the temporary derogation.
After the vote, rapporteur Birgit Sippel (S&D, DE) noted, “Child sexual abuse is a horrible crime and we have to get better at preventing it, prosecuting offenders and assisting survivors, both online and offline. Parliament, therefore, wants existing legal scanning practices to continue being used for online child sexual abuse material. However, the Commission has failed to provide basic information about additional technologies they wish to legalise, without knowing if they even exist in the EU”.
Leinarte says there is also a bigger picture to combating online sexual exploitation, which she considers a form of human trafficking, even though the ‘movement’ fundamental to definitions of trafficking now occurs digitally.
Many states are advocating for stemming human trafficking while not addressing the demand from clients. “It’s one of the biggest impediments” to combating the crime, she says.
Data and reporting are another issue, says Leinarte: international organisations like the Organization for Security and Co-operation in Europe and the UN Office on Drugs and Crime may produce robust reports on different states, but there is no mechanism for implementing their guidelines.
CEDAW, which recently issued a general recommendation to crack down on the trafficking of women and girls in the digital age, has a reporting mechanism in place but does not produce research or data.
Cooperation is ultimately key, says Leinarte: “no one state or organisation is so powerful and so advanced that they could organise or combat trafficking in human beings alone”.