Belgium, (Brussels Morning Newspaper) The main objective of the prospective regulation is to prevent and combat child sexual abuse on the Internet. I have been actively fighting for children; however, I am convinced that we must choose the right instruments. The current proposal contains articles that open up the possibility of ending privacy on the Internet in the name of protecting children. The method under discussion is not only inefficient but can also have far-reaching consequences, jeopardizing everybody’s privacy and even children’s safety.
Protecting children with the right tools
Since the pandemic, children and teenagers have been spending even more time online, which increases their exposure to danger. The position of the European Union and its legal framework must reflect this.
The proposal lacks sufficiently effective methods to prevent and combat child sexual abuse, and I am actively advocating for adding more. One idea I want to emphasize in particular is making links to resources for children who feel threatened by abuse reachable with a single click. In other words, I am proposing that very large online platforms regularly frequented by children display references to helplines and relevant NGOs. This would enable children to ask for help simply and in a matter of seconds.
Other powerful ways of tackling the problem are introducing Internet literacy courses into curricula at all levels, raising awareness of the issue, and improving the functioning of hotlines and helplines, including through funding and capacity building. We also need to ensure that effective intervention programs are available for people who fear they might offend, and to improve the training and coordination of professionals in close contact with children so that they can better help victims and effectively identify predators.
Unproductive and dangerous privacy violation
All of the methods I have just described can play a decisive role in creating a safe space for children. The idea of snooping on everybody, by contrast, helps them very little. Scanning and processing all users’ data with machine learning in the hope of catching sexual abusers is simply a terrible idea.
Firstly, the artificial intelligence used to screen the content is far from reliable. For example, the Swiss Federal Police admitted that 87% of the cases reported by machine learning are irrelevant. Yet once content is flagged as potential sexual abuse, the consequences are far-reaching: private data, such as a photo of a child on a beach, gets into the hands of public authorities and becomes a matter of investigation. Moreover, the sheer quantity of such cases overwhelms the capacity of the center to check the content, which means that real cases of sexual abuse will not receive the attention they urgently need.
Therefore, I want to highlight that people would lose their privacy in the name of protecting children without children actually being helped. Moreover, obliging providers to screen personal data would put many people at risk. For human rights defenders and marginalized groups, respect for anonymity on the Internet is absolutely crucial.
Furthermore, I believe that privacy is everybody’s basic right. Once it is lost, however noble the cause, it is lost forever.
Therefore, I call for abandoning the idea of using unreliable and, for this purpose, overly intrusive artificial intelligence, and instead ask for a regulation that will actually defend children from sexual predators.