A digital spy in every mobile phone, for your own good

By Dutch science journalist Arnout Jaspers

Shortly after the European Parliament elections, the European Council of Ministers was due to vote on a far-reaching proposal to combat the online distribution of child pornography. It would then still have had to be submitted to the European Parliament. However, the Council of Ministers decided not to put it to a vote at all, as the proposal is now unlikely to get a majority. Still, ‘Proposal 12611/23’ has not been definitively withdrawn.

If it is ever implemented, every European will from then on be walking around with a government digital spy in his or her mobile phone. Proponents of this kind of Big Brother surveillance will say that this is grossly exaggerated, because there are limitations: in its current form, that digital spy only checks that you are not sending child pornography photos or videos via WhatsApp, Telegram or other public communication channels. As long as you don’t, no one is looking into your phone, we are assured. But if you do, your own phone automatically snitches on you to the government.

WhatsApp, Telegram, and recently Facebook Messenger use end-to-end encryption. That is to say: the text and images you send are converted in your phone into secret writing that only your conversation partner can decipher, and vice versa. This is done by means of so-called public key encryption, an ingenious system that allows two parties to communicate securely without having to send each other a secret key first. Without public key encryption, online payments would also be impossible.
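The core trick of agreeing on a secret without ever sending one can be sketched in a few lines. The example below uses a toy Diffie-Hellman exchange, one classic form of public key cryptography; the prime, generator, and variable names are illustrative, and the parameters are far too small for real-world security.

```python
import secrets

p = 2**127 - 1   # a Mersenne prime; real deployments use much larger, vetted groups
g = 5            # public generator, known to everyone

# Each party picks a private exponent and publishes only g^x mod p.
a = secrets.randbelow(p - 2) + 1   # Alice's secret, never transmitted
b = secrets.randbelow(p - 2) + 1   # Bob's secret, never transmitted
A = pow(g, a, p)                   # Alice sends A over the open channel
B = pow(g, b, p)                   # Bob sends B over the open channel

# Both sides now compute the same shared key, although it was never sent:
# (g^b)^a = (g^a)^b mod p
shared_alice = pow(B, a, p)
shared_bob = pow(A, b, p)
assert shared_alice == shared_bob
```

An eavesdropper sees only g, p, A and B; recovering the shared key from those is believed to be computationally infeasible for large enough parameters, which is what lets two strangers communicate securely without first exchanging a key.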

Properly implemented end-to-end encryption is unbreakable. Neither the government nor the internet provider has the decryption keys, and without them, decryption is impossible. The official EU position is that such encryption is a great thing for safeguarding the privacy of its citizens. The proposal also explicitly states that there is no intention to undermine end-to-end encryption.

Terrorism as an excuse

That is the theory; in practice, governments behave as if they find such leak-proof encryption an abomination. Earlier, the American FBI caused widespread uproar by trying to force Apple to build a “backdoor” into its operating software that would allow investigators to bypass the password protection of Apple phones. Really just to be able to look into the phones of arrested terrorists, of course.

Proposal 12611/23 involves the EU issuing phone and internet providers with a “detection order” for child pornography and online grooming of children. The proposal does not say how those providers should comply, because the drafters do not want to get bogged down in the current state of the art.

Inevitably, however, this means that text and images must be analysed before they are sent with unbreakable end-to-end encryption. So it has to be done by software in the phone itself.

Automatic alerting

In 2021, Apple announced on its own initiative that it was going to install such spying software in the operating system of all its phones. Its system would pre-scan all photos and videos that an iPhone stores in iCloud for child pornography. Despite advanced privacy safeguards, a storm of protest arose among cybersecurity experts and privacy activists, whereupon Apple quietly shelved the plan.

Apple’s AI system created a kind of digital fingerprint (a so-called NeuralHash, a 96-bit number) in the phone from every photo or video backed up to iCloud. That fingerprint was compared with the many millions of fingerprints of already known child pornography in a central database managed by the government. After a minimum number of matches (five or ten, to keep false alarms as rare as possible), an alert would automatically go to the provider and to investigative agencies.
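The matching-with-a-threshold step described above can be sketched as follows. The real NeuralHash is a proprietary perceptual-hashing model, so an ordinary SHA-256 hash stands in for it here, and the database contents and threshold are purely illustrative.

```python
import hashlib

THRESHOLD = 5  # illustrative; the article mentions five or ten matches

def fingerprint(image_bytes: bytes) -> str:
    # Stand-in for Apple's NeuralHash: a real perceptual hash is designed to
    # survive resizing and re-encoding, which an exact cryptographic hash is not.
    return hashlib.sha256(image_bytes).hexdigest()

# Hypothetical central database of fingerprints of already known material.
known_database = {fingerprint(b"known-image-%d" % i) for i in range(1000)}

def scan_uploads(uploads: list[bytes]) -> bool:
    """Return True once the number of database matches reaches the threshold."""
    matches = sum(1 for img in uploads if fingerprint(img) in known_database)
    return matches >= THRESHOLD
```

Only once the match count crosses the threshold would an alert be raised; the point of the minimum was that a single accidental hash collision would not, by itself, flag an innocent user.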

Since no photos themselves are leaked or viewed by a human, Apple seemed to think this was only a small, acceptable invasion of their customers’ privacy, compared to the greater good of catching distributors of child pornography.

Proposal 12611/23 also provides for an ‘EU Centre on Child Sexual Abuse’ with a central database of already known child pornography. In addition, such a detection order is also supposed to cover the flagging of new child pornography. How this is to be done is not mentioned; current AI systems cannot reliably distinguish innocent holiday snapshots of naked children by a pool in the backyard from child pornography. Nor is it practical to have the many millions of photos uploaded daily in the EU all reviewed by humans, quite apart from the obvious privacy concerns. In effect, the proposal here anticipates hypothetical, better AI systems that can do this with sufficient reliability.

All or nothing

Influenced by all the criticism that had erupted earlier, the proposal had already been toned down slightly: under the fancy euphemism of ‘upload moderation’, phone users would be allowed to say ‘yes’ or ‘no’ to the automatic scanning of their images. But of course, that is an all-or-nothing choice: ‘no’ means they can no longer back up any photos or videos to the cloud or upload them to WhatsApp. That is not a choice; that is digital extortion.
Even if you think that child pornography is so heinous that it warrants hefty invasions of privacy, the question is what the fight against it would actually gain. After all, you can perfectly well encrypt images offline with public key encryption, making them unrecognisable, and only then upload them via your phone or computer.
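This evasion is trivial to demonstrate. In the sketch below, a toy XOR cipher stands in for any real offline encryption (the file name, key, and hash function are illustrative): once the bytes are scrambled, their fingerprint no longer has any relation to the database entry for the original image.

```python
import hashlib
from itertools import cycle

def fingerprint(data: bytes) -> str:
    # Stand-in for a scanner's hash-based fingerprint of a file.
    return hashlib.sha256(data).hexdigest()

def toy_encrypt(data: bytes, key: bytes) -> bytes:
    # Toy stand-in for real encryption: XOR with a repeating key.
    # (XOR is its own inverse, so the same call also decrypts.)
    return bytes(b ^ k for b, k in zip(data, cycle(key)))

original = b"some image bytes"               # hypothetical flagged material
ciphertext = toy_encrypt(original, b"secret")

# The on-phone scanner only ever sees the ciphertext, whose fingerprint
# bears no relation to the fingerprint of the original:
assert fingerprint(ciphertext) != fingerprint(original)
```

The recipient, holding the key, simply reverses the step after download; the scanning infrastructure never sees anything matchable.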

So it will not catch the shrewd producers and major distributors of child pornography; at most it will catch some stupid perverts who unsuspectingly share such material. And besides, people will inevitably be wrongly branded as distributors of child pornography by such AI systems. We now have ample experience of what becomes of government promises that people will never be marked as perpetrators by an AI system or algorithm alone.

New applications

Even more dangerous is mission creep, the step-by-step expansion of such a surveillance system. Once such software is standard in the operating system of phones, it is almost a law of nature that police and security services will start inventing new applications.
If a digital fingerprint is made available from every photo or video sent anyway, then without much extra effort you can also match it with a database of images of missing persons, or terrorism suspects, or objects flagged as related to drug trafficking. From there, it is only a small step to matching those digital fingerprints with a database of material flagged as disinformation about, say, the war in Ukraine, or vaccination. All for your own good and safety, of course.

Proposal 12611/23 is now on hold. But such Big Brother surveillance is so attractive to governments and security services that it is bound to reappear somewhere in modified form. And the technical complexity ensures that the mainstream media hardly cover such topics. They would deserve a place at the talk show tables.

Originally published in Dutch on Wynia’s Week

Disclaimer: www.BrusselsReport.eu will under no circumstance be held legally responsible or liable for the content of any article appearing on the website, as only the author of an article is legally responsible for that, also in accordance with the terms of use.