On 11 May this year, the European Commission published a draft law “on the protection of children”. Behind this vague wording lies the plan to make internet platforms, and especially chat and messaging services, responsible for scanning the messages their users send for potentially child-endangering content. If such content is identified, the incident is to be reported immediately to the security authorities. [1, 2, 7]
However, the contents of messages or files can, of course, only be checked for potentially punishable statements, intentions or images if they are available in unencrypted form. [8]
Surveillance through your own smartphone
The measures planned in the draft law would mean that the entire digital communication of all EU citizens could be monitored automatically and without any prior suspicion. The criticism from civil rights organisations is devastating. The German NGO “Chaos Computer Club”, for example, sees this as an “excessive and misguided surveillance method” that is “ineffective on top of that”. Similar assessments have been published by the GFF, epicenter.works, EDRi and the EFF. [2, 3, 4, 5, 6, 7]
There is no doubt that the bill has the potential to change how we communicate with each other and how far we can trust the technical means we use to do so. It also stands in direct conflict with the general right to private communication and with the push towards widespread use of secure, comprehensive encryption. [2, 6, 9]
This is because implementing such content filters always presupposes the weakening, if not the complete removal, of secure encryption at crucial points of the communication chain. Contrary to the intention of the draft law, this may even endanger the very protection of children it is meant to achieve. [2, 6, 8]
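To make the mechanism concrete, the following minimal sketch (purely illustrative, with hypothetical function names and a toy cipher, not the code of any real messenger) shows where client-side scanning sits in the message flow: the content is analysed in plaintext on the device before encryption takes place, so for anything the filter flags, the confidentiality promised by end-to-end encryption no longer holds.

```python
# Purely illustrative sketch, not the code of any real messenger: with
# client-side scanning, the message must be inspected in plaintext on the
# device *before* it is end-to-end encrypted. All names below are
# hypothetical placeholders.

def looks_suspicious(plaintext: str) -> bool:
    # Stand-in for an error-prone content classifier (AI model, hash matching, ...).
    return "flagged-pattern" in plaintext

def report_to_authority(plaintext: str) -> None:
    # Stand-in for the mandatory report: the content leaves the private channel.
    print("Report filed for:", plaintext)

def e2e_encrypt(plaintext: str, key: bytes) -> bytes:
    # Stand-in for a real end-to-end encryption scheme (here just a toy XOR).
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(plaintext.encode()))

def send_message(plaintext: str, key: bytes) -> bytes:
    # The scan runs on the unencrypted content, so the confidentiality promise
    # of end-to-end encryption no longer holds for anything the filter flags.
    if looks_suspicious(plaintext):
        report_to_authority(plaintext)
    return e2e_encrypt(plaintext, key)

if __name__ == "__main__":
    ciphertext = send_message("harmless text that happens to match a flagged-pattern", b"secret-key")
    print("Ciphertext sent:", ciphertext.hex())
```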
Error rates and false allegations
This approach is strongly reminiscent of the “upload filters” for social media and video platforms (and public communication on the internet in general) introduced with the copyright reform that came into force in 2021. Although that law concerned copyright, one of the main points of criticism was the same: automated content filters do not, and cannot, work without errors. [11]
Today, a large share of everyday communication happens digitally. Billions of messages and files are sent in the EU every day. Monitoring all of this content is therefore, for the platforms, first of all a practical problem that can only be tackled with sophisticated technical means. But regardless of whether artificial intelligence or simpler methods are used for this purpose, none of them works flawlessly or without misclassifications. [2]
Since flagged content is to be reported immediately to the security authorities, even a rate of falsely recognised content far below one per mille can lead to a sheer flood of reports and hopelessly overload those authorities. Moreover, private communication would be forwarded to state agencies even though it contains no actual legal violation and therefore nothing of legitimate interest to them. [2]
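A rough back-of-the-envelope calculation with assumed figures illustrates the scale of the problem:

```python
# Back-of-envelope calculation with assumed, illustrative numbers: even a
# false positive rate far below one per mille adds up to an enormous number
# of wrongly reported messages at the scale of EU-wide communication.

messages_per_day = 5_000_000_000   # assumption: several billion messages sent in the EU per day
false_positive_rate = 0.0001       # assumption: 0.01 %, i.e. one error in 10,000 checks

false_reports_per_day = messages_per_day * false_positive_rate
print(f"Falsely flagged messages per day: {false_reports_per_day:,.0f}")
# Output: Falsely flagged messages per day: 500,000
```

Even under these optimistic assumptions, hundreds of thousands of entirely harmless private messages would land with the authorities every single day.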
In view of the freedom of the press and the protection of whistleblowers, this kind of state surveillance poses a serious threat to democracy. The so-called “chilling effect” can also set in and have a lasting impact on social trust and the way we interact with one another. [9]
Resistance from civil society
Civil rights organisations agree that the protection of children cannot be improved by mass surveillance of all digital communication. So far, however, these organisations are largely the only ones bringing this criticism, which also resonates within the EU Parliament, to the public. Both the effective protection of children and the preservation of secure and private communication within the European Union require broad public discourse. The two goals are not opposing poles; they can only be achieved together and in harmony with one another. [10]
Sources
- [1] Europäische Kommission (2022): Pressemitteilung – Kommission präsentiert Gesetzesvorschlag zum Schutz von Kindern. URL: https://ec.europa.eu/commission/presscorner/detail/en/ip_22_2976
- [2] Neumann, Linus (2022): EU-Kommission will alle Chatnachrichten durchleuchten. URL: https://www.ccc.de/de/updates/2022/eu-kommission-will-alle-chatnachrichten-durchleuchten
- [3] Reda, Felix (2022): Chat control: Filter technology endangers fundamental rights. URL: https://freiheitsrechte.org/ueber-die-gff/presse/pressemitteilungen-der-gesellschaft-fur-freiheitsrechte/pm-chat-control
- [4] Schmidt, Petra (2022): Chat Control: European Commission launches direct attack on privacy. URL: https://en.epicenter.works/content/chat-contol-european-commission-launches-direct-attack-on-privacy
- [5] EDRi (2022): Chat control - 10 principles to defend children in the digital age. URL: https://edri.org/our-work/chat-control-10-principles-to-defend-children-in-the-digital-age/
- [6] Mullin, Joe (2022): The EU Commission’s New Proposal Would Undermine Encryption And Scan Our Messages. URL: https://www.eff.org/deeplinks/2022/05/eu-commissions-new-proposal-would-undermine-encryption-and-scan-our-messages
- [7] Reuter, Markus (2022): Geleakter Prüfbericht geht mit Chatkontrolle hart ins Gericht. URL: https://netzpolitik.org/2022/eu-kommission-geleakter-pruefbericht-geht-mit-chatkontrolle-hart-ins-gericht/
- [8] Portnoy, Erica (2019): Why Adding Client-Side Scanning Breaks End-To-End Encryption. URL: https://www.eff.org/de/deeplinks/2019/11/why-adding-client-side-scanning-breaks-end-end-encryption
- [9] Reuter, Markus (2021): Angriff auf unsere private Kommunikation. URL: https://netzpolitik.org/2021/chatkontrolle-angriff-auf-unsere-private-kommunikation/
- [10] Reuter, Markus (2022): Europas digitale Bürgerrechtsorganisationen gegen neue Form der Massenüberwachung. URL: https://netzpolitik.org/2022/chatkontrolle-europas-digitale-buergerrechtsorganisationen-gegen-neue-form-der-massenueberwachung/
- [11] Jennissen, Tom (2021): Uploadfilter werden Gesetz. URL: https://netzpolitik.org/2021/urheberrechtsreform-uploadfilter-werden-gesetz/
Jan is co-founder of ViOffice. He is responsible for the technical implementation and maintenance of the software. His interests lie in particular in the areas of security, data protection and encryption.
Alongside his studies in economics and later in applied statistics, and his subsequent doctorate, he has years of experience in software development, open source and server administration.