AI surveillance won’t save your kids: Chat control rearing its head again

More snake oil lubricating new attempt at mass surveillance in EU

Author: Casey Ford
12 mins read

The specter of mass surveillance continues to loom over Europe. Despite all the advances made by the General Data Protection Regulation (GDPR), getting around data encryption remains a legislative target.

Unfortunately, it is all too common for violations of our civil liberties to clothe themselves in promises of protecting us from monsters. The EU proposal for the “Regulation laying down rules to prevent and combat child sexual abuse” (CSA Regulation, or CSAR) is a stunning case in point. The regulation was once again up for a vote in the European Parliament this week. Thankfully, it was ultimately postponed yesterday due to the controversy surrounding what privacy advocates are calling possibly “the most criticised draft EU law of all time.”

Screenshot taken on 21 June 2024

CSAR would essentially arm web services with AI-powered surveillance technology that undermines the encryption protecting everyone’s personal files and communications. This is all in the name of combating child sexual abuse, but don’t be fooled by this wolf in sheep’s clothing. EU citizens are being given a false choice: private communications or sexual predators? Fortunately, many, including those in the Nym core team and community, are speaking out urgently against it.

The Nym team has been closely following the creeping advance of versions 1.0 and 2.0 of “Chat control” since its inception in 2022. The proposal is once again up for discussion and a possible vote, this time with deceptively revised language. The changes, however, merely soften the tone of its mass surveillance strategy. It is all a sleight of hand to try to bypass GDPR commitments to protect end-to-end data encryption. While encryption itself may be left untouched by CSAR, the proposal creates a more ominous backdoor around it, one that would enable even deeper device surveillance en masse.

Below is what is at stake in the CSAR proposal and why it is such a threat to the online privacy of everyone in the EU and globally.

What is chat control?

The CSAR legislation, which has come to be known by critics as “chat control,” is a regulatory attempt to locate and block the circulation of child sexual abuse material “prior to transmission.” While few should disagree with the objective, experts have been shouting from the rooftops that the proposed regulation will simply not be effective. Worse, it authorizes and enforces a surveillance backdoor into the devices of a whole continent of users.

Client-Side Scanning (CSS)

CSAR’s proposed solution to the problem of child sexual abuse material (otherwise called CSAM) is mass surveillance through Client-Side Scanning (CSS). Rather than compromising encrypted transmissions, it essentially scans data on devices or servers before encryption occurs. This would undermine the security of communications and the principles of end-to-end encryption for everyone.

Encryption is the cornerstone of secure communication. When you do something online through an encrypted connection — sending personal photos to friends or family, or using your credit card online — you can rest reasonably assured that outside parties cannot access or exploit your data. Original CSAR efforts hoped to compel web services to break the encryption protecting the contents of clients’ communications. With encryption broken, all messages could be scanned for targeted material. Fortunately this was met with a huge backlash from civil society and privacy advocates.

What is currently being proposed are CSS technologies which would legally permit and enable web services to scan or “monitor” the files and communications on your device before encryption is even applied. Imagine that Apple could regularly scan your Photos or iMessage apps for pre-programmed data points deemed to be a threat. Or that a private communications service is given the legal order to do so. In fact, Signal’s president has publicly stated that they will withdraw their services from EU jurisdictions rather than allow such a compromise of their users’ security.
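To make the mechanics concrete, here is a minimal, purely illustrative sketch of where such a scanning hook would sit in a hypothetical messaging client. Every name in it is invented for this example, and none of it comes from the CSAR text, Apple, or Signal; the point is only the ordering, in which the detection step reads your content in plaintext and end-to-end encryption is applied afterwards.

```python
# Illustrative sketch only: a hypothetical send path with a client-side scanning hook.
# The hook sees plaintext; encryption happens afterwards, so encryption itself is
# technically untouched while the content's privacy is not.
from cryptography.fernet import Fernet

def detection_hook(plaintext: bytes) -> bool:
    """Stand-in for the 'vetted technology' (hash matching or an AI classifier)."""
    return False  # placeholder: a real system would scan the content here

def report_for_review(plaintext: bytes) -> None:
    """Stand-in for sending a flagged item off the device for review."""
    print("flagged for review:", plaintext[:20])

def send_message(plaintext: bytes, key: bytes, transmit) -> None:
    # 1. The scan runs on the unencrypted content, on the user's own device.
    if detection_hook(plaintext):
        report_for_review(plaintext)
    # 2. Only afterwards is end-to-end encryption applied for transmission.
    ciphertext = Fernet(key).encrypt(plaintext)
    # 3. The network and the provider only ever see ciphertext.
    transmit(ciphertext)

key = Fernet.generate_key()
send_message(b"holiday photos attached", key, transmit=lambda c: None)
```

Nothing about the transmission changes in this picture, which is why proponents can claim encryption is preserved: the surveillance simply moves to the step before it.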

Screenshot taken on 21 June 2024

CSS technology “deeply flawed,” scientists say

Any surveillance scanning program needs to search for something particular. The big question is: can it reliably find its targets without sweeping in unrelated data? In the case of CSAR, the stated goal is to scan users’ devices for potential instances of CSAM. This means, for example, photos depicting children nude or in sexual situations, or messages describing such acts. But how would this even work exactly?

Essentially, a data analysis program, likely powered by AI, will be given access to everyone’s unencrypted content on a device or server. The system will flag masses of potential matches based on pre-programmed data points before sending all cases to a human team for verification. This team will then have access to your personal files and communications to determine whether you’ve done anything wrong. Suspicious content or activity will then be forwarded to law enforcement authorities for further investigation.
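As a rough mental model of that pipeline (and nothing more: every name, score, and threshold below is invented for illustration), the flow would look something like this:

```python
# Rough, hypothetical model of the described pipeline: automated flagging,
# human verification, then forwarding to law enforcement.
from dataclasses import dataclass

@dataclass
class FlaggedItem:
    user_id: str
    content: bytes
    score: float  # the model's confidence that the content is illegal

def automated_scan(user_id: str, content: bytes, classify) -> FlaggedItem | None:
    # The AI model reads the unencrypted content and assigns a suspicion score.
    score = classify(content)
    return FlaggedItem(user_id, content, score) if score > 0.8 else None

def human_review(item: FlaggedItem) -> bool:
    # At this point an analyst is looking at someone's private files or messages
    # to decide whether the automated flag was correct.
    return item.score > 0.95  # placeholder for a human judgment

def process(user_id: str, content: bytes, classify, forward_to_authorities) -> None:
    item = automated_scan(user_id, content, classify)
    if item is not None and human_review(item):
        forward_to_authorities(item)
```

Note that by the time the human-review step rules something a false positive, a stranger has already looked at the user’s private content.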

Researchers have been very vocal in warning of how error-prone this technology can be, both technically and in its consequences for civil liberties. Imagine a false positive, where a parent’s personal photos of their children are flagged by an algorithm and then viewed and reviewed by analysts before being passed on to law enforcement. Data scientists have warned that such “scanning is doomed to be ineffective,” and done on a large scale will create “side-effects that can be extremely harmful for everyone online, and which could make the Internet and the digital society less safe for everybody.”
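The scale problem is easy to see with some back-of-the-envelope arithmetic. The figures below are entirely hypothetical, since neither the proposal nor the researchers’ letters provide them, but they show how even a seemingly small error rate buries reviewers in innocent people’s private content:

```python
# Entirely hypothetical numbers, for illustration of the base-rate problem only.
messages_scanned_per_day = 5_000_000_000  # assumed EU-wide volume of scanned items
false_positive_rate      = 0.001          # assumed: 0.1% of innocent content wrongly flagged
actual_csam_rate         = 0.000001       # assumed: 1 in a million items is truly abusive

false_alarms = messages_scanned_per_day * false_positive_rate  # 5,000,000 per day
true_hits    = messages_scanned_per_day * actual_csam_rate     # 5,000 per day (assuming perfect recall)

print(f"Innocent items wrongly flagged for human review each day: {false_alarms:,.0f}")
print(f"Share of flagged items that are actually abusive: {true_hits / (true_hits + false_alarms):.2%}")
# With these assumptions, roughly 99.9% of what reviewers see is innocent private content.
```

Change the assumed rates and the absolute numbers move, but as long as genuinely abusive material is rare relative to the error rate, most of what gets flagged, reviewed, and potentially forwarded will belong to innocent people.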

Timeline

CSAR has been struggling through the EU Parliament since its introduction in 2022. It is back this week with a discussion strategically scheduled to slip under the radar in the wake of the recent parliamentary elections. Here is the key timeline of its advance, its changes, and its current status.

  • 11 May 2022: The original CSAR proposition is introduced.
  • June 2022: It was revealed that Hollywood actor Ashton Kutcher had been working with Home Affairs Commissioner Ylva Johansson to lobby for the then-proposed CSAR, to the benefit of Thorn, an organization he co-founded. Thorn sells the AI tools that would be used to scan people’s devices. (Kutcher later stepped down as chairman of Thorn after defending a convicted rapist.)
  • 13 April 2023: A report commissioned by the European Parliament on the impact of the CSAR proposal was presented to the EU Parliament’s Committee on Civil Liberties, Justice and Home Affairs.
  • July 2023: An open letter from 465 scientists and researchers is sent to the EU Commission. The signatories warn explicitly against the likely inefficiencies of the proposed technologies and their undermining of “a secure digital future for our society” and “democratic processes in Europe and beyond.”
  • 13 February 2024: The European Court of Human Rights rules in Podchasov v. Russia, emphasizing the role of end-to-end encryption in “securing and protecting the privacy of electronic communications” and ensuring “the enjoyment of other fundamental rights, such as freedom of expression.”
  • 2 May 2024: A new open letter from 312 scientists and researchers in 32 countries is sent to the EU. They reiterate that the changes in language to the CSAR proposal do not fix its “deeply faulty” and dangerous technology. The letter is co-authored by Bart Preneel, privacy advocate, professor, and Nym Tech advisor.
  • 20 June 2024: After postponement, a discussion and vote on the revised CSAR is scheduled (see item 28 under “Justice and Internal Affairs”). The vote is ultimately cancelled!

It’s important to note that the CSAR document is a legislative proposition and has no legal effect unless approved by the parliamentary body. Nonetheless, it is important for EU citizens to contact their representatives and protest against these efforts to violate civil liberties.

Legislative language games

How things are expressed in language can have significant effects on us: a luring turn of phrase can make us want something we probably shouldn’t, or turn us off something that’s really not that bad. Sometimes, phrasing can be so monotonous and abstract that we don’t even notice that what we’re digesting is a poison pill.

Enter the CSAR’s latest proposal for “technologies for upload moderation.” When reading through hundreds of pages of parliamentary regulations, phrases like this can easily go unnoticed, which is exactly what they are designed to do. So what does it mean exactly?

Screenshot taken on 21 June 2024

Let’s break this down.

[1] “Providers of interpersonal communications services” means any web service that handles your digital communications with other people. This is a broad scope. It clearly applies to messaging apps like WhatsApp, even privacy-focused and encrypted ones like Signal. But it also applies to larger companies like Apple that provide photo and messaging services, not to mention entire operating systems.

[2] The legal order to “install and operate” these CSS technologies entails that communications services not only will be able to do so, but that they must (or “shall”).

[3] This is the crux of the matter: detecting the object of surveillance “prior to transmission” means before encryption. Once encryption is in place, detection of any sought-after digital material would require breaking encryption, which is not on the table. So this technology amounts to mass device scanning, which only device providers or app and web service operators can perform on behalf of authorities. Since the goal of encrypting data is to protect it against third-party access, this technology is essentially a backdoor to encryption.

[4] While the goal of this scanning might well be the detection of CSAM, locating it means that the AI programs must first have access to all of our data and communications. And there is a big difference between “known child sexual abuse material” (for instance, a photo or video already identified by authorities) and “new child sexual abuse material.” With the latter, any AI program is prone to tremendous errors in distinguishing child pornography from private family photos with no sexual intent.
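The difference matters technically as well, as the sketch below illustrates: known material can be matched against a database of fingerprints of already-identified files, while “new” material can only be guessed at by a statistical model. All names, the database, and the threshold here are invented for illustration and do not describe any specific vendor’s tool.

```python
# Illustration only: two very different detection problems hidden behind one clause.
import hashlib

# Hypothetical database of fingerprints of already-identified illegal files.
KNOWN_FINGERPRINTS: set[str] = set()

def matches_known_material(file_bytes: bytes) -> bool:
    """Known material: a lookup against existing fingerprints (exact or perceptual hashes)."""
    return hashlib.sha256(file_bytes).hexdigest() in KNOWN_FINGERPRINTS

def looks_like_new_material(file_bytes: bytes, model) -> bool:
    """'New' material: a statistical guess by an AI model with no database to check against.
    This is where a family beach photo and abusive material can get confused."""
    return model.predict(file_bytes) > 0.8  # hypothetical model and threshold
```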

So “upload moderation,” which sounds as innocuous as “making sure people only post a reasonable amount of material online,” actually means “AI-enabled data surveillance on all users’ devices before encryption for law enforcement purposes.” Add “untested” and “scientifically improbable and error-prone” to this “technology,” and we’re closer to the truth.

Breaking encryption or taking it from behind?

A lot of the backlash against CSAR has revolved around whether it “breaks” end-to-end encryption for data transmissions. Data encryption is protected under the GDPR and was recently reinforced by the European Court of Human Rights decision in Podchasov v. Russia (2024). As Meredith Whittaker, the President of Signal, has noted, the new wording amounts to nothing more than a play on language or “rebranding”:

“[M]andating mass scanning of private communications fundamentally undermines encryption. Full stop. […] We can call it a backdoor, a front door, or ‘upload moderation.’ But whatever we call it, each one of these approaches creates a vulnerability that can be exploited by hackers and hostile nation states, removing the protection of unbreakable math and putting in its place a high-value vulnerability.”

Enabling the web services that handle our data to scan all of it on our devices or on their servers, before any encryption is applied for transmission, is not technically “breaking encryption.” In fact, it’s much worse: a backdoor through which automated AI algorithms serve as surveillance intermediaries connecting our personal data with authorities.

The CSAR proposition offers plenty of contradictory praise for the value of end-to-end encryption:

“[E]nd-to-end encryption,” the CSAR admits, “is a necessary means of protecting fundamental rights and the digital security of governments, industry and society.”

And yet what it is seeking to create would explicitly undermine the same data privacy that data encryption is meant to protect. The surveillance or “monitoring” technology, the proposition makes clear (26a),

“ensures that the detection mechanism can access the data in its unencrypted form for effective analysis and action, without compromising the protection provided by end-to-end encryption once the data is transmitted.”

How surveillance can “access the data in its unencrypted form […] without compromising” end-to-end encryption is a brazen piece of bullshittery.

Another big change in the CSAR proposal is that users must be asked to consent to these surveillance technologies being used:

“child sexual abuse material should remain detectable in all interpersonal communications services through the application of vetted technologies, when uploaded, under the condition that the users give their explicit consent under the provider’s terms and conditions.”

What does this mean exactly? Any user who does not give their consent to having their data scanned can still use the web service, as long as their use “does not involve the sending of visual content and URLs.” This could effectively render many communications services inoperable. So imagine that you want a private messaging app, but the EU forces it to scan your data beforehand, and you refuse for good reason. Since the app also allows the sending of photos and links, you effectively can’t use the service.

Conclusion

The vote on CSAR has been postponed, but the fight is by no means over. The bill will rear its ugly head once again, perhaps with further revisions of language meant to calm the waters while still pushing mass digital surveillance on Europe.

However, what is clear to many is that this sort of mass surveillance will not solve the problem, which is a social problem rather than a technical one. As Bart Preneel, one of Nym’s close advisors, has commented:

"[it's]unclear why some governments keep pushing for this mass surveillance approach rather than focusing on prevention of sexual abuse of children.”

Harm against children should be prevented through funding of social services and other preventative means. Ella Jakubowska and EDRi have been researching real alternatives to the problem of CSAM and child abuse, online and off. Abuse should not be used as a cynical PR stunt to break encryption and compromise the privacy and security of the entire European population in the process. And we should not willingly allow governments to violate our rights to privacy out of fear of monsters.

Hopefully this latest delay to the vote is a sign that regulators are starting to wake up to this fact.

Nym Dispatch

The Nym Dispatch series deep-dives into the online and social privacy ecosystem. Tune in to learn more about the risks and money-making tactics of the online surveillance economy, starting with the free VPN market. The series will soon cover the logging practices of free VPNs; their deliberately vague consent contracts; how, why, and to whom they sell our data; and their invasive advertising practices.
