Encryption is among the most important building blocks of the internet today. It makes it possible to transmit all kinds of information securely, no matter how you’re getting online. End-to-end encrypted messages, no matter which encrypted chat app you’re using, are particularly important, ensuring the privacy of individuals communicating with each other. Not everyone’s a fan of strong encryption, though: critics point out that bad actors can take advantage of the same privacy protections, discussing and planning crimes and exchanging information without any chance of being surveilled.

The European Commission, the European Union’s executive arm, and the US Congress see this issue as a pressing problem that needs to be solved. Both want to propose similar laws in their respective jurisdictions that would essentially require most organizations that provide communication services, like chat, email, or video calls, to scan correspondence on their platforms for illegal content, effectively undermining encryption altogether. While the EU’s and Congress’s latest drafts are a little less drastic, backdoors (or security holes) in software don’t care which country you live in, and bad actors can exploit any of them. The rules would have an effect all around the world, opening previously inaccessible troves of data to hackers and surveillance services alike.


We’re going over just what the US’s EARN IT and STOP CSAM proposals, as well as the EU’s Chat Control, involve, what problems there are with them, and whether there’s anything you’re able to do about them.

What do STOP CSAM, EARN IT, and Chat Control attempt to do?

The intention behind STOP CSAM, EARN IT, and Chat Control sounds noble enough — at least that’s what’s communicated to the public. The EU proposal is “laying down rules to prevent and combat child sexual abuse.” Its official 2022 proposal text, which is still largely relevant, claims that “at least one in five children falls victim to sexual violence during childhood,” with studies showing that many of these offenses take place online these days.

The EU makes clear that this is why providers that host communication services play an important role in the protection of children, as they’re in a position to spot and stop abuse, particularly when it comes to sharing child sexual abuse material (CSAM). STOP CSAM and EARN IT follow a similar goal, with the latter initially introduced in 2020 and more broadly focused on crime. Its encryption-abolishing ideas were met with so much backlash that it went back into the drawer until 2022.


To combat these activities, Chat Control, EARN IT, and STOP CSAM propose that big chunks of online communication be monitored via automated filters. On a technical level, these filters don’t analyze users’ images or text messages directly, but rather work on hashes generated from them. Only when one of these hashes matches that of known suspicious material are the next steps taken.
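As a rough illustration, here is a minimal Python sketch of this kind of hash matching. Note that production systems like Microsoft’s PhotoDNA use perceptual hashes that survive resizing and re-compression; the plain SHA-256 used here for simplicity only matches byte-identical files, and all names and values are hypothetical.

```python
import hashlib

# Hypothetical setup: an authority distributes hashes of known
# illegal files — never the files themselves.
flagged_file = b"example flagged payload"  # stand-in for a known bad file
KNOWN_BAD_HASHES = {hashlib.sha256(flagged_file).hexdigest()}

def scan(data: bytes) -> bool:
    """Return True if this content's hash matches a known-bad entry."""
    return hashlib.sha256(data).hexdigest() in KNOWN_BAD_HASHES
```

Only a match against the database is supposed to trigger further steps; everything else passes through unflagged, at least in theory.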

Another measure proposed by the EU with the aim of protecting children is the introduction of age verification on virtually all major platforms. According to lawyers talking to The Guardian, this could lead to mass surveillance and data collection, as platforms need to verify the age of their users somehow, whether that’s using biometric face data or confirmation via electronic ID methods. That would also undermine the right to anonymity, with services effectively only usable after you identify yourself.

Signal could be one of the services affected by the proposals

The latest revision of Chat Control no longer seems intent on fully breaking end-to-end encryption, which was a big issue with earlier proposals. Encryption during message transmission is not supposed to be weakened, but there is still the big problem of client-side scanning, which isn’t explicitly excluded from the latest draft. Rather than checking messages in transit, which isn’t possible with end-to-end encryption, messages would be scanned directly on the sender’s or recipient’s device, before encryption or after decryption. This still interferes with end-to-end encryption in practice, since it undermines the guarantee that only the intended recipient will ever be able to access the messages.
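To see why client-side scanning undermines that guarantee, consider this simplified sketch of a receiving client. The XOR “cipher” is just a toy stand-in for real end-to-end encryption, and all names here are hypothetical; the point is that the client decrypts the message on-device and can then report the plaintext to a third party.

```python
import hashlib

def xor_cipher(data: bytes, key: bytes) -> bytes:
    # Toy symmetric cipher: a stand-in for real E2EE, NOT secure.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

def receive_message(ciphertext: bytes, key: bytes,
                    known_hashes: set, report) -> bytes:
    plaintext = xor_cipher(ciphertext, key)  # decryption happens on-device
    # Client-side scan AFTER decryption: on a hash match, the content
    # leaves the device — exactly what E2EE was meant to prevent.
    if hashlib.sha256(plaintext).hexdigest() in known_hashes:
        report(plaintext)
    return plaintext
```

The message is still encrypted in transit, but the scanning hook sits past the point where encryption protects it, so the “end” in end-to-end no longer means only the recipient.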

Meanwhile, the STOP CSAM and EARN IT acts would make the use of end-to-end encryption much harder to justify for companies. Under the proposed legislation, companies could easily be punished for using E2EE, since all platforms would be expected to deploy scanning methods that match content against law enforcement databases. Companies that keep deploying E2EE could be dragged into court over “recklessly” enabling bad actors to use their platforms for crime, which in turn would mean that no company would dare touch encryption anymore.

There is already some regulation in place under the “Chat Control 1.0” name in the EU, which asks companies to voluntarily scan communications. According to EU parliamentarian and Pirate Party member Patrick Breyer, Google, Facebook and Instagram Messenger, Snapchat, iCloud Mail, and Microsoft Xbox are among the services complying with this optional ruling.

The Chat Control proposal stands in stark contrast to the usual ideas and laws adopted by the EU, which has a reputation as a pioneer when it comes to privacy and data protection. For example, the all-too-well-known GDPR law responsible for cookie banners also gives EU citizens the right to be forgotten online, meaning they can ask companies to delete any and all unnecessary data on them.

Why are STOP CSAM, EARN IT, and Chat Control problematic?

Today’s Chat Control 1.0 and similar rules in the US are already causing issues. As mentioned, one of the companies complying with the EU ruling is Google, which started performing automated sweeps of data uploaded to its servers long before Chat Control. At first it used Microsoft’s PhotoDNA (reporting straight to law enforcement), which recognizes and flags known child abuse pictures; it later augmented this with its own machine-learning tool that can recognize and flag images never seen before.

A high-profile case covered by the New York Times shines a light on the problems arising from this. In 2021, a Google user’s account was blocked, with years of email and photo history held hostage in it, on the grounds that Google spotted “harmful content” on the account. What happened was that the user and his wife took pictures of their son’s genitals, at a doctor’s request, to document concerning swelling, with the user’s hand visible in one of the images. Despite the situation clearing up, with an investigation closed due to this being a false positive, the person’s account remains blocked.

False positives like this have the potential to ruin the lives of those caught in the crosshairs, all while tying up law enforcement with chasing down rabbit holes rather than focusing on more effective means of catching actual criminals. According to police studies cited by Patrick Breyer, only about 20% of reports resulting from scanned or hashed data end up proving criminally relevant. Chat Control 2.0 only threatens to make this situation worse, further burying relevant reports under ones that go nowhere.

That’s not to say that these tools are outright bad — they can and do help catch real criminals. However, companies and the police need to get better at handling false positives. Once an investigation is closed, people should be able to easily get back into their accounts. That’s particularly important in the case of Google, which can store decades of data for you, but it would also be a bummer to lose years of chat history on an instant messaging platform. The right to privacy also needs to be protected, whether you’re in the US or the EU, and all these proposals stand diametrically opposed to that.

In the case of the EU’s idea to weaken end-to-end encryption, another problem arises: once backdoors exist, other actors can also exploit them. There are supposed to be police warrants in place before some of the measures are taken, but let’s face it: intelligence services and some bad actors could probably overrule or ignore some of these principles. The US proposal “solves” this issue gracefully by potentially all but outlawing E2EE.

If the laws come into effect, they may also have negative consequences for the very group of people they are meant to protect. A representative EU study among children aged 13 to 17 from March 2023 shows that about 80% of them would not be comfortable being politically active or exploring their sexuality online knowing that authorities would be able to monitor their communication.

Chat Control is only supposed to be deployed on for-profit services in its current iteration. This leaves out any projects that don’t display ads and aren’t making money from their products. In a worst-case scenario, criminals could switch to platforms like these, making it possible for them to evade the very structures that were created to catch them in the first place.

Given that the internet is international, Chat Control, EARN IT, and STOP CSAM would also affect people who use the same services in other countries. Once scanning practices are implemented in one of the biggest economic zones in the world, it’s not far-fetched to think that other countries would adopt similar rules, or that companies would simplify their setups to follow the same rules everywhere. Something similar happened when the EU’s GDPR went into effect, with its cookie banners showing up in browsers all around the world, not just in Europe.

Like in any other area of life, there is no perfect solution for catching criminals. There is always going to be a cat-and-mouse game involved, with criminals trying to evade detection — otherwise, we wouldn’t have any crime in the world. It’s clear that mass surveillance isn’t the solution, though. It ties up too many resources and could lead to more busywork than relying on tested methods, like placing undercover investigators in potentially criminal groups and targeting individual suspects.

In the case of STOP CSAM, it’s questionable what exactly the new law is supposed to change when it comes to child abuse. The EFF writes that “it is already a federal crime for anyone to promote or distribute CSAM,” so the question is why there is a need for extra legislation to go after platforms for “hosting or storing” and “promoting or facilitating” CSAM.

What can you do about STOP CSAM, EARN IT, and Chat Control?

If you live in the US or an EU member state, you can reach out to your representative in Congress or in the European Parliament, asking them to push back on the respective laws until they take a shape that is less disruptive to the privacy of everyone on the internet. After all, the basic idea behind both STOP CSAM and Chat Control — catch groomers, prevent child abuse, and stop child abuse material from spreading — has noble motives behind it. If you’re not sure how to get started, the EFF makes it easy to make your voice heard, helping you tell Congress that it shouldn’t outlaw encrypted messages.

While there isn’t a clear path for taking action if you’re not based in either region, you can still spread awareness about Chat Control and STOP CSAM and support local or international initiatives that aim to strengthen privacy and protect your rights online around the world.

In the end, the good news is that even if the European Commission attempts to press forward with the Chat Control proposal, it still might be stopped in the courts. According to EU lawyers cited by The Guardian, the EU’s current proposal might be unlawful and could be overturned because it conflicts with users’ right to privacy. Maybe the bigger question is whether STOP CSAM can be stopped, with privacy laws historically not as strong in the US as they are across the pond. In a very real sense, the future of online communication rests on what happens next.