The Nym Dispatch: The arrest of Durov reconsidered
Are privacy technologies under threat?
Last Sunday, Pavel Durov, CEO of the messaging app Telegram, was arrested by French authorities as his private plane landed outside of Paris.
The list of charges accuses Durov of “complicity” in a wide range of illegal activities: possession and distribution of child pornography and narcotics, organized fraud, money laundering, and “providing cryptology services” and “tools” without “prior” and “certified” “declaration.”
Following this year’s conviction of Tornado Cash’s cofounder, there are naturally a lot of big questions in the air. Are privacy technologies under new legal threats? Should tech developers and company executives be held responsible for what users do or say on their platforms?
And then there are other questions, increasingly less considered, behind these new police targets: what should be done to truly address the underlying causes of child exploitation, for example, or the growing epidemic of drug addiction, narcotic impurities, and overdoses? Is compromising the digital privacy of millions of people really the answer?
The arrest of Durov contains an immediate lesson that defenders of privacy and private technologies should pay attention to: data = vulnerability, no matter what the data is, for ordinary people as well as companies.
Contrary to what many people have been led to believe by the current media buzz around Durov’s arrest, Telegram is not a private messaging app. It is not end-to-end encrypted by default, which is the gold standard for secure communications today. Telegram is a lot of things, but private it is not.
Nonetheless, what the arrests of Durov and others are maliciously reinforcing is the idea that using cryptology online is about crime rather than security. It’s against this association that we need to collectively organize to ensure a private internet for the future.
Understanding the charges against Durov
To be clear, Durov is not being charged with distributing child pornography or narcotics, but rather with “complicity” in these activities carried out by Telegram users. As the CEO of a popular telecommunications app, what does this mean exactly?
Under French law, as in many countries, a party can be charged with complicity in a crime if it can be demonstrated that they had “knowledge” of the criminal activity but did nothing — or not enough — to stop, mitigate, or “moderate” it (a term which, as the Nym Dispatch has reported previously, is really a euphemism for backdoor surveillance).
At the core of this case is “knowledge,” or what data Telegram was meaningfully aware of. Keep in mind that a communication platform does not need, beyond operational purposes, to have access to any user data. This is ultimately a company decision.
The Telegram problem
No social media or communication platform is simple. In addition to being a messaging and chat app used by 900 million users, Telegram’s “relaxed moderation policies” regarding all content on the platform have been criticized. This inevitably includes not simply private chats between individuals, but also conversations on topics ranging from drug dealing to terrorism.
Yet Telegram’s appeal has never been about real privacy. Users turn to it for something else: a seemingly open sphere for communicating with others on any topic, perhaps under the illusion of anonymity and security, and all the messy social dynamics that come with it. Unfortunately, this happens on Telegram largely without the privacy standards that should be expected to ensure genuine free speech.
“Telegram works more as a messy social media platform via group chats than a messaging app,” Jaya Klara Brekke, Nym’s Chief Strategy Officer, notes. “This makes it ripe for exploitation by malicious actors, scammers, and bots. This is a problem across many platforms today. True privacy requires not just end-to-end encryption, but also network noise protection to enable actual free speech and not just a sewage pipe of provocative content designed to push up engagement numbers.”
The fate of Durov
It will ultimately be the job of prosecutors to demonstrate that Durov not only had knowledge of these illegal activities, but that the company failed to meaningfully intervene. The case hinges on a combination of known criminal activity, a laissez-faire attitude towards it, and a “refusal to communicate” with authorities in “carrying out and operating interceptions.” Together, these would amount to enabling criminal activity in a relatively new juridical sense.
It’s unclear whether the French prosecution will succeed in demonstrating this. If it does, it may well help cement a new precedent when it comes to policing privacy technologies and other Web3 platforms (more on this later). And the mystery remains whether Durov’s decision to land in Paris is part of a deeper geopolitical and legal game to which we’re not yet privy.
But there is one thing that needs to be made clear for the privacy community: Telegram is far from being a private communications app. Moreover, Durov’s current trials and tribulations are a direct consequence of failing or refusing to make Telegram a platform in which users have real privacy by default. If it were such a platform, Durov would likely not be in custody.
Privacy is not Telegram’s game
As a messaging and chat application, Telegram has its own strengths: an extremely user-friendly interface, and “channels” in which people can communicate “freely” across the globe and share live information. Maybe it helps people on the ground communicate information about a war constantly inundated with misinformation, or maybe it is a pro-war propaganda machine, or both at the same time. Tools are by nature ambivalent, ultimately shaped by how they are used.
But despite its long-standing claims to be a “secure messenger,” Telegram is not really a genuinely private messaging app. This is for a simple reason: it does not use end-to-end (e2e) encryption as a default.
If it’s not e2e encrypted, it’s not private
Most apps and web services use some form of encryption or another, but unless a connection between two clients is e2e encrypted, it is not genuinely private.
e2e encryption ensures that only you and your intended recipient hold the keys to decrypt the content of what you say, do, or share together. Anything less can allow third parties, such as the app operators themselves, to gain access to your communications or share them with others, including authorities requesting them for any number of political or legal motives.
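To make that guarantee concrete, here is a deliberately simplified sketch of the idea in Python. It uses a toy Diffie-Hellman exchange and a hash-based keystream — illustration only, not production cryptography, and not how any real messenger implements e2e encryption — to show why a relay server in the middle only ever handles ciphertext:

```python
import hashlib
import secrets

# Toy parameters: a Mersenne prime and a small generator. Real protocols use
# vetted groups or elliptic curves; this is illustration, never security.
P = (1 << 127) - 1
G = 3

def keystream_xor(key: bytes, data: bytes) -> bytes:
    """Derive a keystream from the shared key and XOR it over the data."""
    out = bytearray()
    counter = 0
    while len(out) < len(data):
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(b ^ k for b, k in zip(data, out))

# Each endpoint keeps a private exponent; only the public values A and B
# ever cross the network (and hence the relay server).
a = secrets.randbelow(P - 2) + 1
b = secrets.randbelow(P - 2) + 1
A = pow(G, a, P)
B = pow(G, b, P)

# Both endpoints derive the same key; the relay, seeing only A and B, cannot.
key_alice = hashlib.sha256(pow(B, a, P).to_bytes(16, "big")).digest()
key_bob = hashlib.sha256(pow(A, b, P).to_bytes(16, "big")).digest()

ciphertext = keystream_xor(key_alice, b"meet at noon")  # all the server relays
plaintext = keystream_xor(key_bob, ciphertext)          # only Bob recovers this
print(plaintext)  # b'meet at noon'
```

The point of the sketch is structural: because the shared key is derived independently at each endpoint and never transits the network, an operator (or anyone subpoenaing it) holds only ciphertext.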
It should not be forgotten that compliance with certain digital surveillance requests, in the case of authoritarian regimes or government overreach, can very well amount to an unjustified violation of individuals’ civil rights. It is for this reason that e2e encryption should be defended as the default requirement for all secure and private communications.
In its defense, Telegram does offer an e2e encryption mode, which it somewhat cryptically calls a “secret chat.” However, this is not the default for communications on the platform: it must be manually enabled for each chat, meaning that all other conversations rely on Telegram’s own home-grown encryption protocol, whose security is difficult to independently assess.
A key contrast to Telegram is Signal, which ensures e2e encryption for all communications on its platform. Even more commercial apps like WhatsApp (which uses the Signal protocol) and Facebook Messenger have followed suit with e2e encryption, helping to set a new standard in private digital communications.
But Telegram is not yet on this list, and we should be asking why this is before elevating Durov to the pedestal of a defender of private communications.
e2e is already the regulatory norm
Cryptography is not a crime. To the contrary, e2e encryption is now the bedrock of private online communications.
Using strong e2e encryption is fully in line with the General Data Protection Regulation (GDPR), which applies in France as in the rest of the EU. In Europe, the GDPR has rightfully defended the social importance of encryption as “the best way to protect data during transfer and one way to secure stored personal data.” The NIS 2 Directive further reinforces the need for e2e encryption:
“In order to safeguard the security of electronic communications networks and services, the use of encryption, and in particular end-to-end encryption, should be promoted and, where necessary, should be mandatory for providers.”
In addition to making sure that users’ personal conversations are secure from external interference and surveillance, e2e encryption guarantees that app developers have zero-knowledge of or access to the communication content of their users.
e2e encryption thus offers a necessary neutrality for app developers and operators. This is a win-win for privacy tech and users. As the Durov arrest demonstrates, knowing about and holding user data makes a company vulnerable to prosecution at the same time as it makes all users less private.
This is not about shielding billionaires from criminal prosecution. What this does is normalize our digital interactions so that when ordinary people are speaking to each other over a messaging app, they can do so knowing that hundreds of companies, agencies, and governments are not listening in on them.
Durov in the crosshairs?
Telegram has not only failed to follow the industry standard of e2e encryption: it has also exposed Durov himself, as the company’s head, to possible prosecution. This follows similar prosecutions in the Web3 space, with Tornado Cash and others. In pragmatic terms, any knowledge of the content of communications on a platform means possible complicity, whether or not it is ultimately legally justified.
e2e encryption for privacy technologies is an easy and fully regulation-compliant solution for protecting data. As is mixnet technology for protecting metadata. These technologies enable data minimization in practice. The more data that a company insists on collecting for commercial and political purposes, the more it makes itself vulnerable, and the worse it makes the privacy of its users.
Durov is now a case in point: freedom of speech defender maybe, but not a digital privacy champion. Let’s stop looking for martyrs and knights in shining armor and invest in zero-knowledge and data-minimizing tech that makes privacy a technological default, not a promise or marketing gimmick.
Metadata is the real target
Strong e2e encryption should be a given for all communications online, protecting vulnerable people from profiling and targeting. Telegram failed to protect their users, and indeed themselves as a digital service provider, on this basic point. But as Nym has highlighted elsewhere, even e2e encryption is not enough to secure people’s communications today.
Metadata continues to be the primary target of data surveillance. In the hands of systems of mass data collection and AI analytics, it can reveal far more about us than the content of any set of messages could. And unlike encrypted content, there is little to no legal protection or regulation of metadata.
Even private communication apps like Signal or WhatsApp are vulnerable to metadata tracking.
“The problem,” Harry Halpin, CEO of Nym Technologies, noted, “is that all these chat apps leak metadata, who is talking to whom, on the network level, which is the precise issue the Nym mixnet is trying to solve.”
In short, unprotected data and metadata = vulnerability, for ordinary app users and developers alike.
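To illustrate what unprotected metadata exposes in practice, here is a hypothetical sketch in Python — all names and log entries are invented — of what a relay operator can infer without decrypting a single byte:

```python
from collections import Counter
from datetime import datetime

# A hypothetical relay log: the server cannot read the encrypted payloads,
# but it still observes who sends packets to whom, when, and how large.
relay_log = [
    # (sender, recipient, UTC timestamp, payload size in bytes)
    ("alice", "bob",    "2024-08-25T09:01:10", 412),
    ("alice", "bob",    "2024-08-25T09:01:42", 388),
    ("carol", "dave",   "2024-08-25T11:30:05", 1024),
    ("alice", "bob",    "2024-08-25T23:55:00", 402),
    ("alice", "clinic", "2024-08-26T08:15:00", 996),
]

# The social graph falls out of the log directly: who talks to whom, how often.
contacts = Counter((s, r) for s, r, _, _ in relay_log)
print(contacts.most_common(1))  # [(('alice', 'bob'), 3)]

# Timing patterns narrow things further, e.g. late-night communication.
late_night = [
    (s, r) for s, r, ts, _ in relay_log
    if datetime.fromisoformat(ts).hour >= 22
]
print(late_night)  # [('alice', 'bob')]
```

Nothing in this sketch requires breaking encryption: relationships, schedules, and sensitive contacts (note the “clinic” entry) all leak from traffic patterns alone, which is precisely the layer a mixnet is designed to obscure.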
But Telegram perhaps leaves its hundreds of millions of users worldwide in an even more precarious situation. Unlike Signal, which has no commercial revenue, Telegram earns significant revenue from advertising. As we know, the collection, analysis, and sharing of client metadata is a primary means of targeted advertising. The extent to which Telegram collects, centralizes, and uses client metadata is a question we should be asking both ourselves and them.
Targeting privacy: What’s to come
We will have to wait to see how the Durov case plays out. It may well set a dangerous precedent when it comes to policing real privacy technologies under the auspices of combating crime. The 2024 conviction in the Netherlands of Alexey Pertsev, cofounder of the crypto anonymizing tool Tornado Cash, is one vector in this trend. And we should expect others to follow.
As this new legal and policing strategy develops in relation to Web3 tech, as citizens of the web we should be alert in questioning and refusing this false choice: privacy or a world of crime. Privacy can and should be defended against coercion and scaremongering in the name of growing the surveillance state.
But Telegram’s woes cannot be boiled down to its privacy provisions, which are woefully lacking. The real fronts in the battle to defend privacy lie elsewhere.
Private communications apps like Signal and VPNs, and even Telegram, are increasingly blocked in countries like Russia, China, and Venezuela. Building censorship-resistant technology will be crucial in making sure people across the world can access the information and comrades they need to fight oppressive regimes, or simply to live.
So rather than pitting regulators against developers, while the actual criminals roam free, the aim should be to prevent digital vulnerability in the first place. This requires a shift in focus by both regulators and tech CEOs to start prioritizing privacy and security by design and by default.
Telegram, unfortunately, is a day too late.