Why We Actually Need End-to-End Encryption

There is a certain kind of argument that appears every time encryption comes up.

Yes, yes, privacy is lovely. But think of the children!!!

And just like that, the conversation is over. Because once someone has wheeled in children, terrorists, organised crime, and a shadowy man in a basement who definitely has a beard, anyone asking awkward questions about privacy looks like they personally run a fan club for villains.

Which is a shame. Because encryption is one of those subjects where people are invited to have very strong opinions without anyone first bothering to explain what it actually does.

So let’s start there.


When people talk about end-to-end encryption in chat apps like WhatsApp or Signal, the concept is fairly simple. Your message is scrambled on your device and only unscrambled on the recipient’s device. The service carrying it cannot read the contents on the way through.

That is different from what many service providers would quite like, where they carry your messages, store your messages, scan your messages, profile your messages, and then assure you they take your privacy very seriously while monetising your deepest, darkest thoughts.

Without end-to-end encryption, your messages are postcards. They may travel through a sophisticated system, but the people handling them can still read what is on the back. With end-to-end encryption, the service is transporting a locked box without the key.
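To make the locked-box idea concrete, here is a toy sketch in Python using the PyNaCl library. This is not the actual protocol used by WhatsApp or Signal, which layer on key verification, forward secrecy and a great deal more; it just shows the basic shape, where each device holds its own private key and only the matching device can open the box.

    from nacl.public import PrivateKey, Box

    # Each device generates its own keypair; private keys never leave the device.
    alice_key = PrivateKey.generate()
    bob_key = PrivateKey.generate()

    # Alice locks the message with her private key and Bob's public key.
    sending_box = Box(alice_key, bob_key.public_key)
    ciphertext = sending_box.encrypt(b"do we need milk?")

    # The service in the middle only ever handles ciphertext,
    # which it cannot read without Bob's private key.

    # Bob unlocks it with his private key and Alice's public key.
    receiving_box = Box(bob_key, alice_key.public_key)
    plaintext = receiving_box.decrypt(ciphertext)  # b"do we need milk?"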

Many people will shrug a bit and ask why they should care.

After all, most of us are not discussing plans to overthrow a government or smuggle plutonium. We are mostly sending things like “on my way,” “do we need milk?” and photographs of cats.

And to be fair, encryption will not protect you from everything. If your spouse knows your PIN, your child can unlock your phone with your sleeping face, or you have backed everything up somewhere insecure, family privacy is already hanging by a thread. If you have been hit by spyware, all bets are off.

So what is the actual threat model?

“Threat model” is something a consultant would usually say before invoicing you for a slide deck, and it just means, “who are you actually trying to keep out?”

For most people, the answer is not their mum. It is usually some mix of service providers that would otherwise have access to everything you say, data brokers and advertisers who want another rich stream of information to profile you, and governments that want broad access now, or may want it later when the political mood shifts.

That last one matters more than most people realise.


Rights are easy to support when they protect someone boring. It is when they protect someone inconvenient that people get emotional.

Nobody wakes up in a stable democracy thinking “thank goodness I have the right to private conversation, as I plan to become a dissident by teatime.” Because rights are not built for the days when everything is fine. They are built for the days when everything goes sideways.

This is why “if you have nothing to hide, you have nothing to fear” has always been terrible advice.

You close your curtains. You do not hand your medical records to strangers in the street. You do not send your family group chat to your employer for a laugh. Privacy is not evidence of guilt. It is just a normal human boundary.

More importantly, what counts as suspicious is never fixed. It changes with politics, culture, and power. A protest can become unlawful. A religion can become suspect. A health decision can become criminalised. A minority group can become a convenient scapegoat. History is littered with examples of information that looked harmless at the time and then became very useful to people with ugly intentions later.


Encryption isn’t a magic cloak of invisibility. It does not solve metadata. It does not stop companies collecting revealing information around the edges. It does not stop screenshots, backups, or people leaking their own info.

But partial protection is still protection.

A front door lock does not make your house invincible. It just means random people cannot casually wander in.

E2E encryption works the same way. It places an important limit on who can access the content of your conversations at scale. It stops the platform from simply helping itself. It means governments cannot just ask the provider for everyone’s message content as a matter of routine. It raises the cost of surveillance and data exploitation.


Law enforcement is not wrong to say encryption causes them problems.

Strong encryption makes some investigations harder. European policy papers and justice agency reports say exactly that. Accessing evidence on encrypted devices is difficult, inconsistent, expensive, and time-sensitive. Criminals absolutely do use encrypted services. Operations against EncroChat and Sky ECC led to major arrests and seizures. American agencies argue that wider encrypted messaging reduces the child exploitation material that platforms can proactively detect and report.

None of this is invented. If you are a police officer trying to rescue an abused child or dismantle a violent network, “sorry, privacy won” is not an acceptable outcome.

The argument then runs that because encryption makes some investigations harder, we need exceptional access, lawful access, special access, technical access, or whatever the branding department is calling a backdoor this week.

A backdoor is not a mystical portal that opens only for the good people, under moonlight, after paperwork. It is a deliberate weakness built into a system that was previously designed not to have one.

This is why technologists and rights groups react so strongly to backdoor proposals. Not because they are indifferent to crime. Because you cannot make encryption weak only for the villains. If you build a method for exceptional access, you have created a new attack surface.

If Britain demands access, why not India? Why not Saudi Arabia? Why not any government that can point to crime, extremism, child protection, or national security and insist this power will obviously only ever be used responsibly?

The problem with building surveillance capability for democracies is that authoritarians also watch the product demos.


This is where “think of the children” becomes both potent and deeply frustrating.

Child protection is real. Child exploitation is real. People who exploit children should be found, stopped, prosecuted, and thrown into the middle of the sea.

But “think of the children” is also an effective way of bulldozing a complicated debate into a false binary, where you are either in favour of privacy or in favour of safety.

In reality, children also need privacy. Abuse victims need privacy. Teenagers need private space. Families escaping domestic violence need secure communications. Journalists, whistleblowers, activists, and people living under authoritarian systems need ways to speak without the state, or the platform, listening in.

This matters even more in countries where speaking against the government can get you arrested. Where religious or ethnic minorities are under pressure. Where simply being the wrong kind of person in the wrong political moment can turn an ordinary private life into a dangerous one.


So where does that leave law enforcement?

Not powerless. Just inconvenienced. Which, if we are being honest, is often what this debate is really about.

There are other tools. Device seizure. Digital forensics. Metadata analysis. Undercover work. Informants. Financial records. Cloud data where it exists. Targeted hacking of suspects’ devices with proper legal authority. User reports. Traditional investigative work that involves actual effort rather than demanding everyone’s messaging system be redesigned around state convenience.

None of these are perfect. But they are better than weakening the privacy of entire populations just in case it proves useful later.

There is also a practical problem. If mainstream services are forced to weaken their encryption, serious criminals will not say “fair enough, lads, looks like we’re done for.” They will migrate to other tools, roll their own, use niche platforms, or move to harder targets. The people left with weaker protection will mostly be ordinary users, businesses, activists, and everyone else who did not ask to become collateral damage in a policy argument.


So why do we actually need end-to-end encryption on messaging services?

Not because every message is a state secret, or because everyone is hiding something scandalous from their spouse. It is because private conversation matters as a social norm, not just a specialist requirement. Platforms should not automatically be able to read everything people say. Corporate incentives are awful, government incentives are not always much better, and information collected “just in case” has a nasty habit of getting repurposed later.

You don’t always know in advance which parts of your life will become sensitive. Today it is dull chatter. Tomorrow it might be a health issue, a family crisis, a political opinion, or a message sent from the wrong side of a border. Privacy is mostly there for the ordinary moments. That is what makes it so easy to undervalue right up until you actually need it.

E2E is not sacred or beyond criticism. It does create real problems for some investigations, and anyone pretending otherwise is being silly. But it is not some dangerous indulgence for tech romantics either. It is a sensible boundary in a world where too many institutions would quite like a permanent seat inside your private life.

The debate, if we are going to have it properly, should not be framed as privacy versus children, or encryption versus safety. It should be asking the right questions. What powers should the state have? What powers should companies have? What risks are we prepared to impose on everyone to make some investigations easier? And are we building systems for today’s reasonably decent government, or for tomorrow’s less decent one?