Facebook's Message Encryption Was Built to Fail

The chat between a teen and her mom about an alleged abortion helped police build their case. Default end-to-end encryption could help others avoid the same fate.

The details are chilling. Police raiding a home, a teenager and her mother arrested, fetal remains exhumed from a rural burial plot. When police dragged off a 17-year-old Nebraska girl and charged her and her mother with self-administering a miscarriage, they were armed with damning documents they could only access through the incompetence and cooperation of Meta.

The intimate conversation between a mother and daughter in the days surrounding an alleged abortion was just one of the millions logged by Facebook every day, but for this family it will be devastating. After police obtained a warrant for the girl’s Facebook data, they used the information the company provided to apply for a second search warrant to raid her home. The application for that warrant included quotes from the pair’s Messenger conversation, such as “Are we starting it today?” and “Ya the 1 pill stops the hormones…u gotta wait 24 HR 2 take the other.” Perhaps most damning of all, the closing remark: “remember we burn the evidence.”

Search warrants require probable cause: particularized evidence showing that law enforcement will likely find further evidence at the place to be searched. In this case, police could demonstrate the value of searching the girl’s home, in part, because of the records they received from Facebook. That second warrant allowed them not only to search the family’s home but also to collect electronic devices, medications, and other records. The mother and daughter now both face criminal charges.

In this case, like so many others, Facebook was an early target for an investigation. The ubiquitous platform is relied upon by billions of users globally, a repository of countless fleeting and self-incriminating thoughts. While users know that anything they post publicly on Facebook is likely visible to the world, including to law enforcement, Messenger creates a false sense of privacy. Right now, the overwhelming majority of messages are unencrypted, visible to Meta staff and anyone with a valid warrant. And that’s by design—Facebook built its message encryption feature to fail.

In the aftermath of the Dobbs decision, Zuckerberg promised employees that Meta would use encryption to “keep people safe.” But the reality is that it does nothing of the sort. While this one investigation in Nebraska was launched before the Dobbs decision came down, post-Roe abortion policing will only accelerate, along with growing numbers of warrants to Meta. 

Currently, Messenger supports encryption, but only if users opt in. This isn’t an easy, one-time toggle—it’s an agonizing process to plod through for each and every person you communicate with. And once you do manage to opt in to this “secret conversations” feature, Facebook will create a new message thread, meaning you fracture your messaging history and split every conversation or group in two. It’s a giant mess.

Worse yet, opting in to encrypted conversations now does nothing to protect months and years of past messages. Meta has created so many barriers that the vast majority of messages will be completely exposed. And even once encryption is set up, it’s easy to accidentally revert to unencrypted chats. Considering Meta is built on dark patterns and the subtle use of highly engineered products to shape user behavior, it’s clear the company doesn’t actually want user conversations to be encrypted.

Even when Meta does make encryption easy, as on the ubiquitous WhatsApp chat app, it finds ways to undermine it. Unlike Messenger, every conversation on WhatsApp is encrypted. But that only means Meta can’t directly read messages in transit. Instead, the company can circumvent the safeguard by reviewing copies of messages that users decrypt and flag through the app’s reporting feature.

Zuckerberg doesn’t want his company to be seen as the long arm of abortion laws, but that’s exactly what it will continue to be if Meta doesn’t act. The solutions aren’t hard; the leadership is. It doesn’t help that police and surveillance advocates want Meta to continue to mine our conversations for illegal content.

Advocates like the nonprofit Thorn claim secure communications would “threaten child safety gains.” For these encryption opponents, a private internet is a threat to public safety. But opponents’ “stranger danger” encryption fears have been grossly overblown for years, and the much more potent danger comes from these groups’ persistent opposition to effective encryption. In a post-Roe America, all of the terrifying surveillance powers that were misused by police in the name of protecting children will now be retargeted in the name of “unborn children.”

For firms like Meta that have put their users at risk by holding on to too much data for too long, the solution is clear: Encrypt it or delete it. But their obligations don’t stop there. In Nebraska, only nine days passed between when police demanded Facebook records and when they cited them as evidence to raid a home.

Today, it is cheap and effortless for police to commandeer our Facebook files and use them against us. In this case, it seems likely that the warrant to Meta made it clear, even if it did not say so explicitly, that police were investigating abortion care. (Nebraska law requires a warrant to include the “substance of the accusation.”) Rather than simply handing over everything police sought, Meta could have fought them in court, leveraging the army of lawyers it usually reserves for fighting off regulation. Of course, Meta might have eventually lost and found itself forced to comply anyway, but the least such tech companies could do is make this tactic an expensive one for law enforcement to pursue.

Meta, like so many other tech firms, keeps saying it wants to protect users. The question is whether we should believe words or actions.