


She got the first big thing right: protecting our technology industry and all Americans from the Luddite demands of Britain’s Labour government.
To my mind, the Office of the Director of National Intelligence (ODNI) should not exist. Like the Department of Homeland Security, it arose out of Congress’s natural political impulse, following the 9/11 atrocities, to appear to be doing . . . something. Lawmakers diagnosed that the principal government dereliction had been the failure to ensure that intelligence was shared across agencies. But the main obstruction in that regard had been the reckless Justice Department policy of preventing intelligence agents and criminal investigators from communicating. Most of that problem could be remedied by reversing the policy (which was done). More to the point, if the illness is lack of coordination across 17 agencies of what’s known as the U.S. intelligence “community,” the cure is not to add an 18th bureaucracy.
For that reason, while I can’t claim to have been enthusiastic when President Trump nominated Hawaii’s Tulsi Gabbard, the former Democratic representative, to head up ODNI, neither was I a full-throated opponent. It’s not exactly that I disagreed with our editorial’s detailed indictment of Gabbard’s lapses in judgment. It’s that I emphatically agreed with the editorial’s first sentence that “the job is not nearly as important as it sounds” — although it is more important than it should be because it exists. (I know Elon Musk is a Tulsi fan, and I’m sure there are necessary executive tasks she’d be quite good at; but if he and his DOGE-ies really want to pare down Leviathan, ODNI would have been a good place to start.)
Today, though, I come to praise Tulsi, not to bury her: Congratulations to Director Gabbard for getting the first big thing right: protecting our technology industry and all Americans from the Luddite and privacy-busting demands of Britain’s Labour government.
As the Washington Post reports, the Home Office (the U.K.’s interior ministry) ordered Apple to fashion a “backdoor” that would allow investigators to defeat encryption in its Advanced Data Protection cloud-storage product. Encryption, of course, renders data nearly impossible to decipher without the key. In that regard, it frustrates hackers and governments equally. It is thus a technology that makes Apple’s communications products popular with consumers who prize their privacy and security.
Not only was the Home Office directive outrageous on its face; the Brits also barred Apple from disclosing the order to Congress and U.S. officials (among other third parties). In sum, our ally is undermining American security and privacy rights based on an ill-conceived perception of its own security interests — while hoping we wouldn’t find out about it.
We’ve seen this sort of thing before. To its credit, Apple does not build backdoors into its products, and it has fought even our own government on occasions when it has made similar demands. True to that principle, the tech giant said it would make its Advanced Data Protection cloud-storage product unavailable to customers in the U.K. — although that may not be enough to satisfy British investigators unless our own government comes to Apple’s aid . . . as Gabbard appears poised to do.
For what it’s worth, my long-held views in support of Apple’s stance are unpopular among many national security–minded conservatives with whom I tend to agree on an array of subjects. But I continue to believe that this presumptuousness on the part of Western democracies — who’ve allowed the need for strong counterterrorism measures to tempt them into security-state practices — is a betrayal of what makes us Western democracies, and a benighted one at that.
To begin with, if encryption is any good, the blithe assumption that it can readily be defeated by a backdoor should be rethought. A basic objective of encryption is effective resistance to penetration efforts, and modern ciphers are designed so that their workings are entirely public; all of the secrecy resides in the key (cryptographers call this Kerckhoffs’s principle). It doesn’t necessarily follow that one who designs an encryption algorithm can then decrypt the data to which it is applied.
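To make that concrete, here is a minimal sketch in Python, assuming the widely used cryptography package (the specific library is my choice for illustration, not anything Apple uses). The cipher’s design is published for all the world to see; without the key, the ciphertext stays opaque even to those who know the algorithm best:

```python
# A minimal sketch, assuming the widely used "cryptography" package
# (pip install cryptography). The Fernet scheme (AES-based) is fully
# published; knowing how it works does not let anyone decrypt.
from cryptography.fernet import Fernet, InvalidToken

key = Fernet.generate_key()            # the secret lives here, not in the algorithm
ciphertext = Fernet(key).encrypt(b"attack at dawn")

# The rightful key holder decrypts easily.
assert Fernet(key).decrypt(ciphertext) == b"attack at dawn"

# Everyone else -- including the algorithm's designers -- gets nothing.
try:
    Fernet(Fernet.generate_key()).decrypt(ciphertext)
except InvalidToken:
    print("Wrong key: the ciphertext stays opaque.")
```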
Even if you can solve that challenge, there is no “white hat” backdoor. Any key that proves capable of decryption is going to work for the guys in the black hats, too. Backdoors would inevitably make any encryption service vulnerable to hacking. And one need only skim the newspapers for what sometimes seems like daily reports of Chinese hacking into supposedly super-secure American government data systems to appreciate that no backdoor built for government investigators is going to be safe from cybercriminals.
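The point is mechanical, not rhetorical: a backdoor is just a key, and keys don’t check badges. A hedged sketch of a naive key-escrow scheme (same illustrative library as above; the design is hypothetical, not any real product’s) shows that the investigator and the thief who steals the escrowed master key run the exact same code path:

```python
# Hedged illustration of a naive key-escrow "backdoor" (a hypothetical
# design). Each user key is wrapped under one escrowed master key so
# that investigators can unwrap it on demand.
from cryptography.fernet import Fernet

master_key = Fernet.generate_key()                    # the "lawful access" key
user_key = Fernet.generate_key()
escrow_record = Fernet(master_key).encrypt(user_key)  # backdoor copy of the user key

ciphertext = Fernet(user_key).encrypt(b"private medical records")

def unwrap_and_read(master: bytes) -> bytes:
    """Decrypt the user's data for whoever holds the master key."""
    recovered_user_key = Fernet(master).decrypt(escrow_record)
    return Fernet(recovered_user_key).decrypt(ciphertext)

# Nothing in the math distinguishes the investigator from the hacker
# who exfiltrates the master key: one breach exposes every record.
print(unwrap_and_read(master_key))  # works for the white hats . . .
stolen = master_key                 # . . . and identically for the black hats
print(unwrap_and_read(stolen))
```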
Investigators overcome problematic technology with better technology. If you’re worried about terrorists using encrypted communications — and who isn’t? — then the government has to get better at decryption, not stifle innovation. Scientific and technological progress can’t call a time-out while our agencies try to catch up. If you think it can, you’re ceding ground to regimes that harbor no such self-delusion, many of them hostile.
Terrorist use of encryption doesn’t happen in a vacuum. Nothing does. There are always tradeoffs. All of our lives are now in cyberspace, to say nothing of all the government’s security arrangements. If they weren’t, we couldn’t use artificial intelligence and other digital technology to improve our security. The tradeoff: The technology that allows us to protect our privacy, our financial records, and our defense and intelligence operations can also be used by bad actors to facilitate their activities. That’s why we have to maintain a broad spectrum of investigative tools to use against them — we can’t rely, and never have relied, solely on electronic surveillance. If we undermined encryption because terrorists and criminals might use it, we’d paradoxically make ourselves more vulnerable to terrorists and criminals.
Moreover, while many government agents chafe at this, the fact is that privacy — very much including privacy from the prying eyes of government — is popular. The public in a free society gets to balance the unavoidable tension between liberty and security, based on what the threat environment happens to be at a given time. Often, citizens want less security than government officials think is good for them. It’s not that they want to be vulnerable to terrorism; it’s that they’re wary of government-proposed solutions that could, in the aggregate, be worse than the threat. People can rationally decide that the privacy of their communications and data is worth more to them than the marginal increase in safety that might flow from compromising encryption so that terrorists can be more comprehensively monitored. In our free, constitutional republic, that’s a choice they get to make.
As a prosecutor, I grew up on wiretap cases, starting with organized-crime cases (including international syndicates) and ending up in national security probes of terrorist networks that had operatives embedded in America but were mainly based in overseas safe havens (in rogue states or ungoverned badlands). The history of technologically enhanced surveillance methods is interesting. Those methods evolved not just because of advances in capability; nearly as important was the cat-and-mouse game between police and criminals, or between intelligence officers and agents of foreign powers. Surveillance improves out of necessity, because the bad guys figure out ways to detect and circumvent it.
This dynamic, in a society in which Fourth Amendment liberties constrain government, unfolded under the principle that the police require probable cause before they may search and seize (which includes intercepting communications). The state doesn’t — or, at least, didn’t — get to seize first, then search, then figure out whether it had probable cause.
In the 1990s, terrorism and technology turned this world on its head. Terrorism eventually transformed federal law enforcement from a rule-of-law framework, in which people are presumed innocent and investigators methodically build cases after crimes have occurred, into an intelligence mindset, in which the goal is not to manage crime problems but to prevent bad things from occurring — a mindset that inexorably erodes due process. I’m often asked, “What happened to the FBI?” Well, that’s what happened.
During the same time frame in which the terrorist threat surged, there was a telecom revolution. Digital technology suddenly made it possible for government to vacuum up mountains of data, including communications, instantly and — significantly — without discrimination.
But see, investigation had always been precisely about discrimination, about establishing a basis for suspicion before targeting and gathering up potentially incriminating data. Now, however, it was much easier to gather up everything and sort it out later: Seize first, then come up with a lawful basis to search through the database. And while that might have provoked serious consideration of whether what we were now capable of doing could be squared with the Fourth Amendment and American expectations about privacy, the terrorist threat muffled such concerns. After all, the objective was to forfend attacks and strangle jihadist cells in the cradle, if possible. You can worry about that, or you can worry about due process; it’s really hard to worry about both concurrently. That’s why intelligence and policing, while equally essential, are discrete skill sets.
Even after 30 years, I don’t think we’ve landed in a comfortable place regarding all this. I don’t think we’ve really grappled with the ramifications of how technological capabilities and a mass-casualty threat mosaic have altered the way we think about what our government should be empowered to do and how much privacy — if any — we should forfeit toward that end.
But I do know one thing: We don’t want foreign governments, even our closest friends, dictating terms to American companies — terms that will affect the highest interests of American citizens. We have to sort out liberty and security on American terms.
Don’t get me wrong. I want our intelligence agencies to aggressively target overseas terrorist havens and operations. Unlike my colleagues in the aforementioned editorial (and apparently unlike Tulsi Gabbard in her new incarnation), I am not a fan of Section 702, which regulates the manner in which our government collects data on foreigners outside the U.S. That’s not because I am terribly worried about privacy; to the contrary, I think that the FISA system is misconceived, that it unduly hampers foreign intelligence operations, and that the notion of having a court, rather than Congress, oversee executive intelligence operations is nuts. Nevertheless, law enforcement and intelligence monitoring that affects the legitimate privacy expectations of Americans at home, where the Constitution reigns, is a different thing entirely.
Director Gabbard told Congress she was astonished by the British demand for a backdoor. Such technology “would allow access to Americans’ personal encrypted data,” she wrote, and “would be a clear and egregious violation of Americans’ privacy and civil liberties, and open up a serious vulnerability for cyber exploitation by adversarial actors.” That’s a worthy start. I was an investigator, so I am not without sympathy for what investigators want. But the crucial question is what the American people want.