What the U.K. Wants from Apple Will Make Our Phones Less Safe


Last month, the U.K. government demanded that Apple weaken the security of iCloud for users worldwide. On Friday, Apple took steps to comply for users in the United Kingdom. But the British law is written in a way that requires Apple to give its government access to the data of anyone, anywhere in the world. If the government demands that Apple weaken its security worldwide, it will increase everyone’s cyber-risk in an already dangerous world.

If you’re an iCloud user, you have the option of turning on something called “advanced data protection,” or ADP. In that mode, a majority of your data is end-to-end encrypted. This means that no one, not even anyone at Apple, can read that data. It’s a restriction enforced by mathematics—cryptography—and not policy. Even if someone successfully hacks iCloud, they can’t read ADP-protected data.
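As an illustration of "enforced by mathematics, not policy," here is a minimal sketch of client-side (end-to-end) encryption in Python. It is not Apple's ADP implementation; it assumes the third-party Python package cryptography is installed, and the key handling is deliberately simplified. The point is that a provider which stores only ciphertext, and never the key, has nothing readable to hand over, whether to a hacker or to a government.

# Minimal sketch of client-side (end-to-end) encryption; NOT Apple's ADP.
# Assumption: the third-party "cryptography" package is installed.
from cryptography.fernet import Fernet

# The key is generated and kept on the user's device only.
# A cloud provider that never sees this key stores nothing it can decrypt.
device_key = Fernet.generate_key()
cipher = Fernet(device_key)

note = b"Private note synced to the cloud"

# What the provider actually stores: opaque ciphertext.
ciphertext = cipher.encrypt(note)

# Only the key holder (the user's device) can recover the plaintext.
assert cipher.decrypt(ciphertext) == note

# Anyone without device_key, whether the provider, a hacker, or a government,
# sees only random-looking bytes.
print(ciphertext[:40], "...")

A "back door" in this scheme would mean a second way to decrypt without the device key, for example a copy of every key held by the provider; whoever obtains that copy, lawfully or not, can read everything.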

Using a controversial power in its 2016 Investigatory Powers Act, the U.K. government wants Apple to re-engineer iCloud to add a “back door” to ADP. This is so that if, sometime in the future, U.K. police wanted Apple to eavesdrop on a user, it could. Rather than add such a back door, Apple disabled ADP in the U.K. market.

Should the U.K. government persist in its demands, the ramifications will be profound in two ways. First, Apple can’t limit this capability to the U.K. government, or even only to governments whose politics it agrees with. If Apple is able to turn over users’ data in response to government demand, every other country will expect the same compliance. China, for example, will likely demand that Apple out dissidents. Apple, already dependent on China for both sales and manufacturing, won’t be able to refuse.

Second: Once the back door exists, others will attempt to surreptitiously use it. A technical means of access can’t be limited to only people with proper legal authority. Its very existence invites others to try. In 2004, hackers—we don’t know who—breached a back-door access capability in a major Greek cellphone network to spy on users, including the prime minister of Greece and other elected officials. Just last year, China hacked U.S. telecoms and gained access to the systems those companies use to wiretap cellphone users, possibly including the presidential campaigns of both Donald Trump and Kamala Harris. That operation resulted in the FBI and the Cybersecurity and Infrastructure Security Agency recommending that everyone use end-to-end encrypted messaging for their own security.

Apple isn’t the only company that offers end-to-end encryption. Google offers the feature as well. WhatsApp, iMessage, Signal, and Facebook Messenger offer the same level of security. There are other end-to-end encrypted cloud storage providers. Similar levels of security are available for phones and laptops. Once the U.K. forces Apple to break its security, actions against these other systems are sure to follow.

The U.K. is almost certainly coordinating its actions with the other “Five Eyes” countries of the United States, Canada, Australia, and New Zealand: the rich English-language-speaking spying club. Australia passed a similar law in 2018, giving it authority to demand that companies weaken their security features. As far as we know, that law has never been used to force a company to re-engineer its security, but since it allows for a gag order, we might never know. The U.K. law has a gag order as well; we only know about the Apple demand because a whistleblower leaked it to the Washington Post. For all we know, the U.K. may have made the same demand of other companies. In the United States, the FBI has long advocated for the same powers, and the U.K. making this demand now, while the world is distracted by the foreign-policy turmoil of the Trump administration, might be exactly the opening it has been waiting for.

The companies need to resist, and, more importantly, we need to demand that they do. The U.K. government, like the Australians and the FBI in years past, argues that this type of access is necessary for law enforcement: that it is “going dark” and that the internet is a lawless place. We’ve heard this kind of talk since the 1990s, but the scant evidence offered for it doesn’t hold water. Decades of court cases involving electronic evidence show again and again that police collect evidence through a variety of means, most of them, like traffic analysis or informants, having nothing to do with encrypted data. What police departments need are better computer investigative and forensics capabilities, not back doors.

We can all help. If you’re an iCloud user, consider turning this feature on. The more of us who use it, the harder it is for Apple to turn it off for those who need it to stay out of jail. It also puts pressure on other companies to offer similar security. And it helps those who need it to survive, because when enough people enable the feature, doing so can no longer be read as a de facto admission of guilt. (This is a benefit of using WhatsApp over Signal: since so many people in the world use WhatsApp, having it on your phone isn’t in itself suspicious.)

On the policy front, we have two choices, because we can’t build security systems that work for some people and not others. We can either make our communications and devices as secure as possible against everyone who wants access, including foreign intelligence agencies and our own law enforcement, which protects everyone, including (unfortunately) criminals. Or we can weaken security for everyone, the criminals’ as well as our own.

It’s a question of security vs. security. Yes, we are all more secure if the police are able to investigate and solve crimes. But we are also more secure if our data and communications are safe from eavesdropping. A back door in Apple’s security is not just harmful on a personal level; it’s harmful to national security. We live in a world where everyone communicates electronically and stores their important data on a computer. These computers and phones are used by every national leader, member of a legislature, police officer, judge, CEO, journalist, dissident, political operative, and citizen. They need to be as secure as possible: from account takeovers, from ransomware, from foreign spying and manipulation. Remember that the FBI recommended that we all use back-door-free end-to-end encryption for messaging just a few months ago.

Securing digital systems is hard. Defenders must defeat every attack, while eavesdroppers need only one attack that works. Given how essential these devices are, we need to adopt a defense-dominant strategy. To do anything else makes us all less safe.