Financial services rest on the twin, interrelated pillars of security and privacy. And while we can have security without privacy, we cannot have privacy without security.
Real Security And Real Privacy
One of Apple’s excellent security features is Advanced Data Protection (ADP) for iCloud, which offers the company’s highest level of cloud data security. Apple has just announced that this is no longer available to new users in the United Kingdom.
Why does everyone else get security but we Brits don’t? Well, it appears that the British government have ordered Apple (and others) to provide a “back door” into data so that law enforcement and security services — after obtaining a warrant that is approved by a judge — can tap into iPhone back-ups and other cloud data that is otherwise inaccessible, even to Apple itself. This is an extraterritorial power, meaning UK law enforcement would have been able to access the encrypted iCloud data of Apple customers anywhere in the world, including in the US. No wonder it is causing some controversy, to say the least.
What is going on here? Well, to begin at the beginning, the claim that criminals hide illegal material by encrypting it is certainly true. The question is whether a back door into that data will make us all safer. Is that certainly true?
The answer is no, and I can try to explain why by starting with cod. The fish. My American readers have probably never heard of the “Cod Wars” between Britain and Iceland. If you haven’t, I highly recommend Mark Kurlansky’s excellent 1997 book Cod: A Biography of the Fish That Changed the World. Within its pages is a lovely story of the never-ending struggle between security and new technology that provides a useful backdrop to current events: in particular, the Chinese hacking of the American telecommunications infrastructure.
The Anglo-Danish Convention of 1901 gave the British permission to fish up to three miles from the coast of Iceland, a state of affairs that the volcanic colony was most unhappy about. By the late 1920s, the Icelandic Coast Guard had started to arrest British (and German) trawlers found within what it saw as its territorial waters. However, the British trawlers got smart and became harder to catch: from 1928 they were equipped with radio and began passing messages between themselves in secret codes, alerting each other when Coast Guard vessels were in or out of harbour.
In an early example of governments attempting to legislate new communications technology, the plucky Icelanders made it illegal to send coded wireless messages. This had no impact whatsoever, of course: British seafood pirates simply devised new code systems for the trawlers to use. Think about it: how on Earth would an Icelandic wireless operator know whether “Grandma is home” was a coded message or gibberish?
The moral of the story is, as they used to say over at the EFF, that when cryptography is outlawed, only outlaws use cryptography. The Icelandic authorities could ban coded wireless transmissions, but the bad guys (in this instance, us) ignored the law and operated successfully beyond it. What defeated us in the long run was intelligence and economics (the plucky Icelanders developed net cutters, so they could follow the trawlers and send their very expensive fishing nets to the bottom of the Atlantic), not the ban on coded wireless transmissions.
Making our messages open to access by the government, however well-meaning the advocates of the back door, makes them open to terrorists as well. This is not a hypothetical concern. In a briefing with reporters about the breach of US phone companies by the Chinese state-sponsored espionage hackers known as “Salt Typhoon”, officials were explicit that Americans should use encryption. Jeff Greene, CISA’s executive assistant director for cybersecurity, put it plainly:
Encryption is your friend, whether it’s on text messaging or if you have the capacity to use encrypted voice communication.
(The agencies did not name any specific encryption apps, but both Signal and WhatsApp, for instance, end-to-end encrypt calls and texts.)
The data breach exposed calls and texts (including, apparently, from within last year’s presidential campaigns). This is incredibly valuable information. Simply knowing who is calling who is enough to give foreign secret services (or investment scammers) the ability to attack Americans in many different ways.
So how did the hackers get hold of this vast haul of traffic from internet service providers whose customers include businesses large and small and millions of Americans, in what one senator called “the worst telecom hack in our nation’s history”? Well, according to the Wall Street Journal, the hackers gained access to “network infrastructure used to cooperate with lawful U.S. requests for communications data”.
In other words, the government made them put in a back door and the hackers walked through it. When the secret key, back-door code or whatever it may be is eventually leaked to the bad guys, the good guys are left defenseless.
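The structural problem is easy to sketch. In the toy scheme below (illustrative only: XOR wrapping stands in for a real cipher, and all the names are hypothetical), every user has a distinct key, yet a single mandated escrow key can unwrap every message. Leak that one key and everyone's traffic is readable.

```python
import secrets

def xor_bytes(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def encrypt(message: bytes, user_key: bytes, escrow_key: bytes):
    # Toy scheme: encrypt with a fresh random session key, then wrap
    # that session key twice -- once for the user, and once under the
    # government-mandated "escrow" (back-door) key.
    session_key = secrets.token_bytes(len(message))
    ciphertext = xor_bytes(message, session_key)
    wrapped_for_user = xor_bytes(session_key, user_key[:len(session_key)])
    wrapped_for_escrow = xor_bytes(session_key, escrow_key[:len(session_key)])
    return ciphertext, wrapped_for_user, wrapped_for_escrow

def decrypt_with_escrow(ciphertext, wrapped_for_escrow, escrow_key):
    # Anyone holding the escrow key can recover ANY session key.
    session_key = xor_bytes(wrapped_for_escrow, escrow_key[:len(wrapped_for_escrow)])
    return xor_bytes(ciphertext, session_key)

# Every user has a distinct key...
alice_key = secrets.token_bytes(64)
bob_key = secrets.token_bytes(64)
# ...but ONE escrow key covers them all.
escrow_key = secrets.token_bytes(64)

ct1 = encrypt(b"alice's secret", alice_key, escrow_key)
ct2 = encrypt(b"bob's secret", bob_key, escrow_key)

# If the escrow key leaks, every message is readable, regardless of
# whose key it was originally encrypted under.
print(decrypt_with_escrow(ct1[0], ct1[2], escrow_key))  # b"alice's secret"
print(decrypt_with_escrow(ct2[0], ct2[2], escrow_key))  # b"bob's secret"
```

The point is not the (deliberately weak) cipher but the shape of the system: the escrow key is a single point of failure shared across the whole user base, which is exactly what Salt Typhoon-style attackers go looking for.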
Security For All
This is not only about Apple and it is not only about the UK. Sweden, for example, has recently joined the list of governments considering legislation to make it mandatory for the likes of Signal, WhatsApp, and iMessage to create an encryption back door in their software. The President of Signal, Meredith Whittaker, has repeatedly said that Signal will withdraw its service from countries rather than comply with such demands. WhatsApp has said the same thing about its operations in India: it would be forced to leave the country if the government mandated the breaking of end-to-end encryption.
We all want to be protected. While it sounds reasonable to give the government access to data in order to track down bad guys, the unfortunate truth is that forcing back doors into systems does more harm than good. On this, I have to say I agree with Silkie Carlo and her team at Big Brother Watch in the UK:
No matter how this is framed, there is simply no such thing as a ‘back door’ that can be limited only to criminals or that can be kept safe from hackers or foreign adversaries. Once encryption is broken for anyone, it’s broken for everyone.
The bad guys will simply use other techniques to encrypt their data, while the data of normal people will end up in the hands of the bad guys, who will use it for push payment fraud, account takeovers, pig butchering and scams of all kinds.
When it comes to the fundamentals of security, I think that on this Big Tech is right and Big Gov is wrong.