News

The Regime in Washington Can’t Be Trusted With Our Data

For more reasons than you can imagine — but especially because of multiracialism and Jews

WIRED MAGAZINE TELLS us that a solution may have been found that would allow the government to access private data on smartphones without compromising everyone’s privacy:

A few months after the San Bernardino shooting, President Obama sat for an interview at the South by Southwest conference and argued that government officials must be given some kind of shortcut — or what’s known as exceptional access — to encrypted content during criminal and antiterrorism investigations. “My conclusion so far is that you cannot take an absolutist view on this,” he said. “If the tech community says, ‘Either we have strong, perfect encryption or else it’s Big Brother and an Orwellian world’ — what you’ll find is that after something really bad happens, the politics of this will swing and it will become sloppy and rushed, and it will go through Congress in ways that have not been thought through. And then you really will have dangers to our civil liberties.”

The unstated problem with giving the government “back door” keys is that governments leak secrets all the time. Organized crime and foreign powers both send secret agents to work for our government’s departments and its civil service. We’re especially vulnerable to foreign espionage for two reasons: (1) We’re “multicultural” and can’t recognize potential spies on sight; and (2) we have lots of dual-citizenship Jews among us who are loyal to Israel while merely feigning loyalty to the United States. Sooner or later, one of these would ferret out the government’s back door key, and then the entire population would be threatened with embarrassment, blackmail, financial or physical harm, and so on. I doubt that the government’s secret “back door” would remain secret for even ten years.

* * *

APPENDIX:
Ozzie and the Crypto Wars

ON DECEMBER 2, 2015, a man named Syed Rizwan Farook and his wife, Tashfeen Malik, opened fire on employees of the Department of Public Health in San Bernardino, California, killing 14 people and injuring 22 during what was supposed to be a staff meeting and holiday celebration. The shooters were tracked down and killed later in the day, and FBI agents wasted no time trying to understand the motivations of Farook and to get the fullest possible sense of his contacts and his network. But there was a problem: Farook’s iPhone 5c was protected by Apple’s default encryption system. Even when served with a warrant, Apple did not have the ability to extract the information from its own product.

The government filed a court order, demanding, essentially, that Apple create a new version of the operating system that would enable it to unlock that single iPhone. Apple defended itself, with CEO Tim Cook framing the request as a threat to individual liberty.

“We have a responsibility to help you protect your data and protect your privacy,” he said in a press conference. Then-FBI chief James Comey reportedly warned that Cook’s attitude could cost lives. “I just don’t want to get to a day where people look at us with tears in their eyes and say, ‘My daughter is missing and you have her cell phone — what do you mean you can’t tell me who she was texting before she disappeared?’ ” The controversy over Farook’s iPhone reignited a debate that was known in the 1990s as the Crypto Wars, when the government feared the world was “going dark” and tried — and ultimately failed — to impede the adoption of technologies that could encode people’s information. Only this time, with supercomputers in everybody’s pockets and the endless war on terror, the stakes were higher than ever.

A few months after the San Bernardino shooting, President Obama sat for an interview at the South by Southwest conference and argued that government officials must be given some kind of shortcut — or what’s known as exceptional access — to encrypted content during criminal and antiterrorism investigations. “My conclusion so far is that you cannot take an absolutist view on this,” he said. “If the tech community says, ‘Either we have strong, perfect encryption or else it’s Big Brother and an Orwellian world’ — what you’ll find is that after something really bad happens, the politics of this will swing and it will become sloppy and rushed, and it will go through Congress in ways that have not been thought through. And then you really will have dangers to our civil liberties.”

In typical Obama fashion, the president was leaning toward a compromise, a grand bargain between those who insist that the NSA and FBI need all the information they can get to monitor potential terrorists or zero in on child abusers and those who believe building any sort of exceptional access into our phones would be a fast track to a totalitarian surveillance state. And like so many of Obama’s proposed compromises, this one went nowhere. To many cryptographers, there was simply no way that companies like Apple and Google could provide the government with legal access to customer data without compromising personal privacy and even national security. Exceptional access was a form of technology, after all, and any of its inevitable glitches, flaws, or bugs could be exploited to catastrophic ends. To suggest otherwise, they argued, was flat wrong. Flat-Earth wrong. Which was, as any good engineer or designer knows, an open invitation for someone to prove them wrong.

THIS PAST JANUARY, Ray Ozzie took a train from his home in Massachusetts to New York City for a meeting in a conference room of the Data Science Institute at Columbia University. The 14th-floor aerie was ringed by wide windows and looked out on a clear but chilly day. About 15 people sat around the conference table, most of them middle-aged academics — people from the law school, scholars in government policy, and computer scientists, including cryptographers and security specialists — nibbling on a light lunch while waiting for Ozzie’s presentation to begin.

Jeannette Wing — the host of the meeting and a former corporate VP of Microsoft Research who now heads the Data Science Institute — introduced Ozzie to the group. In the invitation to this “private, informal session,” she’d referenced his background, albeit briefly. Ozzie was once chief technical officer at Microsoft as well as its chief software architect, posts he had assumed after leaving IBM, where he’d gone to work after the company had purchased a product he created, Lotus Notes. Packed in that sentence was the stuff of legend: Notes was a groundbreaking product that rocketed businesses into Internet-style communications when the Internet was barely a thing. The only other person who ever held the chief software architect post at Microsoft was Bill Gates, and Ozzie had also helped create the company’s cloud business.

He had come to Columbia with a proposal to address the impasse over exceptional access, and the host invited the group to “critique it in a constructive way.” Ozzie, trim and vigorous at 62, acknowledged off the bat that he was dealing with a polarizing issue. The cryptographic and civil liberties community argued that solving the problem was virtually impossible, which “kind of bothers me,” he said. “In engineering if you think hard enough, you can come up with a solution.” He believed he had one.

He started his presentation, outlining a scheme that would give law enforcement access to encrypted data without significantly increasing security risks for the billions of people who use encrypted devices. He’d named his idea Clear.

HOW CLEAR WORKS

Step 1
Obtain warrant for locked, encrypted phone that is evidence in a criminal investigation.

Step 2
Access special screen that generates a QR code containing an encrypted PIN.

Step 3
Send picture of QR code to the phone’s manufacturer, which confirms the warrant is legal.

Step 4
Manufacturer transmits decrypted PIN to investigators, who use it to unlock the phone.

It works this way: The vendor — say it’s Apple in this case, but it could be Google or any other tech company — starts by generating a pair of complementary keys. One, called the vendor’s “public key,” is stored in every iPhone and iPad. The other vendor key is its “private key.” That one is stored with Apple, protected with the same maniacal care that Apple uses to protect the secret keys that certify its operating system updates. These safety measures typically involve a tamper-proof machine (known as an HSM or hardware security module) that lives in a vault in a specially protected building under biometric lock and smartcard key.

That public and private key pair can be used to encrypt and decrypt a secret PIN that each user’s device automatically generates upon activation. Think of it as an extra password to unlock the device. This secret PIN is stored on the device, and it’s protected by encrypting it with the vendor’s public key. Once this is done, no one can decode it and use the PIN to unlock the phone except the vendor, using that highly protected private key.
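The key arrangement described above can be sketched in a few lines of Python. This is a toy illustration only, assuming textbook RSA with tiny primes and an invented PIN range; a real vendor key would be a 2048-bit-plus key held inside an HSM, used with proper padding. None of these function names come from Ozzie's actual design.

```python
# Toy sketch of the Clear key arrangement. NOT real cryptography:
# textbook RSA with tiny primes and no padding, for illustration only.

import secrets

def make_vendor_keypair():
    """Generate a toy RSA keypair standing in for the vendor's keys."""
    p, q = 61, 53                       # tiny primes; real keys use 2048+ bits
    n = p * q                           # modulus shared by both keys (3233)
    e = 17                              # public exponent
    d = pow(e, -1, (p - 1) * (q - 1))   # private exponent (modular inverse)
    return (e, n), (d, n)               # (public key, private key)

def encrypt_pin(pin, public_key):
    """Device side: seal the per-device PIN under the vendor public key."""
    e, n = public_key
    return pow(pin, e, n)

def decrypt_pin(sealed, private_key):
    """Vendor side: recover the PIN, an operation done only in the vault."""
    d, n = private_key
    return pow(sealed, d, n)

public, private = make_vendor_keypair()
pin = secrets.randbelow(3000)        # device generates a secret PIN at activation
                                     # (kept below the toy modulus 3233)
sealed = encrypt_pin(pin, public)    # stored on the device, unreadable without
assert decrypt_pin(sealed, private) == pin   # ...the vendor's private key
```

The point the code makes is the asymmetry: the public key baked into every phone can only seal the PIN, never open it, so possession of the device alone reveals nothing.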

So, say the FBI needs the contents of an iPhone. First the Feds have to actually get the device and the proper court authorization to access the information it contains — Ozzie’s system does not allow the authorities to remotely snatch information. With the phone in its possession, they could then access, through the lock screen, the encrypted PIN and send it to Apple. Armed with that information, Apple would send highly trusted employees into the vault, where they could use the private key to unlock the PIN. Apple could then send that no-longer-secret PIN back to the government, which can use it to unlock the device.
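The round trip in that paragraph (steps 1–4 above) can be mocked up as a short sketch. Everything here is a stand-in: the “sealing” is a toy XOR with a vendor secret rather than real public-key encryption, and the function and field names are invented for illustration, not taken from Ozzie's proposal.

```python
# Toy end-to-end walk-through of the Clear flow. Illustrative only:
# XOR with a shared secret stands in for the vendor's key pair.

VENDOR_SECRET = 0x5A5A   # stands in for the private key locked in the HSM vault

def seal(pin):           # what the device does at activation ("public key")
    return pin ^ VENDOR_SECRET

def hsm_unseal(sealed):  # what happens inside the vault ("private key")
    return sealed ^ VENDOR_SECRET

def clear_flow(device, warrant_confirmed):
    """Run steps 1-4: export the sealed PIN, vendor vets the warrant and
    unseals, investigators unlock; the one-shot chip then freezes the phone."""
    if device.get("clear_burned"):
        raise RuntimeError("Clear already used: phone is frozen")
    sealed = device["sealed_pin"]                # step 2: lock-screen QR code
    if not warrant_confirmed:                    # step 3: vendor vets the warrant
        raise PermissionError("vendor refuses: no confirmed warrant")
    pin = hsm_unseal(sealed)                     # step 4a: decrypt in the vault
    device["unlocked"] = (pin == device["pin"])  # step 4b: unlock the phone
    device["clear_burned"] = True                # one shot only; no surveillance
    return device["unlocked"]

phone = {"pin": 4821, "sealed_pin": seal(4821)}
assert clear_flow(phone, warrant_confirmed=True)
```

The two refusal paths mirror Ozzie's safeguards: the vendor gates on the warrant, and a second invocation on the same device fails because the phone has already burned its Clear chip.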

Ozzie designed other features meant to reassure skeptics. Clear works on only one device at a time: Obtaining one phone’s PIN would not give the authorities the means to crack anyone else’s phone. Also, when a phone is unlocked with Clear, a special chip inside the phone blows itself up, freezing the contents of the phone thereafter. This prevents any tampering with the contents of the phone. Clear can’t be used for ongoing surveillance, Ozzie told the Columbia group, because once it is employed, the phone would no longer be usable.

He waited for the questions, and for the next two hours, there were plenty of them. The word risk came up. The most dramatic comment came from computer science professor and cryptographer Eran Tromer. With the flair of Hercule Poirot revealing the murderer, he announced that he’d discovered a weakness. He spun a wild scenario involving a stolen phone, a second hacked phone, and a bank robbery. Ozzie conceded that Tromer found a flaw, but not one that couldn’t be fixed.

* * *

Source: David Sims and WIRED


1 Comment

  1. Axis Sally
    28 April, 2018 at 9:00 pm — Reply

    In the Farook case, instead of trying to hack the iPhone directly, the FBI could have merely purchased the desired data from Facebook.
