This is one of the clearest (and bluntest) critiques of PGP/GPG I’ve read in a long time. It very, very clearly establishes PGP’s inability to successfully protect people facing diverse threat models, the failure of the Web of Trust to secure identities and communities of trust, and the challenges of key security and rotation. I’d consider it assigned reading in a university class if the students were ever forced to learn about PGP itself.
In response to Edward Snowden’s mass surveillance revelations, Google is working to make complex encryption tools, such as PGP, easier to use in Gmail.
PGP, or Pretty Good Privacy, is an encryption utility that historically has been difficult to break. But Google has “research underway to improve the usability of PGP with Gmail,” according to a person at the company familiar with the matter.
If Google is actually going to throw engineers and designers (most important: lots, and lots, and lots of UI and UX designers!) toward improving the basic usability of PGP, that would be incredible. However, given public suspicion of the company following the NSA disclosures, I have to wonder whether any public offering from Google will be regarded as some kind of Trojan horse by some civil liberties groups and the cynical public alike.
Patrick Ball has a good and highly accessible article over on Wired about why certain means of securing communications are problematic. It’s highly recommended. Rather than summarize what is said and why it’s important, let me leave you with a key quotation from the article that (to my mind) nicely speaks to the author’s general mindset: “Good security is about not trusting people. It’s about studying math and software and assuring that the program cannot be turned to bad intent.”
Declan McCullagh has an article on an important case in the US, where a federal judge has demanded a defendant decrypt a PGP-encrypted drive for the authorities. Case law in the area of decryption is unsettled, as McCullagh notes:
The question of whether a criminal defendant can be legally compelled to cough up his encryption passphrase remains an unsettled one, with law review articles for at least the last 15 years arguing the merits of either approach. (A U.S. Justice Department attorney wrote an article in 1996, for instance, titled “Compelled Production of Plaintext and Keys.”)
Much of the discussion has been about what analogy comes closest. Prosecutors tend to view PGP passphrases as akin to someone possessing a key to a safe filled with incriminating documents. That person can, in general, be legally compelled to hand over the key. Other examples include the U.S. Supreme Court saying that defendants can be forced to provide fingerprints, blood samples, or voice recordings.
On the other hand are civil libertarians citing other Supreme Court cases that conclude Americans can’t be forced to give “compelled testimonial communications,” extending the legal shield of the Fifth Amendment to encryption passphrases. Courts already have ruled that such protection extends to the contents of a defendant’s mind, the argument goes, so why shouldn’t a passphrase be shielded as well?
Eventually the case law around encryption has to be addressed by SCOTUS. There are too many differing positions at the moment; clarity is needed both for users of encryption in the US, and for counsel seeking to prosecute and defend clients.
Robert Sosinski has a good walkthrough of setting up GPG in OS X. Hopefully we’ll see some non-console-based instructions sometime in the near future to help those who are gun-shy when presented with a command prompt!
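Until those friendlier tools arrive, the core console workflow is only a few commands. Here is a minimal sketch, assuming a modern GnuPG (2.1 or later); the key name, email address, and filename are placeholders, and the non-interactive flags (`--batch`, `--pinentry-mode loopback`) simply stand in for the interactive prompts a first-time user would normally see:

```shell
# Use a throwaway keyring so this example doesn't touch your real keys
export GNUPGHOME="$(mktemp -d)"
chmod 700 "$GNUPGHOME"

# Generate a keypair non-interactively (the name/address are placeholders;
# an empty passphrase is for demonstration only)
gpg --batch --pinentry-mode loopback --passphrase '' \
    --quick-generate-key 'Alice <alice@example.com>' default default never

# Encrypt a file to that key; --trust-model always sidesteps the
# Web of Trust prompt for this self-generated key
echo 'hello, world' > message.txt
gpg --batch --trust-model always --encrypt --recipient alice@example.com message.txt

# Decrypt it back (message.txt.gpg was written by the step above)
gpg --batch --pinentry-mode loopback --decrypt message.txt.gpg
```

Interactively, `gpg --gen-key` walks you through the same key-generation step with prompts, which is exactly the part of the experience that trips up newcomers.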