(In)Security and Scruff

From The Verge:

Ashley: And then, you mentioned it in transit, do you store these on Scruff’s personal servers? When it’s on the server, is it encrypted? What kind of protections do you have on the server?

We take a number of steps to secure our network. Encryption is a multifaceted and multilayered question and process. Yeah, I can say that the technical architecture of Scruff is one that we have had very smart people look into. We’ve worked with security researchers and security experts to ensure that the data that’s on Scruff stays safe and that our members can use Scruff with confidence and know that their information isn’t going to be disclosed to unauthorized parties.

This is exactly the kind of answer that should set off alarm bells: the developer of Scruff doesn’t actually answer the direct and specific question about the company’s encryption practices in an equivalently direct and specific way. Maybe Scruff really does have strong security protocols in place, but you certainly wouldn’t know it from the answer provided.

It’d be a great idea if someone were to develop the equivalent of the EFF’s or IX Maps’ scorecards, which evaluate the policies of digital and Internet companies, and apply them to online dating services. I wonder how well these services would actually fare when evaluated on their privacy, security, and anti-harassment policies…


WhatsApp’s new vulnerability is a concession, not a backdoor

The underlying weakness has to do with alerts rather than cryptography. Although they share the same underlying encryption, the Signal app isn’t vulnerable to the same attack. If the Signal client detects a new key, it will block the message rather than risk sending it insecurely. WhatsApp will send that message anyway. Since the key alert isn’t on by default, most users would have no idea.

It’s a controversial choice, but WhatsApp has good reasons for wanting a looser policy. Hard security is hard, as anyone who’s forgotten their PGP password can attest. Key irregularities happen, and each app has different policies on how to respond. Reached by The Guardian, WhatsApp pointed to users who change devices or SIM cards, the most common source of key irregularities. If WhatsApp followed the same rules as Signal, any message sent with an unverified key would simply be dropped. Signal users are happy to accept that as the price of stronger security, but with over a billion users across the world, WhatsApp is playing to a much larger crowd. Most of those users aren’t aware of WhatsApp’s encryption at all. Smoothing over those irregularities made the app itself simpler and more reliable, at the cost of one specific security measure. It’s easy to criticize that decision, and many have — but you don’t need to invoke a government conspiracy to explain it.
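The behavioural difference described above can be reduced to a single policy decision. Here is a toy sketch of that decision point (illustrative only; the function names and structure are my own, not real Signal or WhatsApp code): when a recipient’s identity key changes while a message is queued, a Signal-style client blocks the message pending verification, while a WhatsApp-style client re-encrypts to the new key and sends anyway.

```python
# Toy model of the two client policies for handling a changed identity
# key. Hypothetical names; not taken from either codebase.

def deliver(message, known_key, current_key, policy):
    """Decide what a client does when the recipient's key may have changed.

    policy="block"  -> Signal-style: refuse to send until the new key
                       is verified by the user.
    policy="resend" -> WhatsApp-style: re-encrypt under the new key and
                       send anyway (the key-change alert is optional).
    """
    if current_key == known_key:
        return ("sent", message)
    if policy == "block":
        return ("blocked", "key changed; verify before sending")
    if policy == "resend":
        return ("sent", message)  # delivered under the new, unverified key
    raise ValueError("unknown policy")

# The same key change produces opposite outcomes under the two policies:
print(deliver("hi", "key-A", "key-B", "block"))   # ('blocked', ...)
print(deliver("hi", "key-A", "key-B", "resend"))  # ('sent', 'hi')
```

The design trade-off is visible in the last two lines: the "resend" branch is what makes delivery seamless across device and SIM changes, and also what makes a silent server-side key substitution possible.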

A multitude of secure messaging applications are vulnerable to keys being changed at the server level without the end-user being notified. This theoretically opens a way for state security agencies to ‘break into’ secured communications channels but, to date, we don’t have any evidence of a company in the Western or Western-affiliated world engaging in such behaviours.

There are laws that require some types of communications to be interceptable. Mobile communications carried by telecommunications carriers in Canada must be interceptable, and VoIP along with most other kinds of voice communications transmitted by equivalent carriers is subject to interception in the United States. There are not, however, similar demands currently placed on companies that provide chat or other next-generation communications systems.

While there are not currently laws mandating either interception or decryption of chat or next-generation communications, it remains plausible that laws will be introduced to compel this kind of functionality. That possibility is what makes the management of encryption keys so important: once politicians sense that demanding decrypted communications is even technically possible, the likelihood of such interception laws rises dramatically. Such laws would formalize and calcify vulnerabilities into the communications systems that we use every day, ensuring not just that domestic authorities could always potentially be listening, but that foreign and unauthorized parties could be as well.


Demand for secret messaging apps is rising as Trump takes office

From The Verge:

Marlinspike’s goal isn’t unicorn riches, but unicorn ubiquity. For that, he wants to make encrypted messaging as easy — as beautiful, as fun, as expressive, as emoji-laden — as your default messaging app. His reason: if encryption is difficult, it self-selects for people willing to jump through those hoops. And bad guys are always willing to jump through the hoops. “ISIS or high-risk criminal activity will be willing to click two extra times,” he told me. “You and I are not.”

Marlinspike’s protocol for secure communication is incredibly effective at protecting message content from third-party observation. Few other protocols are nearly as effective, however, and most chat companies now claim that they offer ‘secure’ communications. Almost no consumers are situated to evaluate those claims: there are known-deficient applications that are widely used, despite the security community having identified and discussed their problems. Encryption isn’t actually going to provide the security that most users think it does unless best-of-class protocols are widely adopted.1

The problem of imperfect consumer knowledge is a hard one to solve, in part because the security community cannot evaluate every claim of encryption. In work that I’ve been involved in we’ve seen simplistic ciphers, hard-coded passwords, and similar deficiencies. In some cases companies have asserted that they secure data but then failed to encrypt it in transit between their smartphone apps and their servers. It’s laborious work to find these deficiencies and it’s cheap for companies to claim that they offer a ‘secure’ product. The result is that consumers (who aren’t experts in cryptography, nor should they be expected to be) are left scratching their heads and, sometimes, simply throwing their hands up in frustration at the limited information available.
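To make the ‘simplistic cipher’ failure mode concrete, here is a hedged sketch (the key and data are invented, and no real product is being quoted) of a repeating-key XOR scheme of the sort that occasionally ships labelled as ‘encryption’, and of why any known or guessable plaintext breaks it immediately:

```python
# A repeating-key XOR "cipher": a common simplistic scheme that looks
# like encryption but offers essentially none against a known-plaintext
# attack. Key and plaintext below are hypothetical.

def xor_cipher(data: bytes, key: bytes) -> bytes:
    # XOR is its own inverse, so this both "encrypts" and "decrypts".
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

secret_key = b"hunter2"                       # hypothetical hard-coded key
plaintext = b"user=alice&token=12345"
ciphertext = xor_cipher(plaintext, secret_key)

# An attacker who knows (or guesses) any key-length stretch of plaintext
# recovers the key by XORing it against the ciphertext:
recovered_key = bytes(c ^ p for c, p in zip(ciphertext, b"user=al"))
assert recovered_key == secret_key
# ...and can then decrypt everything else the app 'protects':
assert xor_cipher(ciphertext, recovered_key) == plaintext
```

This is exactly the sort of deficiency that is invisible to a consumer reading a marketing page, and laborious for a researcher to find: the app’s traffic looks scrambled until someone actually inspects how it was scrambled.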

  1. Admittedly, Marlinspike’s goal is to spread his protocol widely, and the result has been that the largest chat service in the world, WhatsApp, now provides a robust level of communications security. To activate the protocol in other chat services, such as Google’s Allo or Facebook’s Messenger, you must first set up a private conversation. 



150 Filmmakers Want Nikon and Canon to Sell Encrypted Cameras. Here’s Why

From Wired:

Implementing that feature wouldn’t be simple—particularly in high-definition cameras that have to write large files to an SD card at a high frequency, says Jonathan Zdziarski, an encryption and forensics expert who also works as a semi-professional photographer. Integrating encryption without slowing down a camera would likely require not just new software, but new microprocessors dedicated to encrypting files with maximum efficiency, as well as security engineering talent that camera companies likely don’t yet have. He describes the process as “feasible,” but potentially expensive. “I don’t expect Nikon or Canon to know how to do this the way computer companies do. It’s a significant undertaking,” says Zdziarski. “Their first question is going to be, ‘how do we pay for that?’”

Adding encryption is a non-trivial undertaking, and one that is often done badly. Strong encryption – such that no party can access the content absent a passphrase – also has a drawback: if you forget that phrase then you’re permanently locked out of the data. As someone who has suffered data loss for exactly that reason, I’m incredibly sympathetic to the argument that the level of security proposed – opt-in strong encryption – is not necessarily something that most users want, nor something that most companies want to field support calls over.
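A minimal sketch of why that lockout is absolute: under passphrase-derived encryption, the key exists nowhere except as a derivation of the passphrase itself. The example below uses Python’s standard-library PBKDF2 with a toy XOR keystream purely for illustration (a real system would use an authenticated cipher such as AES-GCM; all names and values here are invented):

```python
# Toy illustration of passphrase-derived encryption. The keystream is
# derived from the passphrase via PBKDF2, so without the exact
# passphrase there is nothing to recover the data with.
import hashlib

def derive_key(passphrase: str, salt: bytes, length: int) -> bytes:
    # PBKDF2 stretches the passphrase into `length` bytes of key material.
    return hashlib.pbkdf2_hmac("sha256", passphrase.encode(), salt,
                               200_000, dklen=length)

def toy_encrypt(data: bytes, passphrase: str, salt: bytes) -> bytes:
    # XOR keystream: toy only, NOT a real cipher. Same call decrypts.
    key = derive_key(passphrase, salt, len(data))
    return bytes(d ^ k for d, k in zip(data, key))

salt = b"fixed-demo-salt"
ct = toy_encrypt(b"family photos", "correct horse", salt)

# Only the exact passphrase yields the plaintext back; even a near miss
# produces garbage, and there is no recovery path:
assert toy_encrypt(ct, "correct horse", salt) == b"family photos"
assert toy_encrypt(ct, "correct hoarse", salt) != b"family photos"
```

There is no “forgot my passphrase” branch to add here: any such branch would itself be a second way in, which is precisely what strong encryption is defined by not having.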


Privacy and Policing in a Digital World

As the federal government holds public consultations on what changes should be made to Bill C-51, the controversial anti-terrorism legislation passed by the Conservative government, various police agencies such as the RCMP and the Canadian Association of Chiefs of Police have petitioned to gain new powers to access telephone and internet data. Meanwhile nearly half of Canadians believe they should have the right to complete digital privacy. The Agenda examines the question of how to balance privacy rights with effective policing in the digital realm.

I was part of a panel that discussed some of the powers that the Government of Canada is opening for discussion as part of its National Security consultation, which ends on December 15, 2016. If you want to provide comments to the government, see:


I’m giving up on PGP

This is one of the clearest (and bluntest) critiques of PGP/GPG I’ve read in a long time. It very clearly establishes PGP’s inability to protect people facing diverse threat models, the failure of the Web of Trust to secure identities and communities of trust, and the challenges of key security and rotation. I’d consider it assigned reading in a university class if students were ever forced to learn about PGP itself.


Feds Walk Into A Building. Demand Everyone’s Fingerprints To Open Phones


Legal experts were shocked at the government’s request. “They want the ability to get a warrant on the assumption that they will learn more after they have a warrant,” said Marina Medvin of Medvin Law. “Essentially, they are seeking to have the ability to convince people to comply by providing their fingerprints to law enforcement under the color of law – because of the fact that they already have a warrant. They want to leverage this warrant to induce compliance by people they decide are suspects later on. This would be an unbelievably audacious abuse of power if it were permitted.”

Jennifer Lynch, senior staff attorney at the Electronic Frontier Foundation (EFF), added: “It’s not enough for a government to just say we have a warrant to search this house and therefore this person should unlock their phone. The government needs to say specifically what information they expect to find on the phone, how that relates to criminal activity and I would argue they need to set up a way to access only the information that is relevant to the investigation.

It’s insane that the US government is getting chained warrants that authorize expansive searches without clarifying what is being sought or the specific rationales for such searches. Such actions represent an absolute violation of due process.

But, at the same time, the government’s actions (again) indicate the relative weakness of the ‘going dark’ arguments. While iPhones and other devices are secured to prevent all actors from illegitimately accessing them, fingerprint-enabled devices can let government agencies bypass security protections with relative ease. This doesn’t mean that fingerprint scanners are bad – most people’s threat model isn’t the police, but criminals, snoopy friends, and family members – but rather that authorities can routinely bypass, rather than need to break, cryptographically-secured communications.