The Roundup for May 12-18, 2018 Edition

Soar by Christopher Parsons

It’s become incredibly popular to attribute the activities undertaken by the Facebooks and Googles of the world to ‘surveillance capitalism’. This concept generally asserts that the current dominant mode of economics has become reliant on surveillance to drive economic growth. Surveillance, specifically, is defined as the act of watching or monitoring activity with the intent of using captured information to influence behaviour. On the Internet, this information tends to be used to influence purchasing behaviours.

The issue I have with the term surveillance capitalism is that I’m uncertain whether it comprehensively captures the activities associated with the data-driven economy. Surveillance Studies scholars tend to apply the same theories used to understand CCTV to practices such as machine learning; in both cases, the technologies are understood as establishing feedback loops that influence an individual or an entire population. But, just as often, neither CCTV nor machine learning actually has a person- or community-related feedback loop. CCTV cameras are often not attended to, not functional, or don’t provide sufficient information to take action against those being recorded. Nor do individuals necessarily modify their own behaviours in the presence of such cameras. Similarly, machine learning algorithms may not be used to influence all persons: in some cases, people may be sufficiently outside the scope of whatever the algorithm is intended to do that they are not affected. Also, as with CCTV, individuals may not modify their own behaviours when machine learning algorithms are working on the data they generate, simply because they are unaware that any such processing is taking place.

So, where surveillance capitalism depends on a feedback loop that is directly applied to individuals within a particular economic framework, there may be instances where data is collected and monetized without any clear or necessary effort to influence individuals. Such situations could include those where a machine learning algorithm is designed to improve a facial recognition system, to extend battery life based on a user’s activities, or to otherwise very quietly make tools more effective without any clear attempt to modify user behaviour. I think that such activities may be very clearly linked to monetization and, more broadly, to an ideology backed by capitalism. But I’m not sure it’s surveillance as it’s rigorously defined by scholars.

So one of the things that I keep thinking about is whether we should shift away from the increasingly broad use of ‘surveillance capitalism’ and, instead, talk about ‘data capitalism’. I’m not suggesting doing away with the term surveillance capitalism but, rather, that surveillance capitalism is a sub-genus of data capitalism. Data capitalism would, I believe, better capture the ways in which information is collected, analyzed, and used to effect socio-technical changes. Further, such a term might also capture cases where those changes are arguably linked to capitalist aims (i.e. enhancing profitability) but are less obviously linked to the individual-directed feedback loops associated with surveillance itself.


After approximately twenty months of work, my colleagues and I have published an extensive report on encryption policies in Canada. It’s a major accomplishment for all of us to have finally concluded the work, and we’re excited by the positive feedback we’ve received about it.


Inspiring Quotation of the Week

“Ambition is a noble passion which may legitimately take many forms… but the noblest ambition is that of leaving behind something of permanent value.”

– G.H. Hardy

Great Photography Shots

Some of these storm chaser photos are practically otherworldly.

Music I’m Digging

Neat Podcast Episodes

Good Reads for the Week

Cool Things

Another Bad Proposal to Globally Weaken Security

Photo by Federica Galli on Unsplash

Steven Levy has an article out in Wired this week in which he, through the persons he interviewed, proclaims that the ‘going dark’ problem has been solved to the satisfaction of (American) government agencies and (unnamed and not quoted) ‘privacy purists’.1 Per the advocates of the so-called solution, should the proposed technical standard be advanced and developed, (American) government agencies could access encrypted materials while (American) users would enjoy the same degree of strong encryption as they do today. This would ‘solve’ the problem of (American) agencies’ investigations being stymied by suspects’ adoption of encrypted communications systems and personal devices.

Unfortunately Levy got played: the proposal he dedicates his article to is just another attempt to advance a ‘solution’ that doesn’t address the real technical or policy problems associated with developing a global backdoor system for our most personal electronic devices. Specifically, the architect of the solution overestimates the existing security characteristics of contemporary devices,2 overestimates the ability of companies to successfully manage a sophisticated and globe-spanning key management system,3 fails to address the international policy question of why other governments couldn’t or wouldn’t demand similar kinds of access (think Russia, China, Iran, etc.),4 fails to contemplate an adequate key revocation system, and fails to adequately explain why the exceptional access system he envisions is genuinely needed. With regards to that last point, government agencies have access to more data than ever before in history and, yet, because they don’t have access to all of the data in existence the agencies claim they are somehow being ‘blinded’.

As I’ve written in a draft book chapter, for inclusion in a book to be published later this year or early next, the idea that government agencies are somehow worse off than in the past is pure nonsense. Consider that,

[a]s we have embraced the digital era in our personal and professional lives, [Law Enforcement and Security Agencies] LESAs have also developed new techniques and gained additional powers in order to keep pace as our memories have shifted from personal journals and filing cabinets to blogs, social media, and cloud hosting providers. LESAs now subscribe to services designed to monitor social media for intelligence purposes, collect bulk data from telecommunications providers in so-called ‘tower dumps’ of all the information stored by cellular towers, establish their own fake cellular towers to collect data from all parties proximate to such devices, use malware to intrude into either personal endpoint devices (e.g. mobile phones or laptops) or networking equipment (e.g. routers), and can even retroactively re-create our daily online activities with assistance from Canada’s signals intelligence agency. In the past, each of these kinds of activities would have required dozens, hundreds, or thousands of government officials to painstakingly follow persons — many of whom might not be specifically suspected of engaging in a criminal activity or activity detrimental to the national security of Canada — gain lawful entry to their personal safes, install cameras in their homes and offices, access and copy the contents of filing cabinets, and listen in on conversations that would otherwise have been private. So much of our lives has become digital that entirely new investigative opportunities have arisen which were previously restricted to the imaginations of science fiction authors, both because it is easier to access information and because we generate and leave behind more information about our activities, in the form of our digital exhaust, than was even possible in a world dominated by analog technologies.

In effect: the ‘solution’ covered by Levy doesn’t clearly articulate what problem must be solved, and it would end up generating more problems than it solves by significantly diminishing the security properties of devices while, simultaneously, raising international policy questions about which countries’ authorities could lawfully obtain decryption keys, and under what conditions. Furthermore, companies and their decryption keys would suddenly become even more heavily targeted by advanced adversaries than they are today. Instead of even attempting to realistically account for the realities of developing and implementing secure systems, the proposed ‘solution’ depends on a magical pixie dust assumption that you can undermine the security of globally distributed products and have no bad things happen.5

The article as written by Levy (and the proposed solution at its root) is exactly the kind of writing and proposal that gives law enforcement agencies ammunition to drive a narrative that backdooring all secure systems is possible and that the academic, policy, and technical communities are merely ideologically opposed to doing so. As has become somewhat common to say: while we can land a person on the moon, that doesn’t mean we can also land a person on the sun; likewise, while we can build (somewhat) secure systems, we cannot build (somewhat) secure systems that include deliberately inserted backdoors. Ultimately, it’s not the case that ‘privacy purists’ oppose such security-undermining solutions on ideological grounds: they’re opposed based on decades of experience, training, and expertise that let them recognize such solutions for the charades they are.

Footnotes

  1. I am unaware of a single person in the American or international privacy advocacy space who was interviewed for the article, let alone one who espouses positions that would be pacified by the proposed solution.
  2. Consider that there is currently a way of bypassing the existing tamper-resistant chip in Apple’s iPhone, which is specifically designed to ‘short out’ the iPhone if someone attempts to enter an incorrect password too many times. A similar mechanism would ‘protect’ the master key that would be accessible to law enforcement and security agencies.
  3. Consider that Microsoft has, in the past, lost its master key that is used to validate copies of Windows as legitimate Microsoft-assured products and, also, that Apple managed to lose key parts of its iOS codebase and reportedly its signing key.
  4. Consider that foreign governments look at the laws promulgated by Western nations as justification for their own abusive and human rights-violating legislation and activities.
  5. Some of the more unhelpful security researchers simply argue that if Apple et al. don’t want to help foreign governments open up locked devices they should just suspend all service into those jurisdictions. I’m not of the opinion that protectionism and nationalism are ways of advancing international human rights or of raising the quality of life of all persons around the world; it’s not morally right to simply cast the citizens of Russia, Ethiopia, China, India, Pakistan, or Mexico (and others!) to the wolves of their own oftentimes overzealous or rights-abusing government agencies.
Link

(In)Security and Scruff

From The Verge:

Ashley: And then, you mentioned it in transit, do you store these on Scruff’s personal servers? When it’s on the server, is it encrypted? What kind of protections do you have on the server?

We take a number of steps to secure our network. Encryption is a multifaceted and multilayered question and process. Yeah, I can say that the technical architecture of Scruff is one that we have had very smart people look into. We’ve worked with security researchers and security experts to ensure that the data that’s on Scruff stays safe and that our members can use Scruff with confidence and know that their information isn’t going to be disclosed to unauthorized parties.

This is exactly the kind of answer that should set off alarm bells: the developer of Scruff doesn’t actually answer the specific and direct question about the company’s encryption policies in an equivalently direct and specific way. Maybe Scruff really does have strong security protocols in place, but you certainly wouldn’t know that was the case based on the answer provided.

It’d be a great idea if someone were to develop the equivalent of the EFF’s or IX Maps’ scorecards, which evaluate the policies of digital and Internet companies, and apply it to online dating services. I wonder how well these services would actually fare when evaluated on their privacy, security, and anti-harassment policies…

Link

WhatsApp’s new vulnerability is a concession, not a backdoor

The underlying weakness has to do with alerts rather than cryptography. Although they share the same underlying encryption, the Signal app isn’t vulnerable to the same attack. If the Signal client detects a new key, it will block the message rather than risk sending it insecurely. WhatsApp will send that message anyway. Since the key alert isn’t on by default, most users would have no idea.

It’s a controversial choice, but WhatsApp has good reasons for wanting a looser policy. Hard security is hard, as anyone who’s forgotten their PGP password can attest. Key irregularities happen, and each app has different policies on how to respond. Reached by The Guardian, WhatsApp pointed to users who change devices or SIM cards, the most common source of key irregularities. If WhatsApp followed the same rules as Signal, any message sent with an unverified key would simply be dropped. Signal users are happy to accept that as the price of stronger security, but with over a billion users across the world, WhatsApp is playing to a much larger crowd. Most of those users aren’t aware of WhatsApp’s encryption at all. Smoothing over those irregularities made the app itself simpler and more reliable, at the cost of one specific security measure. It’s easy to criticize that decision, and many have — but you don’t need to invoke a government conspiracy to explain it.

A multitude of secure messaging applications are vulnerable to keys being changed at the server level without the end-user being notified. This theoretically opens a way for state security agencies to ‘break into’ secured communications channels but, to date, we don’t have any evidence of a company in the Western or Western-affiliated world engaging in such behaviours.
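
To make the policy difference concrete, below is a minimal, hypothetical sketch in Python of the two client-side behaviours described above. The names (KeyStore, send_message) and structure are my own illustrative assumptions, not Signal’s or WhatsApp’s actual code.

```python
# Hypothetical sketch of the client-side key-change policies discussed
# above; illustrative only, not any real client's implementation.
from dataclasses import dataclass, field


@dataclass
class KeyStore:
    """Remembers the last identity key seen for each contact."""
    known_keys: dict = field(default_factory=dict)

    def key_changed(self, contact: str, current_key: bytes) -> bool:
        previous = self.known_keys.get(contact)
        self.known_keys[contact] = current_key
        return previous is not None and previous != current_key


def send_message(store: KeyStore, contact: str, current_key: bytes,
                 plaintext: str, block_on_key_change: bool,
                 alerts_enabled: bool) -> bool:
    """Apply one of the two key-change policies before sending."""
    if store.key_changed(contact, current_key):
        if block_on_key_change:
            # Signal-style: refuse to send until the new key is verified.
            print(f"Blocked: {contact}'s key changed; verify it first.")
            return False
        if alerts_enabled:
            # WhatsApp-style, with the (off-by-default) alert enabled.
            print(f"Warning: {contact}'s key changed.")
    # Encryption with `current_key` would happen here.
    print(f"Sent to {contact}: {plaintext}")
    return True
```

The trade-off lives in the two flags: blocking on a changed key protects against a server silently swapping keys underneath a conversation, while sending anyway (with alerts off by default) favours reliability for the many users who change devices or SIM cards.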

There are laws that require some types of communications to be interceptable. Mobile communications carried by telecommunications carriers in Canada must be interceptable, and VoIP along with most other kinds of voice communications transmitted by equivalent carriers is subject to interception in the United States. There are not, however, similar demands currently placed on companies that provide chat or other next-generation communications systems.

While there are not currently laws mandating either interception or decryption of chat or next-generation communications, it remains plausible that laws will be introduced to compel this kind of functionality. That possibility is what makes how encryption keys are managed so important: as politicians sense that there is even the possibility of demanding decrypted communications, the potential for such interception laws increases dramatically. Such laws would formalize and calcify vulnerabilities into the communications systems we use every day, to the effect of ensuring not just that domestic authorities could always potentially be listening, but that foreign and unauthorized parties could be as well.

Link

Demand for secret messaging apps is rising as Trump takes office

From The Verge:

Marlinspike’s goal isn’t unicorn riches, but unicorn ubiquity. For that, he wants to make encrypted messaging as easy — as beautiful, as fun, as expressive, as emoji-laden — as your default messaging app. His reason: if encryption is difficult, it self-selects for people willing to jump through those hoops. And bad guys are always willing to jump through the hoops. “ISIS or high-risk criminal activity will be willing to click two extra times,” he told me. “You and I are not.”

Marlinspike’s protocol for secure communication is incredibly effective at protecting message content from third-party observation. Few protocols are nearly as effective, however, and most chat companies now claim that they offer ‘secure’ communications. Almost no consumers are situated to evaluate those claims: there are known-deficient applications that are widely used, despite the security community having identified and discussed their problems. Encryption isn’t actually going to provide the security that most users think it does unless best-in-class protocols are widely adopted.1

The problem of imperfect consumer knowledge is a hard one to solve, in part because the security community cannot evaluate all claims of encryption. In work that I’ve been involved in we’ve seen simplistic ciphers, hard-coded passwords, and similar deficiencies; a hypothetical example of the sort of thing this work uncovers follows below. In some cases companies have asserted that they secure data but then failed to encrypt data in transit between their smartphone apps and their servers. It’s laborious work to find these deficiencies and it’s cheap for companies to claim that they offer a ‘secure’ product. Ultimately, consumers (who aren’t experts in cryptography, nor should they be expected to be) are left scratching their heads and, sometimes, just throwing their hands up in frustration at the limited information available.
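
As a hypothetical illustration of the kind of deficiency that audit work like this uncovers, consider a vendor that ships the same fixed key in every copy of its app and ‘encrypts’ with a repeating-key XOR. This is deliberately bad code of my own construction, not any real product’s implementation:

```python
# Deliberately deficient 'encryption' of the kind audits uncover:
# a hard-coded key plus a trivially reversible cipher.
HARDCODED_KEY = b"s3cret!!"  # identical in every shipped copy of the app


def weak_encrypt(plaintext: bytes) -> bytes:
    """Repeating-key XOR: reversible by anyone who extracts the key."""
    return bytes(b ^ HARDCODED_KEY[i % len(HARDCODED_KEY)]
                 for i, b in enumerate(plaintext))


# XOR is its own inverse, so anyone with the app binary (and thus the
# key) can recover every 'protected' message:
ciphertext = weak_encrypt(b"my private message")
assert weak_encrypt(ciphertext) == b"my private message"
```

A company can truthfully say such a product ‘encrypts’ data, and a consumer has no practical way to tell it apart from a properly implemented protocol.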


  1. Admittedly, Marlinspike’s goal is to spread his protocol widely, and the result has been that the largest chat service in the world, WhatsApp, now provides a robust level of communications security. To activate the protocol in other chat services, such as Google’s Allo or Facebook’s Messenger, you need to first set up a private conversation.


Link

150 Filmmakers Want Nikon and Canon to Sell Encrypted Cameras. Here’s Why

From Wired:

Implementing that feature wouldn’t be simple—particularly in high-definition cameras that have to write large files to an SD card at a high frequency, says Jonathan Zdziarski, an encryption and forensics expert who also works as a semi-professional photographer. Integrating encryption without slowing down a camera would likely require not just new software, but new microprocessors dedicated to encrypting files with maximum efficiency, as well as security engineering talent that camera companies likely don’t yet have. He describes the process as “feasible,” but potentially expensive. “I don’t expect Nikon or Canon to know how to do this the way computer companies do. It’s a significant undertaking,” says Zdziarski. “Their first question is going to be, ‘how do we pay for that?‘”

Adding in encryption is a non-trivial undertaking, and it’s one that is often done badly. And strong encryption – such that no party can access the content absent a passphrase – also has drawbacks, because if you forget that phrase you’re permanently locked out of the data. As someone who has suffered data loss for exactly that reason, I’m incredibly sympathetic to the view that the level of security proposed – opt-in strong security – is not necessarily something that most users want, nor something that most companies want to field support calls over.
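
To make that trade-off concrete, here is a minimal sketch of what opt-in, passphrase-based file encryption implies, assuming the third-party Python cryptography package (this is my own illustration, not anything Nikon or Canon actually ships). Because the key is derived solely from the passphrase, there is no escrowed copy to fall back on:

```python
# Minimal sketch of passphrase-derived file encryption; a forgotten
# passphrase means the data is permanently unrecoverable.
import base64
import os

from cryptography.fernet import Fernet
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.kdf.pbkdf2 import PBKDF2HMAC


def derive_key(passphrase: bytes, salt: bytes) -> bytes:
    """Stretch the passphrase into a 32-byte Fernet key."""
    kdf = PBKDF2HMAC(algorithm=hashes.SHA256(), length=32,
                     salt=salt, iterations=600_000)
    return base64.urlsafe_b64encode(kdf.derive(passphrase))


salt = os.urandom(16)  # stored alongside the file; not secret
key = derive_key(b"correct horse battery staple", salt)
token = Fernet(key).encrypt(b"RAW photo bytes")

# Decryption requires deriving the exact same key from the exact same
# passphrase; forget the passphrase and the photo is gone for good.
photo = Fernet(key).decrypt(token)
assert photo == b"RAW photo bytes"
```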

Video

Privacy and Policing in a Digital World

As the federal government holds public consultations on what changes should be made to Bill C-51, the controversial anti-terrorism legislation passed by the Conservative government, various police agencies such as the RCMP and the Canadian Association of Chiefs of Police have petitioned to gain new powers to access telephone and internet data. Meanwhile nearly half of Canadians believe they should have the right to complete digital privacy. The Agenda examines the question of how to balance privacy rights with effective policing in the digital realm.

I was part of a panel that discussed some of the powers that the Government of Canada is opening for discussion as part of its National Security consultation, which ends on December 15, 2016. If you want to provide comments to the government, see: https://www.canada.ca/en/services/defence/nationalsecurity/consultation-national-security.html