Quote

The ability to socialize with friends in private spaces without state interference is vital to citizens’ growth, the maintenance of society, and a free and healthy democracy. It ensures a zone of safety in which we can share personal information with the people that we choose, and still be free from state intrusion. Recognizing a right to be left alone in private spaces to which we have been invited is an extension of the principle that we are not subject to state interference any time we leave our own homes. The right allows citizens to move about freely without constant supervision or intrusion from the state. Fear of constant intrusion or supervision itself diminishes Canadians’ sense of freedom.

  • Factum for Tom Le, in Tom Le v The Queen, Court File No. 37971

A Civil Rights Company?

Photo by Youssef Sarhan on Unsplash

Much has been made of Tim Cook’s advocacy on issues of privacy and gay rights. The most recent iteration of Safari, unveiled at WWDC, will incorporate techniques that hinder, though won’t entirely stop, advertisers and websites from tracking users across the Internet. And Apple continues to support and promote gay rights; the most evident manifestations of this are Apple selling pride-inspired Apple Watch bands and a matching pride watch face, along with the company’s CEO being an openly gay man.

It’s great that Apple is supporting these issues. But it’s equally important to reflect on Apple’s less rights-promoting activities. The company operates around the world and chooses to pursue profits to the detriment of the privacy of its China-based users. It clearly has challenges, along with all other smartphone companies, in acquiring mineral resources that are conflict-free; the purchase of conflict minerals raises fundamental human rights issues. And the company’s ongoing efforts to minimize its taxation obligations have direct impacts on the ability of governments to provide essential services to those who are often the worst off in society.

Each of the above examples is easily, and quickly, reduced to assertions that Apple is a public company in a capitalist society. It has obligations to shareholders and, thus, can only do so much to advance basic rights while simultaneously pursuing profits. Apple is, on some accounts, actively attempting to enhance certain rights, promote certain causes, and mitigate certain harms while simultaneously acting in the interests of its shareholders.

Those are all entirely fair and reasonable arguments, and I understand them. But I think we’d all be well advised to consider Apple’s broader activities before declaring that Apple has ‘our’ backs, because the ‘we’ in question is often privileged, wealthy, and able to externalize a range of harms associated with Apple’s international activities.

Limits of Data Access Requests

Photo by rawpixel on Unsplash

A data access request involves contacting a private company and requesting a copy of your personal information, along with an account of how that data is processed and disclosed, and the periods of time for which it is retained.
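The elements of such a request map directly onto that definition, and can be reduced to a short template. What follows is a minimal, illustrative Python sketch; the wording is mine rather than drawn from any statute, the names are hypothetical, and none of it is legal advice:

```python
# Illustrative only: generate a data access request letter.
# The numbered demands mirror the elements described above.
TEMPLATE = """To the Privacy Officer of {company},

Under applicable data protection law, I request:
 1. a copy of all personal information you hold about me;
 2. an account of how that information is processed and used;
 3. a list of third parties to whom it has been disclosed; and
 4. your retention periods for each category of that information.

Name: {name}
Account identifier: {account}
"""

def draft_request(company: str, name: str, account: str) -> str:
    """Fill in the template. Statutes differ by jurisdiction, so the
    exact wording you should use will vary."""
    return TEMPLATE.format(company=company, name=name, account=account)

# Hypothetical example values:
print(draft_request("ExampleCo", "Jane Doe", "jane@example.com"))
```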

I’ve conducted research over the past decade that has repeatedly shown companies are often very poor at responding comprehensively to data access requests. Sometimes this is because of divides between the technical teams that collect and use the data, the policy teams that determine what is and isn’t appropriate to do with data, and the legal teams that ascertain whether collections and uses of data comport with the law. In other situations, companies simply refuse to respond because they adopt a confused, nationalist understanding of law: if the company doesn’t have an office in a requesting party’s country, then that jurisdiction’s laws aren’t seen as applying to the company, even when the company does business in the jurisdiction.

Automated Data Export As Solution?

Some companies, such as Facebook and Google, have developed automated data download services. Ostensibly these services are designed so that you can download the data you’ve input into the companies, thus revealing precisely what is collected about you. In reality, these services don’t let you export all of the information that the companies collect. As a result, people who use these download services tend to end up with a false impression of just what information the companies collect and how it’s used.
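You can get a feel for this gap by inventorying an export archive yourself. Here is a minimal Python sketch; it assumes the export is a folder of JSON files (the real layout varies by company and changes over time, and the folder name below is hypothetical). Whatever categories the script surfaces, note what is absent: derived inferences, advertising segments, and the kinds of emotional-state estimates discussed below.

```python
import json
from pathlib import Path

def inventory_export(export_dir: str) -> None:
    """Walk a downloaded data-export folder and list, for each JSON
    file, the top-level record categories and how many entries each
    holds. This shows what the company chose to export, not what it
    actually collects or infers."""
    for path in sorted(Path(export_dir).rglob("*.json")):
        try:
            with open(path, encoding="utf-8") as f:
                data = json.load(f)
        except (json.JSONDecodeError, OSError):
            print(f"{path}: unreadable or not valid JSON")
            continue
        if isinstance(data, dict):
            for key, value in data.items():
                count = len(value) if isinstance(value, (list, dict)) else 1
                print(f"{path.name}: {key} ({count} entries)")
        elif isinstance(data, list):
            print(f"{path.name}: {len(data)} records")

# Hypothetical folder name for a downloaded archive:
inventory_export("facebook-export/")
```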

A striking example of the kinds of information that are not revealed to users of these services has come to light. A leaked document from Facebook Australia revealed that:

Facebook’s algorithms can determine, and allow advertisers to pinpoint, “moments when young people need a confidence boost.” If that phrase isn’t clear enough, Facebook’s document offers a litany of teen emotional states that the company claims it can estimate based on how teens use the service, including “worthless,” “insecure,” “defeated,” “anxious,” “silly,” “useless,” “stupid,” “overwhelmed,” “stressed,” and “a failure.”

This targeting of emotions isn’t necessarily surprising: in a past exposé we learned that Facebook conducted experiments during an American presidential election to see if it could sway voters. Indeed, the company’s raison d’être is figuring out how to pitch ads to customers, and figuring out when Facebook users are more or less likely to be affected by advertisements is just good business. If you use the self-download service provided by Facebook, or any other data broker, you will not receive data on how and why your data is exploited: without understanding how a company’s algorithms act on the data it collects from you, you can never really understand how your personal information is processed.

But that raison d’être of pitching ads to people, which is how Facebook could internally justify the deliberate targeting of vulnerable youth, ignores the baseline ethical question of whether it is appropriate to exploit our psychology to sell us products. To be clear, this isn’t a company stalking you around the Internet with ads for a car, couch, or jewelry you were browsing for. This is a deliberate effort to mine your communications in order to sell products at moments of psychological vulnerability. The difference is between somewhat clumsy tracking and the deliberate exploitation of our emotional states.1

Solving for Bad Actors

There are laws governing what you can do with information provided by children. Whether Facebook’s actions run afoul of such laws may never actually be tested in a court or privacy commissioner’s decision, in part because mounting legal challenges is difficult, expensive, and time consuming. These hurdles automatically tilt the balance towards such activities continuing.

But part of the challenge in stopping such exploitative activities is also linked to Australia’s historically weak privacy commissioner, as well as the limitations of such offices around the world: privacy commissioners’ offices are often understaffed, under-resourced, and unable to chase every legally and ethically questionable practice undertaken by private companies. Companies know about these limitations and, as such, know they can get away with unethical and frankly illegal activities unless someone talks to the press about the activities in question.

So what’s the solution? The rote advice is to stop using Facebook. While that might be good advice for some, for a lot of other people leaving Facebook is very, very challenging. You might use it to sign in to a lot of other services, and so can’t easily abandon it. You might have stored years of photos or conversations that Facebook doesn’t give you a nice way to pull out. It might be the place where all of your friends and family congregate to share information, so leaving would amount to being excised from your core communities. And depending on where you live, you might rely on Facebook for finding jobs, community events, or other activities that are essential to your life.

In essence, solving for Facebook, Google, Uber, and all the other large data brokers is a collective action problem. It’s not a problem that is best solved on an individual basis.

A more realistic kind of advice would be this: file complaints with your local politicians. File complaints with your domestic privacy commissioners. File complaints with every conference, academic association, and industry event that takes Facebook money.2 Make it very public and very clear that you, and the groups you are associated with, object to any company that profits off the psychological exploitation of children and adults alike.3 Now, will your efforts to raise attention to the issue, and to draw negative attention to companies and groups profiting from Facebook and other data brokers, stop unethical data exploitation tomorrow? No. But by consistently raising our concerns about how large data brokers collect and use personal information, and by attaching some degree of negative publicity to all those who benefit from such practices, we can drive down a company’s public standing.

History is dotted with individuals who are seen as having stood up to end bad practices by governments and private companies alike. But behind them tends to be a mass of citizens supporting those individuals: standing up en masse may mean that none of us gets individual praise for stopping tasteless and unethical practices, but it makes it far more likely that such practices will in fact be stopped. By each doing a little, we can together accomplish what we’d be hard pressed to change as individuals.

(This article was previously published in a slightly different format on a now-defunct Medium account.)

Footnotes:

1 Other advertising companies adopt the same practices as Facebook, so I’m not suggesting that Facebook is worst-in-class and letting the others off the hook.

2 Replace ‘Facebook’ with whatever company you think is behaving inappropriately, unethically, or perhaps illegally.

3 Surely you don’t think that Facebook is only targeting kids, right?

Link

Privacy Enhancing Technologies – A Review of Tools and Techniques

From the Office of the Privacy Commissioner of Canada:

PETs are a category of technologies that have not previously been systematically studied by the Office of the Privacy Commissioner of Canada (OPC). As a result, there were some gaps in our knowledge of these tools and techniques. In order to begin to address these gaps, a more systematic study of these tools and techniques was undertaken, starting with a (non-exhaustive) review of the general types of privacy enhancing technologies available. This paper presents the results of that review.

While Privacy Enhancing Technologies (PETs) have been around for a long time, only some have really taken hold, and usually only because there was a commercial incentive for companies to integrate them.

Some PETs have failed to be widely adopted because of the reasons they were created (e.g., to forestall formal regulatory or legislative action), others because of their complexity (you shouldn’t need a graduate degree to configure your tools properly!), and yet others because they were built by researchers and never intended for commercialization.

The OPC’s review of the dominant types of PETs is good and probably represents the most current such review. But the specific categories of tools, types of risks, and reasons PETs have failed to really take hold have largely been the same for a decade. We need to move beyond research and theory and actually do something soon, given that data is leaking faster and further than ever before, and the rate of leakage and dispersal is only increasing.
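The OPC’s paper is a survey rather than a tutorial, so, to make the category concrete, here is a minimal Python sketch of one classic PET technique: randomized response. (The example is my own choice as an illustration, not drawn from the review.) It lets a surveyor estimate how common a sensitive attribute is without ever learning any individual’s true answer:

```python
import random

def randomized_response(true_answer: bool) -> bool:
    """Flip a coin: on heads, answer honestly; on tails, answer with a
    second coin flip. Every reported answer is plausibly deniable, yet
    the aggregate remains estimable."""
    if random.random() < 0.5:
        return true_answer
    return random.random() < 0.5

def estimate_true_rate(reported_yes_fraction: float) -> float:
    """Invert the noise: P(report yes) = 0.5*p + 0.25, so
    p = 2 * (fraction - 0.25)."""
    return 2 * (reported_yes_fraction - 0.25)

# Simulate a population in which 30% hold the sensitive attribute.
population = [random.random() < 0.3 for _ in range(100_000)]
reports = [randomized_response(answer) for answer in population]
print(estimate_true_rate(sum(reports) / len(reports)))  # prints ~0.30
```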

Link

Confidentiality in an Era of Patient-Doctor-Cop

From The Canadian Press:

Doctors at Royal Columbian Hospital in New Westminster have complained that local police and RCMP officers are routinely recording conversations without consent between doctors and patients who are considered a suspect in a crime.

“They will be present when we are trying to question the patients and trying to obtain a history of what happened,” said Tony Taylor, an emergency physician who practises at the hospital.

“They have now recently started recording these conversations and often they will do that unannounced, which has a number of implications around confidentiality and consent.”

As far as doctors at Royal Columbian are concerned, the police are getting in the way of patient care.

Patients tend to clam up when police officers are present, Dr. Taylor said. “That makes it difficult to get those kind of history details that are critically important,” he said.

The idea that the police are present, and recording interactions between a doctor and patient, is patently problematic from a procedural fairness perspective. In the past, authorities have lost Charter challenges based on their attempts to exploit Canada’s one-party consent doctrine; I’d be very curious to know the legal basis on which they are recording persons who may be accused of a crime, in a setting clearly recognized as deserving heightened privacy protections, and the extent to which that legal theory holds up under scrutiny.

Link

The London Tube Is Tracking Riders with Their Phones

From Wired:

An agency like TfL could also use uber-accurate tracking data to send out real-time service updates. “If no passengers are using a particular stairway, it could alert TfL that there’s something wrong with the stairway—a missing step or a scary person,” Kaufman says. (Send emergency services stat.)

The Underground won’t exactly know what it can do with this data until it starts crunching the numbers. That will take a few months. Meanwhile, TfL has set about quelling a mini-privacy panic—if riders don’t want to share data with the agency, Sager Weinstein recommends shutting off your mobile device’s Wi-Fi.

So, on the one hand, they’ll apply norms and biases to ascertain why their data ‘says’ certain things. But to draw these conclusions the London transit authority will collect information from riders, and the only way to disable this collection is to reduce the functionality of your device whenever you’re in a public space. Sounds like a recipe for great, consensual collection of data and subsequent data ‘analysis’.
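TfL’s actual pipeline isn’t public; press reporting indicated the pilot collected hashed MAC addresses from devices with Wi-Fi enabled. Purely as an illustration of how that kind of collection supports journey reconstruction (this is not TfL’s method; the records and salt below are made up), consider this Python sketch:

```python
import hashlib
from collections import defaultdict

# Illustrative sighting records: (device MAC, station, unix timestamp).
sightings = [
    ("aa:bb:cc:dd:ee:ff", "Kings Cross", 1_480_000_000),
    ("aa:bb:cc:dd:ee:ff", "Oxford Circus", 1_480_000_900),
    ("11:22:33:44:55:66", "Victoria", 1_480_000_120),
]

def pseudonymise(mac: str, salt: str = "hypothetical-salt") -> str:
    """Hash the MAC with a salt. Note: hashing alone is weak
    pseudonymisation, because the MAC address space is small enough
    to enumerate and re-identify."""
    return hashlib.sha256((salt + mac).encode()).hexdigest()[:12]

# Group sightings by pseudonymous device to reconstruct journeys.
journeys = defaultdict(list)
for mac, station, ts in sightings:
    journeys[pseudonymise(mac)].append((ts, station))

for device, stops in journeys.items():
    route = " -> ".join(station for _, station in sorted(stops))
    print(f"{device}: {route}")
```

The point of the sketch is that ‘anonymised’ Wi-Fi data still yields per-device movement histories, which is exactly why shutting off Wi-Fi is the only opt-out on offer.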

Video

Privacy and Policing in a Digital World

As the federal government holds public consultations on what changes should be made to Bill C-51, the controversial anti-terrorism legislation passed by the Conservative government, various police agencies such as the RCMP and the Canadian Association of Chiefs of Police have petitioned to gain new powers to access telephone and internet data. Meanwhile nearly half of Canadians believe they should have the right to complete digital privacy. The Agenda examines the question of how to balance privacy rights with effective policing in the digital realm.

I was part of a panel that discussed some of the powers that the Government of Canada is opening for discussion as part of its National Security consultation, which ends on December 15, 2016. If you want to provide comments to the government, see: https://www.canada.ca/en/services/defence/nationalsecurity/consultation-national-security.html