Limits of Data Access Requests

Photo by rawpixel on Unsplash

A data access request involves contacting a private company and requesting a copy of your personal information, along with details of how that data is processed and disclosed, and how long it is retained.

Research I’ve conducted over the past decade has repeatedly shown that companies are often very poor at comprehensively responding to data access requests. Sometimes this is because of divides between the technical teams that collect and use the data, the policy teams that determine what is and isn’t appropriate to do with it, and the legal teams that ascertain whether collections and uses of data comport with the law. In other situations companies simply refuse to respond because they adopt a confused, nationalist understanding of the law: if the company doesn’t have an office in a requesting party’s country, then that jurisdiction’s laws aren’t seen as applying to the company, even if it does business in the jurisdiction.

Automated Data Export As Solution?

Some companies, such as Facebook and Google, have developed automated data download services. Ostensibly these services are designed so that you can download the data you’ve input into the companies’ platforms, thus revealing precisely what is collected about you. In reality, these services don’t let you export all of the information that these companies collect. As a result, people who use these download services tend to end up with a false impression of just what information the companies collect and how it’s used.

A telling example of the kind of information that is not revealed to users of these services has come to light. A leaked document from Facebook Australia revealed that:

Facebook’s algorithms can determine, and allow advertisers to pinpoint, “moments when young people need a confidence boost.” If that phrase isn’t clear enough, Facebook’s document offers a litany of teen emotional states that the company claims it can estimate based on how teens use the service, including “worthless,” “insecure,” “defeated,” “anxious,” “silly,” “useless,” “stupid,” “overwhelmed,” “stressed,” and “a failure.”

This targeting of emotions isn’t necessarily surprising: in a past exposé we learned that Facebook conducted experiments during an American presidential election to see if it could sway voters. Indeed, the company’s raison d’être is to figure out how to pitch ads to customers, and knowing when Facebook users are more or less likely to be affected by advertisements is just good business. If you use the self-download service provided by Facebook, or any other data broker, you will not receive data on how and why your data is exploited: without understanding how their algorithms act on the data they collect from you, you can never really understand how your personal information is processed.

But that raison d’être of pitching ads to people, which is how Facebook could internally justify deliberately targeting vulnerable youth, ignores the baseline ethical question of whether it is appropriate to exploit our psychology to sell us products. To be clear, this isn’t a company stalking you around the Internet with ads for a car or couch or jewelry that you were browsing for. This is a deliberate effort to mine your communications in order to sell products at moments of psychological vulnerability. The difference is between somewhat stupid tracking and the deliberate exploitation of our emotional state.1

Solving for Bad Actors

There are laws governing what you can do with information provided by children. Whether Facebook’s actions run afoul of such laws may never actually be tested in a court or in a privacy commissioner’s decision. In part, this is because mounting legal challenges is extremely difficult, expensive, and time-consuming. These hurdles automatically tilt the balance towards activities such as this continuing.

But part of the challenge in stopping such exploitative activities is also linked to Australia’s historically weak privacy commissioner, as well as to the limitations of such offices around the world: privacy commissioners’ offices are often understaffed, under-resourced, and unable to chase every legally and ethically questionable practice undertaken by private companies. Companies know about these limitations and, as such, know they can get away with unethical and frankly illegal activities unless someone talks to the press about them.

So what’s the solution? The rote advice is to stop using Facebook. While that might be good advice for some, for a lot of other people leaving Facebook is very, very challenging. You might use it to sign into a lot of other services and so don’t think you can easily abandon Facebook. You might have stored years of photos or conversations and Facebook doesn’t give you a nice way to pull them out. It might be a place where all of your friends and family congregate to share information and so leaving would amount to being excised from your core communities. And depending on where you live you might rely on Facebook for finding jobs, community events, or other activities that are essential to your life.

In essence, solving the problems posed by Facebook, Google, Uber, and all the other large data brokers is a collective action problem. It is not a problem that is best solved on an individualistic basis.

A more realistic kind of advice would be this: file complaints to your local politicians. File complaints to your domestic privacy commissioners. File complaints to every conference, academic association, and industry event that takes Facebook money.2 Make it very public and very clear that you and the groups you are associated with are offended that the company in question is profiting off the psychological exploitation of children and adults alike.3 Now, will your efforts to raise attention to the issue and draw negative attention to companies and groups profiting from Facebook and other data brokers stop unethical data exploitation tomorrow? No. But by consistently raising our concerns about how large data brokers collect and use personal information, and attaching some degree of negative publicity to all those who benefit from such practices, we can diminish a company’s public standing.

History is dotted with individuals who are seen as standing up to end bad practices by governments and private companies alike. But behind them tend to be masses of citizens who support those individuals: while standing up en masse may mean that we don’t each get individual praise for stopping some tasteless and unethical practice, our collective standing up makes it more likely that such practices will be stopped. By each doing a little, we can accomplish together what we’d be hard pressed to change alone.

(This article was previously published in a slightly different format on a now-defunct Medium account.)

Footnotes:

1 Other advertising companies adopt the same practices as Facebook. So I’m not suggesting that Facebook is worst-of-class and letting the others off the hook.

2 Replace ‘Facebook’ with whatever company you think is behaving inappropriately, unethically, or perhaps illegally.

3 Surely you don’t think that Facebook is only targeting kids, right?

Link

Almost every Volkswagen sold since 1995 can be unlocked with an Arduino

Almost every Volkswagen sold since 1995 can be unlocked with an Arduino:

… security researchers have discovered how to use software defined radio (SDR) to remotely unlock hundreds of millions of cars. The findings are to be presented at a security conference later this week, and detail two different vulnerabilities.

The first affects almost every car Volkswagen has sold since 1995, with only the latest Golf-based models in the clear. Led by Flavio Garcia at the University of Birmingham in the UK, the group of hackers reverse-engineered an undisclosed Volkswagen component to extract a cryptographic key value that is common to many of the company’s vehicles.

Alone, the value won’t do anything, but when combined with the unique value encoded on an individual vehicle’s remote key fob—obtained with a little electronic eavesdropping, say—you have a functional clone that will lock or unlock that car.

Just implement the research by dropping some Raspberry Pis in a mid- to high-income condo parking garage and you’ve got an easy way to profit pretty handsomely from Volkswagen’s security FUBAR.
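To see why a single shared key is such a catastrophic design, here is a toy sketch in Python. The actual Volkswagen scheme is undisclosed, so every name and construction below is an assumption: HMAC stands in for whatever derivation the real fobs use, and the key, fob ID, and counter values are invented for illustration. The point is only that once the global key leaks, one eavesdropped per-fob value is all an attacker needs.

```python
import hmac
import hashlib

# Toy model only: the real VW construction is undisclosed. This illustrates
# why one shared key plus one eavesdropped per-fob value yields a clone.
GLOBAL_KEY = b"shared-across-millions-of-cars"  # hypothetically extracted once from firmware

def rolling_code(global_key: bytes, fob_id: bytes, counter: int) -> bytes:
    """Derive the next unlock code from the shared key, the fob's unique
    ID, and a rolling counter (hypothetical HMAC-based construction)."""
    msg = fob_id + counter.to_bytes(4, "big")
    return hmac.new(global_key, msg, hashlib.sha256).digest()[:8]

# Eavesdrop a single fob transmission to learn its unique ID...
eavesdropped_fob_id = b"\x13\x37\xca\xfe"

# ...and the attacker can now generate the same codes the car expects.
car_expects = rolling_code(GLOBAL_KEY, eavesdropped_fob_id, counter=42)
attacker_sends = rolling_code(GLOBAL_KEY, eavesdropped_fob_id, counter=42)
assert attacker_sends == car_expects
```

Because the per-fob value is broadcast over the air, the only real secret is the global key, and that secret was shared across roughly a hundred million vehicles: extract it once and every fob you can overhear is cloneable.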

Link

An Internet Censorship Company Tried to Sue the Researchers Who Exposed Them

An Internet Censorship Company Tried to Sue the Researchers Who Exposed Them:

Netsweeper is a small Canadian company with a disarmingly boring name and an office nestled among the squat buildings of Waterloo, Ontario. But its services—namely, online censorship—are offered in countries as far-flung as Bahrain and Yemen.

In 2015, University of Toronto-based research hub Citizen Lab reported that Netsweeper was providing Yemeni rebels with censorship technology. In response, Citizen Lab director Ron Deibert revealed in a blog post on Tuesday, Netsweeper sued the university and Deibert for defamation. Netsweeper discontinued its lawsuit in its entirety in April.


Quote

The lack of teaching skills means we are supporting institutions that not only don’t do what we idealize them to do, they don’t value and professionalize the things that we expect them to do well. In fact, we have gone to extremes to prevent the job of university teaching from becoming a profession. The most obvious example is hiring adjunct professors. These are people who are hired for about the same wage as a fast food server, and are expected to teach physics or philosophy to 18 year olds. They don’t get benefits or even long-term contracts. So, in effect, they never get the chance to develop into highly skilled teaching professionals. Instead, they spend most of their time worrying about heating bills and whether they can afford to go to the doctor.

Now, of course, universities will argue that they are research organizations. And that is true. Universities do value research over teaching. Meaning that tenured and tenure-track professors, even if they love teaching, cannot prioritize it, because their administration requires them to be good researchers. Indeed, if you admit that you are a middling to average researcher and want to focus on teaching, you become viewed as a burden by your department.

Yet, for the great majority of people, their only interaction with a university is through the people doing the teaching. It’s as if a major corporation, say General Motors, decided that their public face would not be their most visible product—hello Chevy Volt—and instead decides to place the janitorial service front and center. Then, just to top it off, decided not to train the janitors.

Link

The Murky State of Canadian Telecommunications Surveillance – The Citizen Lab

The most recent posting about our ongoing research into how, why, and how often Canadian ISPs disclose information to state agencies.

Quote

While such research is done in a number of countries, Canada seems to be a hotbed of boredom studies. James Danckert, an associate professor of psychology at the University of Waterloo, in Canada, recently conducted a study to compare the physiological effects of boredom and sadness.

To induce sadness in the lab, he used video clips from the 1979 tear-jerker, “The Champ,” a widely accepted practice among psychologists.

But finding a clip to induce boredom was a trickier task. Dr. Danckert first tried a YouTube video of a man mowing a lawn, but subjects found it funny, not boring. A clip of parliamentary proceedings was too risky. “There’s always the off chance you get someone who is interested in that,” he says.

I found the third paragraph particularly amusing as someone who often finds watching parliament interesting. I guess I’d be one of the ‘problem’ participants!