Categories
Writing

Uber and the Limits of Privacy Law

When was the last time you thought long and hard about the information companies are collecting, sharing, and selling about you? Maybe you thought about it after reading that some company had suffered a data breach or used your data questionably, and then put the worries out of your mind.

What you may not know is that most contemporary Western nation-states have established data protection and privacy legislation over the past several decades. A core element of these laws is data access rights: the right of individuals to compel companies to disclose what information the companies have collected, stored, and shared about them.

In Canada, federal commercial privacy legislation lets Canadian citizens and residents request their personal information. They can use an online application to make those requests to telecommunications companies, online dating companies, or fitness wearable companies. Or they can make requests directly to specific companies on their own.

So, what happens when you make a request to a ride-sharing company? A company like Uber? It might surprise you, but such companies tend to provide a lot of information about you, pretty quickly, and in surprisingly digestible formats. You can see when you used the application to book a ride, the coordinates of the pickup, where you were dropped off, and so forth.
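To give a sense of how digestible such an export can be, here is a minimal sketch that parses a hypothetical trips file of the kind a ride-sharing company might return. The column names and sample values are my own illustrative assumptions, not Uber's actual export schema.

```python
import csv
import io

# Hypothetical export: one row per trip, with a timestamp and the
# pickup/drop-off coordinates. Column names are illustrative only.
sample_export = """\
requested_at,pickup_lat,pickup_lon,dropoff_lat,dropoff_lon
2016-09-01T08:15:00Z,43.6532,-79.3832,43.6629,-79.3957
2016-09-03T22:40:00Z,43.6426,-79.3871,43.6532,-79.3832
"""

def summarize_trips(raw_csv):
    """Return the trip count and the list of pickup coordinate pairs."""
    rows = list(csv.DictReader(io.StringIO(raw_csv)))
    pickups = [(float(r["pickup_lat"]), float(r["pickup_lon"])) for r in rows]
    return len(rows), pickups

count, pickups = summarize_trips(sample_export)
print(count, pickups[0])  # prints: 2 (43.6532, -79.3832)
```

Even a few lines of scripting like this turn an export into a personal movement history, which is part of why access rights matter.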

But you don’t necessarily get all of the information that ride-sharing companies collect about you. In the case of Uber, the company was recently found to be fingerprinting the phones its application was installed on. There’s some reason to believe that this was for anti-fraud purposes but, regardless, the collection of that information arguably constitutes the collection of personal information. Per Canadian privacy legislation, such information is defined as “information about an identifiable individual,” and decisions by the Commissioner have found that if there is even an instant where machine identifiers are linked with identifiable subscriber data, the machine identifiers also constitute personal information. Given that Uber was collecting the fingerprints while the application was installed, it likely was linking those fingerprints with subscriber data, even if only momentarily before subsequently separating the identifiers from the other data.
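To see why even momentary linkage matters, consider this sketch of how a device fingerprint might work in general: stable hardware attributes are hashed into an identifier, and the instant that identifier is stored next to an account identifier, the pair becomes “information about an identifiable individual.” The attribute names and hashing scheme here are my own assumptions for illustration, not Uber’s actual technique.

```python
import hashlib

def device_fingerprint(attributes):
    """Derive a stable identifier by hashing device attributes.

    The attribute set is illustrative; real fingerprinting schemes
    vary, but the principle (stable inputs -> stable identifier) holds.
    """
    canonical = "|".join(f"{k}={attributes[k]}" for k in sorted(attributes))
    return hashlib.sha256(canonical.encode()).hexdigest()[:16]

# Hypothetical attributes that persist even across app reinstalls.
fp = device_fingerprint({
    "model": "iPhone6,2",
    "os_version": "9.3.2",
    "serial_hash": "ab12cd34",
})

# The moment the fingerprint is recorded alongside an account identifier,
# the pair is linkable to a person -- the kind of linkage that Canadian
# privacy decisions treat as making the identifier personal information.
linked_record = {"account_id": "user-4711", "fingerprint": fp}
```

The point of the sketch is that the fingerprint alone looks like an anonymous machine identifier; it is the (even fleeting) join with subscriber data that brings it within the legal definition.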

So if Uber had a legal duty to inform individuals about the personal information that it collected, and failed to do so, what is the recourse? Either the Federal Office of the Privacy Commissioner of Canada could launch an investigation or someone who requested their personal information from Uber could file a formal complaint with the Office. That complaint would, pretty simply, argue that Uber had failed to meet its legal obligations by not disclosing the tracking information.

But even if Uber were found to have violated Canadian law, there isn’t a huge amount of recourse for affected individuals. There aren’t any fines that can be levied by the Canadian federal commissioner. And Uber might decide that it doesn’t want to implement any recommendations the Privacy Commissioner provided: in Canada, to enforce an order, a company has to be taken to court. Even when companies like Facebook have received recommendations, they have selectively implemented them and ignored those that would impact their business model. So ‘enforcement’ tends to be limited to moral suasion when applied by the federal privacy commissioner.1

But the limits of enforcement strike at only part of the problem. What is worse is that we only know about Uber’s deceptive practices because of journalism. It isn’t because the company was forthcoming and proactively disclosed this information well in advance of fingerprinting devices. Other companies can read that signal and know that they can probably engage in questionable and unlawful practices with a pretty low expectation of being caught or punished.

In a recent article, Adrian Fong, a summer fellow at the Citizen Lab, argued that enforcing data protection and privacy laws against individual private companies is likely untenable. Too few companies will figure out how to deal with data access requests, fewer will be inclined to respond to them, and fewer still will understand whether they are obligated to respond to such requests in the first place. Instead, Fong argues that application stores (such as Google’s and Apple’s respective app stores) could include comprehensive data access rights as part of the contracts that app developers agree to with the store owners. Failure to comply with the data access rights aspect of a contract could lead to an app being removed from the store. Were Google and Apple to implement such a practice seriously, their ability to remove bad actors, such as Uber, from app stores could lead to a modification of business practices.

Ultimately, however, I’m not certain that the ‘solution’ to Uber is better privacy law. It’s probably not even just better regulation. Rather, ‘solving’ for companies like Uber demands changing how engineers and businesspeople are educated and trained, and modifying the grounds on which they’re rewarded and punished for their actions. Greater emphasis on ethical practices and the politics of code needs to be ingrained in their curricula, just as arts and humanities students should be exposed in more depth to the hard sciences. And engineers, generally, need to learn that they’re not just solving hard problems such as preventing fraudulent rides: they’re also embedding power structures in the code they develop, and those structures can’t run roughshod over the laws that democratic publics have established to govern private behaviour. Or, at least, if companies do run afoul of the law, be it national data protection law or contract law, there should be serious consequences. Doing otherwise will simply incentivize companies to act unethically on the basis that there are few, or no, consequences for behaving like a bad actor.

NOTE: this was originally posted to Medium.


  1. Some of Canada’s provincial commissioners do have order-making powers. ↩︎
Categories
Links

How Canada’s Anti-Cyberbullying Law Is Being Used to Spy on Journalists

From Motherboard:

According to Citizen Lab researcher Christopher Parsons, these same powers that target journalists can be used against non-journalists under C-13. And the only reason we know about the aforementioned cases is that the press has a platform to speak out.

“This is an area where transparency and accountability are essential,” Parsons said in an interview. “We’ve given piles and piles of new powers to law enforcement and security agencies alike. What’s happened to this journalist shows we desperately need to know how the government uses its powers to ensure they’re not abused in any way.”

“I expect that the use of these particular powers will become more common as the police get more used to using it and more savvy in using them,” Parsons said.

These were powers that were ultimately sold to the public (and passed into law) as needed to combat ‘child pornography’. And now they’re being used to snoop on journalists to figure out who their sources are, without any mandate to report on how regularly the powers are used or the efficacy of such uses. For some reason, this process doesn’t inspire a lot of confidence in me.

Categories
Links

Canada’s National Security Consultation: Digital Anonymity & Subscriber Identification Revisited… Yet Again – Technology, Thoughts & Trinkets

Over at Technology, Thoughts, and Trinkets I’ve written that:

Last month, Public Safety Canada followed through on commitments to review and consult on Canada’s national security framework. The process reviews powers that were passed into law following Bill C-51, Canada’s recent controversial anti-terrorism overhaul, as well as invites a broader debate about Canada’s security apparatus. While many consultation processes have explored expansions of Canada’s national security framework, the current consultation constitutes the first modern-day attempt to explore Canada’s national security excesses and deficiencies. Unfortunately, the framing of the consultation demonstrates minimal direct regard for privacy and civil liberties because it is primarily preoccupied with defending the existing security framework while introducing a range of additional intrusive powers. Such powers include some that have been soundly rejected by the Canadian public as drawing the wrong balance between digital privacy and law enforcement objectives, and heavily criticized by legal experts as well as by all of Canada’s federal and provincial privacy commissioners.

The government has framed the discussion in two constituent documents, a National Security Green Paper and an accompanying Background Document. The government’s framings of the issues are highly deficient. Specifically, the consultation documents make little attempt to explain the privacy and civil liberties implications that can result from the contemplated powers. And while the government is open to suggestions on privacy and civil liberties-enhancing measures, few such proposals are explored in the document itself. Moreover, key commitments, such as the need to impose judicial control over Canada’s foreign intelligence agency (CSE) and regulate the agency’s expansive metadata surveillance activities, are neither presented nor discussed (although the government has mentioned independently that it still hopes to introduce such reforms). The consultation documents also fail to provide detailed suggestions for improving government accountability and transparency surrounding state agencies’ use of already-existent surveillance and investigative tools.

In light of these deficiencies, we will be discussing a number of the consultation document’s problematic elements in a series of posts, beginning with the government’s reincarnation of a highly controversial telecommunication subscriber identification power.

I wrote the first of what will be many analyses of the Canadian government’s national security consultation with a good friend and colleague, Tamir Israel.

The subscriber identification powers we write about are not really intended for national security but will, instead, be adopted more broadly by law enforcement so they can access the data indiscriminately. Past legislative efforts to create equivalent powers have been rejected: it remains to be seen whether this proposal will (once more) be rejected, or whether this parliament will actually establish some process or law that lets government agencies access subscriber identification information absent a warrant.

Categories
Aside Links

France’s Emergency Powers: The New Normal

Just Security:

The new, six-month extension of emergency powers creates France’s longest state of emergency since the Algerian War in the 1950s. The new law restores or extends previous emergency provisions, such as empowering police to carry out raids and local authorities to place suspects under house arrest without prior judicial approval. It also expands those powers, for example allowing the police to search luggage and vehicles without judicial warrants. In addition it reinstates warrantless seizures of computer and cellphone data that France’s highest legal authority had struck down as unconstitutional, adding a few restrictions that still fall short of judicial oversight.

In separate reports in February, Human Rights Watch and Amnesty International documented more than three dozen cases in which the use of these emergency powers violated universal rights to liberty, privacy, or freedoms of movement, association and expression. The two groups also found that the emergency acts lost suspects jobs, traumatized children, and damaged homes. The vast majority of those targeted were Muslims. Those interviewed said the actions left them feeling stigmatized and eroded their trust in the French authorities. The latest version of the emergency law risks compounding these effects.

The decisions to advance unconstitutional and discriminatory ‘security’ laws and policies following serious crimes threaten to undermine democracies even as they potentially strengthen states. But worryingly, there are fewer and fewer loud voices defending the rough-and-tumble consequences of maintaining a democratic form of governance, as opposed to those who assert that a powerful state apparatus is needed if normalcy is to exist. The result may be sleepwalking from governments for and by the people to governments that protect citizen-serfs and harshly discriminate against difference.

Categories
Links

For some safety experts, Uber’s self-driving taxi test isn’t something to hail

Washington Post:

Even so, the effort is raising concern from safety experts who say the technology has major limitations that can be very dangerous. Self-driving cars have trouble seeing in bad weather. Sudden downpours, snow and especially puddles make it difficult for autonomous vehicles to detect lines on pavement and thereby stay in one lane.

Walker Smith added that self-driving cars have sometimes confused bridges for other obstacles. “People need to understand both the potential and the limitations of these systems, and inviting them inside is part of that education,” he said.

The vehicles also have difficulty understanding human gestures — for example, a crosswalk guard in front of a local elementary school may not be understood, said Mary Cummings, director of Duke University’s Humans and Autonomy Lab, at a Senate hearing in March. She recommended that the vehicles not be allowed to operate near schools.

Then there’s the human factor: Researchers have shown that people like to test and prank robots. Today, a GPS jammer, which some people keep in their trunks to block police from tracking them, will easily throw off a self-driving car’s ability to sense where it is, Cummings said.

Current self-driving cars often cannot see which lane they’re in if it’s raining. They don’t understand what a bridge is versus other road terrain. They don’t understand what a crosswalk guard is. And they are reliant on a notoriously brittle location technology.

What can go wrong with testing them in urban centres then, exactly?

Categories
Links

Police Using Journalists’ Metadata to Hunt Down Whistleblowers

Police Using Journalists’ Metadata to Hunt Down Whistleblowers:

In the past year, the Australian Federal Police has been asked to investigate a piece in The Australian about the Government’s leaked Draft Defence White Paper, and a Fairfax Media story on a proposal to reform citizenship laws.

Just last week, police raided Parliament House in an attempt to track down the source of an embarrassing leak about the National Broadband Network. It’s feared that these investigations, along with increased penalties for whistleblowers, are hindering the ability of journalists to hold policymakers to account.

It was with this in mind that the Opposition eventually voted for the amendments that created the Journalist Information Warrant scheme, and allowed the Data Retention laws to pass last year. In a last minute effort to shore up support for the legislation, the Government agreed to add provisions for ‘safeguards’ that would, in theory, prevent the scheme being used to target journalists’ sources. However, a closer look at the scheme reveals its flaws.

When a democracy creates warranting schemes solely to determine who is willing to speak with journalists, the democracy is demonstrably in danger of slipping free of the grasp of the citizenry.

Categories
Writing

The Fourth Amendment in the Information Age

Litt’s article focuses on finding new ways of conceptualizing privacy such that the current activities of intelligence agencies and law enforcement organizations are made legal, thereby shifting the means by which their activities are legally and constitutionally evaluated. While his proposal to overturn much of the third-party doctrine coheres with the positions of many contemporary scholars, his suggested replacement (that we should no longer focus on the collection of data, but on the use of collected data) would eviscerate basic privacy protections. In particular, I think it’s important we not just ignore the ‘search’ aspect of Fourth Amendment law: we need to recalibrate what a search is within the context of today’s reality. And that doesn’t mean just letting the government collect with fewer baseline restrictions, but instead modifying what a ‘search’ itself is.

The core aspects of the article that give a flavour of the entire argument are:

I suggest that—at least in the context of government acquisition of digital data—we should think about eliminating the separate inquiry into whether there was a “reasonable expectation of privacy” as a gatekeeper for Fourth Amendment analysis. In an era in which huge amounts of data are flowing across the Internet; in which people expose previously unimagined quantities and kinds of information through social media; in which private companies monetize information derived from search requests and GPS location; and in which our cars, dishwashers, and even light bulbs are connected to the Internet, trying to parse out the information in which we do and do not have a reasonable expectation of privacy strikes me as a difficult and sterile task of line-drawing. Rather, we should simply accept that any acquisition of digital information by the Government implicates Fourth Amendment interests.

After all, the concept of a “reasonable expectation of privacy” as a talisman of Fourth Amendment protection is not found in the text of the Fourth Amendment itself, which says merely that “[t]he right of the people to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures, shall not be violated.” It was only in 1967, in Katz, that the Supreme Court defined a search as the invasion of a “reasonable expectation of privacy.” Katz revisited Olmstead v. United States after 40 years; the accelerating pace of modern technological change suggests to me that fifty years is not too soon to revisit Katz. My proposal is that the law should focus on determining what is unreasonable rather than on what is a search.

What I have suggested, however, is that—at least in the area of government collection of digital data—we eliminate the preliminary analysis of whether someone has a reasonable expectation of privacy in the data and proceed directly to the issue of whether the collection is reasonable; that the privacy side of that analysis should be focused on concrete rather than theoretical invasions of privacy; and that courts in evaluating reasonableness should look at the entirety of the government’s activity, including the “back end” use, retention restrictions, and the degree of transparency, not just the “front end” activity of collection.

Categories
Aside Links

With Remote Hacking, the Government’s Particularity Problem Isn’t Going Away

Crocker’s article is a defining summary of the legal problems associated with the U.S. Government’s attempts to use malware to conduct lawful surveillance of persons suspected of breaking the law. He explores how, even after the law is shifted to authorize magistrates to issue warrants pertaining to persons outside of their jurisdictions, broader precedent concerning wiretaps may prevent the FBI or other actors from using currently-drafted warrants to deploy malware en masse. Specifically, the current framework might violate basic constitutional guarantees that have been defined in caselaw over the past century, to the effect of rendering mass deployment of malware an unlawful means of surveillance.

Categories
Links

Cybercrime Overtakes Traditional Crime in UK

Cybercrime Overtakes Traditional Crime in UK:

The NCA’s Cyber Crime Assessment 2016, released July 7, 2016, highlights the need for stronger law enforcement and business partnership to fight cybercrime. According to the NCA, cybercrime emerged as the largest proportion of total crime in the U.K., with “cyber enabled fraud” making up 36 percent of all crime reported, and “computer misuse” accounting for 17 percent.

“The ONS estimated that there were 2.46 million cyber incidents and 2.11 million victims of cyber crime in the U.K. in 2015,” the report’s authors wrote. “These figures highlight the clear shortfall in established reporting, with only 16,349 cyber dependent and approximately 700,000 cyber-enabled incidents reported to Action Fraud over the same period.”

While there is a persistent issue with counting ‘cyber’ events, that UK bodies are highlighting this kind of fraud and espionage so prominently does indicate that organizations face a real problem.

Categories
Links

German woman who claimed rape fined $34,800 as judge rules she said no to the filming, not sex act itself

In Germany it isn’t enough to say ‘no’ during intercourse: a person must actively resist, and that resistance must be overcome, for the person to legally claim to have been raped. As a result of this German understanding of rape, a woman who alleged she was raped was found by a judge to have falsely accused her attackers, a ruling that has led to renewed calls in the country to update its sexual assault and abuse laws.