Quote

We have come a long way in routing the taboos that stand in the way of justice for victims of sexual assault. But there is still a distance to go. The problems are complex and rooted in centuries of culture and myth. The law, imperfect as it may be, is a powerful tool in achieving lasting change. But real justice will come only when we change attitudes—when respect for the autonomy of every person replaces old myths grounded in ownership, control, and power.

– Beverley McLachlin, Truth Be Told: My Journey Through Life and the Law

Two Thoughts on China’s Draft Privacy Law

Alexa Lee, Samm Sacks, Rogier Creemers, Mingli Shi, and Graham Webster have collectively written a helpful summary of China’s draft Personal Information Protection Law (PIPL) over at Stanford’s DigiChina.

Two features, in particular, jump out at me.

First, the proposed legislation would compel Chinese companies “to police the personal data practices across their platforms” as part of Article 57. As noted by the team at Stanford,

“the three responsibilities identified for big platform companies here resonate with the “gatekeeper” concept for online intermediaries in Europe, and a requirement for public social responsibility reports echoes the DMA/DSA mandate to provide access to platform data by academic researchers and others. The new groups could also be compared with Facebook’s nominally independent Oversight Board, which the company established to review content moderation decisions.”

I’ll be particularly curious to see the kinds of transparency reporting that emerge from these companies. I doubt the reports will parallel those in the West, which tend to focus on the processes and volume of disclosures from private companies to government; instead, I expect Chinese companies’ reports to focus on how ‘socially responsible’ the companies are in collecting, processing, and disclosing data to other Chinese businesses. Still, if we do see this more consumer-focused approach, it will demonstrate yet another transparency reporting tradition that will be useful to assess in academic and public policy writing.

Second, the Stanford team notes that,

“new drafts of both the PIPL and the DSL added language toughening requirements for Chinese government approval before data holders in China cooperate with foreign judicial or law enforcement requests for data, making failure to gain permission a clear violation punishable by financial penalties up to 1 million RMB.”

While not surprising, this kind of restriction will continue to raise data sovereignty borders around personal information held in China. The effect? Western states will still need to push for Mutual Legal Assistance Treaty (MLAT) reform to successfully extract information from Chinese companies (and will, in all likelihood, fail to conclude these reforms).1

It’s perhaps noteworthy that while China is moving to build up walls, the Council of Europe is simultaneously attempting to address issues of law enforcement access to information held by cloud providers (amongst other things). The United States passed the CLOUD Act in 2018 to begin addressing the challenge of states gaining access to information held by cloud providers operating in foreign jurisdictions (though it did not address the human rights concerns that traditional MLAT processes had mitigated). Based on the proposed Chinese law, it’s unlikely that the CLOUD Act will gain substantial traction with the Chinese government, though admittedly this wasn’t the aim of the CLOUD Act or an expected outcome of its passage.

Nevertheless, as competing legal frameworks are established that place the West on one side and China and Russia on the other, the effect will be to further entrench distinct legal cultures of the Internet across different economic, political, and security regimes. At the same time, criminal actors who routinely operate with technical and legal savvy will be able to store data anywhere in the world, including beyond the reach of relevant law enforcement agencies.

Ultimately, the raising of regional and national digital borders is a topic to watch, both to keep an eye on what the forthcoming legal regimes will look like and to assess the extent to which the language of ‘strong sovereignty’ or nationalism creeps functionally into legislation around the world.


  1. For more on MLAT reform, see these pieces from Lawfare. ↩︎
Link

Privacy and Contemporary Motor Vehicles

Writing for NBC News, Olivia Solon provides a useful overview of just how much data motor vehicles collect, both through sensors embedded in the vehicles themselves and through infotainment systems when linked with a smartphone, and of how law enforcement agencies are using that information.

Law enforcement agencies have been focusing their investigative efforts on two main information sources: the telematics system — which is like the “black box” — and the infotainment system. The telematics system stores a vehicle’s turn-by-turn navigation, speed, acceleration and deceleration information, as well as more granular clues, such as when and where the lights were switched on, the doors were opened, seat belts were put on and airbags were deployed.

The infotainment system records recent destinations, call logs, contact lists, text messages, emails, pictures, videos, web histories, voice commands and social media feeds. It can also keep track of the phones that have been connected to the vehicle via USB cable or Bluetooth, as well as all the apps installed on the device.

Together, the data allows investigators to reconstruct a vehicle’s journey and paint a picture of driver and passenger behavior. In a criminal case, the sequence of doors opening and seat belts being inserted could help show that a suspect had an accomplice.

Of note, rental cars as well as second-hand vehicles retain all of this information, which can then be accessed by third parties. It’s easy to envision a situation where rental companies are obligated to assess retained data to determine whether a certain class or classes of offences have been committed, and then overshare the information collected by their vehicles to avoid the liability that could follow from failing to fully meet whatever obligations are placed upon them.

Of course, outright nefarious actors can also take advantage of the digital connectivity built into contemporary vehicles.

Just as the trove of data can be helpful for solving crimes, it can also be used to commit them, Amico said. He pointed to a case in Australia, where a man stalked his ex-girlfriend using an app that connected to her high-tech Land Rover and sent him live information about her movements. The app also allowed him to remotely start and stop her vehicle and open and close the windows.

As in so many other areas, connectivity is being built into vehicles without real or sufficient assessment of how to secure the new technologies or mitigate harmful or undesirable secondary uses of data. Engineers rarely worry about these outcomes, corporate lawyers aren’t attentive to these classes of issues, and the security of contemporary vehicles is generally garbage. Combined, this means that government bodies are almost certainly going to expand the range of data they can access without first having to go through a public debate about the appropriateness of doing so, or the creation of specialized warrants that would limit data mining. Moreover, in countries with weak policing accountability structures, it will be impossible to even assess how regularly government officials obtain access to information from cars, how such data lets them overcome other issues they say they are encountering (e.g., encryption), or the utility of this data in investigating crimes and introducing it as evidence in court cases.

Quote

If those responsible for security believe that the law does not give them enough power to protect security effectively, they must try to persuade the law-makers, Parliament and the provincial legislatures, to change the law. They must not take the law into their own hands. This is a requirement of a liberal society.

– Canada, Commission of Inquiry Concerning Certain Activities of the Royal Canadian Mounted Police, Second Report: Freedom and Security Under the Law, vol 1, Part II (Ottawa: Privy Council Office, 1981) at 45.

Aside

2018.2.15

As I return from an event I was invited to, I have to reflect on, and admit, how profoundly…weird…it is that stuff I write about and the activities in which I’m engaged increasingly influence the course of justice in my country. How weird it is that the leader of my country is briefed on the work that I and my colleagues write about. How epically strange it feels that things which seem to have no impact on public debate whatsoever reverberate behind closed doors. It’s just really, really weird to know that people who are intrinsically involved with law, security, and justice — to say nothing of policy and politics — closely watch what I do, with the intent of using it when making decisions that may affect the lives of people across Canada, and around the world.

When I was doing my PhD I laughed out loud at my colleagues who spoke of how the work of political scientists can lead to exceptional impacts in the world. As a philosopher I thought such conversations were born of a group of people who took themselves too seriously in their (ongoing) moments of hubris. But I get it now: that which we say, when we’re deliberately involved with public debate with an eye to informing (if not influencing) policy, can have unexpected and exciting and unintended impacts on the lives of millions of people. And in living this reality I have remarkably more sympathy for those whose work isn’t just read and taken up, but misread and subsequently misappropriated to justify governmental activities that the political scientists in question might not have anticipated or endorsed.

Link

The Insanity of ‘Terrorism’ Offences

The Fool by Christopher Parsons, All Rights Reserved

Via The Intercept:

At the end of a quick one-day trial, Judge Emma Arbuthnot at Westminster Magistrates Court ruled that Rabbani had willfully obstructed police when he declined to hand over his passwords. Rabbani avoided a possible three-month jail term and was instead handed a 12-month conditional discharge and told he must pay court costs of £620 ($835). This means a Terrorism Act offense will be recorded on his criminal record. But as long as he does not re-offend within the 12-month period, no further action will be taken against him.

Rabbani had argued his electronic devices should have been protected under the latter category, as they contained confidential information related to his work. The judge said that Rabbani did not make this clear to the officers who initially interrogated him, but did say so later in a prepared statement following his arrest. She described Rabbani as “of good character,” acknowledged he was “trying to protect confidential material on his devices,” and noted that “the importance of passwords and PIN numbers in the 21st century cannot be overstated.” However, she still concluded that his “decision not to provide the information when requested by the examining officers” amounted to “a wilful obstruction of the lawful examination in the circumstances.”

A lawyer was charged and found guilty of a terrorism offence for refusing to decrypt a device containing sensitive client information. A baseline part of the criminal justice system is that what is said between a client and their lawyer is protected, but this protection is under threat in the UK: solicitors who do their duty and uphold their oaths to their clients risk serious convictions that may permanently reshape their lives and liberties. This dismantling of baseline aspects of our legal systems to fight ‘terrorism’ is ludicrous and does more harm to our societies than can be inflicted upon us by violent extremists and criminals.

Link

How Canada’s Anti-Cyberbullying Law Is Being Used to Spy on Journalists

From Motherboard:

According to Citizen Lab researcher Christopher Parsons, these same powers that target journalists can be used against non-journalists under C-13. And the only reason we know about the aforementioned cases is that the press has a platform to speak out.

“This is an area where transparency and accountability are essential,” Parsons said in an interview. “We’ve given piles and piles of new powers to law enforcement and security agencies alike. What’s happened to this journalist shows we desperately need to know how the government uses its powers to ensure they’re not abused in any way.”

“I expect that the use of these particular powers will become more common as the police get more used to using it and more savvy in using them,” Parsons said.

These were powers that were ultimately sold to the public (and passed into law) as necessary to combat ‘child pornography’. And now they’re being used to snoop on journalists to figure out who their sources are, without any mandate to report on how regularly the powers are used or on the efficacy of such uses. For some reason, this process doesn’t inspire a lot of confidence in me.

Link

Canada’s National Security Consultation: Digital Anonymity & Subscriber Identification Revisited… Yet Again – Technology, Thoughts & Trinkets

Over at Technology, Thoughts, and Trinkets I’ve written that:

Last month, Public Safety Canada followed through on commitments to review and consult on Canada’s national security framework. The process reviews powers that were passed into law following the passage of Bill C-51, Canada’s recent controversial anti-terrorism overhaul, as well as invite a broader debate about Canada’s security apparatus. While many consultation processes have explored expansions of Canada’s national security framework, the current consultation constitutes the first modern day attempt to explore Canada’s national security excesses and deficiencies. Unfortunately, the framing of the consultation demonstrates minimal direct regard for privacy and civil liberties because it is primarily preoccupied with defending the existing security framework while introducing a range of additional intrusive powers. Such powers include some that have been soundly rejected by the Canadian public as drawing the wrong balance between digital privacy and law enforcement objectives, and heavily criticized by legal experts as well as by all of Canada’s federal and provincial privacy commissioners.

The government has framed the discussion in two constituent documents, a National Security Green Paper and an accompanying Background Document. The government’s framings of the issues are highly deficient. Specifically, the consultation documents make little attempt to explain the privacy and civil liberties implications that can result from the contemplated powers. And while the government is open to suggestions on privacy and civil liberties-enhancing measures, few such proposals are explored in the document itself. Moreover, key commitments, such as the need to impose judicial control over Canada’s foreign intelligence agency (CSE) and regulate the agency’s expansive metadata surveillance activities, are neither presented nor discussed (although the government has mentioned independently that it still hopes to introduce such reforms). The consultation documents also fail to provide detailed suggestions for improving government accountability and transparency surrounding state agencies’ use of already-existent surveillance and investigative tools.

In light of these deficiencies, we will be discussing a number of the consultation document’s problematic elements in a series of posts, beginning with the government’s reincarnation of a highly controversial telecommunication subscriber identification power.

With a good friend and colleague, Tamir Israel, I wrote the first of what will be many analyses of the Canadian government’s national security consultation.

The subscriber identification powers we write about are not really intended for national security; instead, they will be adopted more broadly by law enforcement so agencies can access the data indiscriminately. Equivalent powers have been rejected in past legislative efforts: it remains to be seen whether this proposal will (once more) be successfully rejected, or whether this parliament will actually establish some process or law that lets government agencies access subscriber identification information absent a warrant.

Link

France’s Emergency Powers: The New Normal

Just Security:

The new, six-month extension of emergency powers creates France’s longest state of emergency since the Algerian War in the 1950s. The new law restores or extends previous emergency provisions, such as empowering police to carry out raids and local authorities to place suspects under house arrest without prior judicial approval. It also expands those powers, for example allowing the police to search luggage and vehicles without judicial warrants. In addition it reinstates warrantless seizures of computer and cellphone data that France’s highest legal authority had struck down as unconstitutional, adding a few restrictions that still fall short of judicial oversight.

In separate reports in February, Human Rights Watch and Amnesty International documented more than three dozen cases in which the use of these emergency powers violated universal rights to liberty, privacy, or freedoms of movement, association and expression. The two groups also found that the emergency acts lost suspects jobs, traumatized children, and damaged homes. The vast majority of those targeted were Muslims. Those interviewed said the actions left them feeling stigmatized and eroded their trust in the French authorities. The latest version of the emergency law risks compounding these effects.

Decisions to advance unconstitutional and discriminatory ‘security’ laws and policies following serious crimes threaten to undermine democracies while potentially strengthening states. Worryingly, there are fewer and fewer loud voices arguing for the rough-and-tumble consequences of maintaining a democratic form of governance, as opposed to those who assert that a powerful state apparatus is needed if normalcy is to exist. The result may be a sleepwalk from governments for and by the people toward governments that protect citizen-serfs and harshly discriminate against difference.

Link

For some safety experts, Uber’s self-driving taxi test isn’t something to hail

Washington Post:

Even so, the effort is raising concern from safety experts who say the technology has major limitations that can be very dangerous. Self-driving cars have trouble seeing in bad weather. Sudden downpours, snow and especially puddles make it difficult for autonomous vehicles to detect lines on pavement and thereby stay in one lane.

Walker Smith added that self-driving cars have sometimes confused bridges for other obstacles. “People need to understand both the potential and the limitations of these systems, and inviting them inside is part of that education,” he said.

The vehicles also have difficulty understanding human gestures — for example, a crosswalk guard in front of a local elementary school may not be understood, said Mary Cummings, director of Duke University’s Humans and Autonomy Lab, at a Senate hearing in March. She recommended that the vehicles not be allowed to operate near schools.

Then there’s the human factor: Researchers have shown that people like to test and prank robots. Today, a GPS jammer, which some people keep in their trunks to block police from tracking them, will easily throw off a self-driving car’s ability to sense where it is, Cummings said.

Current self-driving cars often cannot see which lane they’re in if it’s raining. They don’t understand what a bridge is versus other road terrain. They don’t understand what a crosswalk guard is. And they rely on a notoriously brittle location technology.

What can go wrong with testing them in urban centres then, exactly?