We agree that Cloud Computing, the Internet of Things, and Big Data analytics are all trends that may yield remarkable new correlations, insights, and benefits for society at large. While we have no intention of standing in the way of progress, it is essential that privacy practitioners participate in these efforts to shape trends in a way that is truly constructive, enabling both privacy and Big Data analytics to develop, in tandem.
There is a growing understanding that innovation and competitiveness must be approached from a “design-thinking” perspective — namely, viewing the world to overcome constraints in a way that is holistic, interdisciplinary, integrative, creative and innovative. Privacy must also be approached from the same design-thinking perspective. Privacy and data protection should be incorporated into networked data systems and technologies by default, and become integral to organizational priorities, project objectives, design processes, and planning operations. Ideally, privacy and data protection should be embedded into every standard, protocol, and data practice that touches our lives. This will require skilled privacy engineers, computer scientists, software designers and common methodologies that are now being developed, hopefully to usher in an era of Big Privacy.
We must be careful not to naively trust data users, or unnecessarily expose individuals to new harms, unintended consequences, power imbalances and data paternalism. A “trust me” model will simply not suffice. Trust but verify — embed privacy as the default, thereby growing trust and enabling confirmation of trusted practices.
- Ann Cavoukian, Alexander Dix, and Khaled El Emam, “The Unintended Consequences of Privacy Paternalism”
I’m generally sympathetic to the arguments made in this article, though I have a series of concerns that are (I hope) largely the result of the authors trying to write an inoffensive article that could be acted on by large organizations. To begin, while I understand that Commissioner Cavoukian has built her reputation on working with partners rather than tending to radically oppose corporations’ behaviours, I’m left asking: what constitutes ‘progress’ for her and her German counterpart, Dr. Dix?
Specifically, Commissioners Cavoukian and Dix assert that they have no intention to stand in the way of progress and (generally) that a more privacy-protective approach means we can enjoy progress and privacy at the same time. But how do the Commissioners ‘spot’ progress? How do they know what to oppose and not oppose? When must, and mustn’t, they stand in the way of a corporation’s practices?
The question of defining progress is tightly linked with my other concern about this quoted part of their article. Specifically, the Commissioners acknowledge that a ‘positive-sum’ approach to privacy and progress requires “skilled privacy engineers, computer scientists, software designers and common methodologies that are now being developed, hopefully to usher in an era of Big Privacy.” That these groups are important is true. But where are the non-engineers, non-software designers, and (presumably) non-lawyers? Social scientists, along with arts and humanities scholars and graduates, can also contribute to sensitizing organizations’ understandings of privacy, of user interests, and of the history of certain decisions.
Privacy isn’t something that is only understandable by lawyers or engineers. And, really, it would be better understood and protected if more people were involved in the discussion. Potential contributors to the debates shouldn’t be excluded simply because they contest or demand definitions of ‘progress’, or because they come from a non-lawyerly or non-computer-development background. Rather, they should be welcomed for expanding the debate beyond the contemporary echo chamber of the usually-counted disciplinary actors.