Dr. Pentland, an academic adviser to the World Economic Forum’s initiatives on Big Data and personal data, agrees that limitations on data collection still make sense, as long as they are flexible and not a “sledgehammer that risks damaging the public good.”
He is leading a group at the M.I.T. Media Lab that is at the forefront of a number of personal data and privacy programs and real-world experiments. He espouses what he calls “a new deal on data” with three basic tenets: you have the right to possess your data, to control how it is used, and to destroy or distribute it as you see fit.
Personal data, Dr. Pentland says, is like modern money — digital packets that move around the planet, traveling rapidly but needing to be controlled. “You give it to a bank, but there’s only so many things the bank can do with it,” he says.
His M.I.T. group is developing tools for controlling, storing and auditing flows of personal data. Its data store is an open-source version, called openPDS. In theory, this kind of technology would undermine the role of data brokers and, perhaps, mitigate privacy risks. In the search for a deep fat fryer, for example, an audit trail should detect unauthorized use.
- Steve Lohr, “Big Data Is Opening Doors, but Maybe Too Many”
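Pentland's audit-trail idea is easy to sketch in miniature: every access to the personal data store is logged with who read what and for what declared purpose, and an audit pass flags any access the user never consented to. A minimal, hypothetical model follows; none of these names or structures come from openPDS itself.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class AccessRecord:
    requester: str   # who read the data
    field: str       # which attribute was read
    purpose: str     # declared reason for the access

# Consents the user has actually granted:
# (requester, field) -> set of allowed purposes
consents = {
    ("acme-ads", "purchase_history"): {"order-fulfilment"},
}

def audit(log):
    """Return every logged access not covered by an explicit consent."""
    return [
        rec for rec in log
        if rec.purpose not in consents.get((rec.requester, rec.field), set())
    ]

log = [
    AccessRecord("acme-ads", "purchase_history", "order-fulfilment"),  # consented
    AccessRecord("acme-ads", "purchase_history", "ad-targeting"),      # not consented
]

flagged = audit(log)  # the ad-targeting read is flagged
```

In this toy version, the "unauthorized use" in the deep-fat-fryer example is just a log entry whose declared purpose falls outside the consent set; the hard part in practice is getting every data consumer to write honest entries into that log in the first place.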
So, I don’t really get how Pentland’s system is going to work any better than the Platform for Privacy Preferences (P3P) work that was done a decade ago. Spoiler alert: P3P failed. Hard. It was intended both to enhance users’ privacy online (by letting them establish controls on how their personal information was accessed and used) and to give industry something to point to in order to avoid federal regulation.
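For context, the P3P model reduced to its essentials: a site declares its data practices in machine-readable form, and the user agent compares them against the user's stated preferences. A toy sketch of that comparison (the vocabulary here is invented for illustration, not actual P3P or APPEL syntax):

```python
# Hypothetical P3P-style matching; field names are illustrative only.

site_policy = {
    "data_collected": {"clickstream", "email"},
    "retention": "indefinite",
    "shared_with_third_parties": True,
}

user_prefs = {
    "data_collected": {"clickstream"},   # nothing beyond clickstream
    "retention": "stated-purpose",       # no indefinite retention
    "shared_with_third_parties": False,
}

def violations(policy, prefs):
    """List the ways a site's declared policy exceeds the user's preferences."""
    problems = []
    if not policy["data_collected"] <= prefs["data_collected"]:
        problems.append("collects more data than allowed")
    if prefs["retention"] != "indefinite" and policy["retention"] == "indefinite":
        problems.append("retains data indefinitely")
    if policy["shared_with_third_parties"] and not prefs["shared_with_third_parties"]:
        problems.append("shares data with third parties")
    return problems
```

The matching itself was never the hard part. Nothing in this scheme verifies that the site actually behaves the way its declared policy claims, which is exactly the enforcement gap that sank P3P and that Pentland's proposal would also have to close.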
There is a prevalent strain of liberalism that assumes that individuals, when empowered, are best suited to control the dissemination of their personal information. That assumption, however, treats knowledge, time, and resources as equal amongst all parties. This clearly isn’t the case, nor can individuals reliably tell when advertisers and data miners don’t respect their privacy settings. In effect: control does not necessarily equal knowledge, nor does it necessarily equal the capacity to act, given individuals’ often limited fiscal, educational, temporal, or other resources.