Privacy Policies Don’t Need to Be Obtuse

Peter Fleischer has a good summary piece on the (miserable) state of online privacy policies today. As he writes:

Today, privacy policies are being written to try to do two contradictory things.  Like most things in life, if you try to do two contradictory things at the same time, you end up doing neither well.  Here’s the contradiction:  should a privacy policy be a short, simple, readable notice that the average end-user could understand? Or should it be a long, detailed, legalistic disclosure document written for regulators?  Since average users and expert regulators have different expectations about what should be disclosed, the privacy policies in use today largely disappoint both groups.


The time has come for a global reflection on what, exactly, a privacy policy should look like.  Today, there is no consensus.  I don’t just mean consensus amongst regulators and lawyers.  My suggestion would be to start by doing some serious user-research, and actually ask Johnny and Jean and Johann.

I entirely, fully, wholeheartedly agree: most policies today are absolute garbage. I actually read a lot of them – and research on social media policies will be online and available soon! – and they are more often an elaborate act of obfuscation than something that explains, specifically and precisely, what a service does with the data it collects.

The thing is, these policies don’t need to be as bad as they are. It really is possible to bridge ‘accessible’ and ‘legalese’, but doing so takes time, care, and effort.

And fewer lawyers.

As a good example of how this can be done, check out how Tunnelbear has written its privacy policy: it’s reasonably accessible and lacks a lot of the ‘weasel phrases’ you’ll find in most privacy policies. Even better, read the company’s Terms of Service document; I cannot express how much ‘win’ is captured in their simultaneously legal and layperson disclosure of how and why their service functions as it does.


The 27 regulators, led by France’s CNIL, gave Google three to four months to make changes to its privacy policy — or face “more contentious” action. In a statement on its website today, the CNIL said that four months on from that report Google has failed “to come into compliance” so will now face additional action.

“On 18 February, the European authorities find that Google does not give a precise answer and operational recommendations. Under these circumstances, they are determined to act and pursue their investigations,” the CNIL said in its statement (translated from French with Google Translate).

According to the statement, the European regulators intend to set up a working group, led by CNIL, to “coordinate their enforcement action” against Google — with the working group due to be established before the summer. An action plan for tackling the issue was drawn up at a meeting of the regulators late last month, and will be “submitted for validation” later this month, they added.


The [intelligence] professionals’ task is therefore to keep judgements anchored to what the intelligence actually reveals (or does not reveal) and keep in check any predisposition of policy-makers to pontificate … of trying to make nasty facts go away by the magical process of emitting loud noises in the opposite direction.

* Sir David Omand, “Reflections on Secret Intelligence”

Policy Matters Too

Nadim Kobeissi recently wrote about Do Not Track, and effectively restated the engineering-based reasons why the proposed standard will fail. The standard, generally, would let users set their web browser to ask websites not to deposit tracking cookies on their computers. Specifically, Nadim wrote:

Do Not Track is not only ineffective: it’s dangerous, both to the users it lulls into a false belief of privacy, and towards the implementation of proper privacy engineering practice. Privacy isn’t achieved by asking those who have the power to violate your privacy to politely not do so — and thus sacrifice advertising revenue — it’s achieved by implementing client-side preventative measures. For browsers, these are available in examples such as EFF’s HTTPS Everywhere, Abine’s DoNotTrackMe, AdBlock, and so on. Those are proper measures from an engineering perspective, since they attempt to guard your privacy whether the website you’re visiting likes it or not.

He is writing as an engineer and, from that perspective, he’s not wrong. Unfortunately, as an engineer he’s entirely missing the broader implication of DNT: specifically, it lets users proactively inform a site that they do not consent to being tracked. This proactive declaration can suddenly activate a whole host of privacy protections that are established under law; a site doesn’t have to respect the declaration for ignoring it to be legally actionable.
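
In concrete terms, the mechanism is just an HTTP request header the browser attaches to every request. Here’s a minimal sketch in Python (all names here are hypothetical, for illustration only) of a site honouring the signal. Note that nothing technically *forces* the check – which is exactly Nadim’s engineering objection – but making the refusal machine-readable is what gives advocates and lawyers something concrete to point at:

```python
def tracking_allowed(headers):
    """Return False when the client has sent a Do Not Track signal.

    `headers` is assumed to be a dict of HTTP request headers. Per the
    draft standard, "DNT: 1" means the user refuses tracking, "DNT: 0"
    means they consent, and an absent header expresses no preference.
    """
    return headers.get("DNT") != "1"


def handle_request(headers, response_cookies):
    # Only deposit the (hypothetical) analytics cookie when the user has
    # not opted out. A site that skips this check tracks anyway -- DNT
    # itself does nothing to prevent that.
    if tracking_allowed(headers):
        response_cookies["analytics_id"] = "abc123"
    return response_cookies


print(handle_request({"DNT": "1"}, {}))  # -> {} (no tracking cookie set)
```

The point of the sketch is that compliance is a one-line decision on the server side; the interesting question, as argued below, is whether law and regulation can make that decision economically rational.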

Now, will most users have any clue if their positions are being upheld? No, of course not. This is generally true of any number of laws. However, advocates, activists, academic researchers, and lawyers smelling class-action lawsuits will monitor to see if websites are intentionally dismissing users’ choice to refuse being tracked. As successful regulatory/legal challenges are mounted website owners will have to engage in a rational calculus: is the intelligence or monies gained from tracking worth the potential regulatory or legal risk? If initial punishments are high enough then major players may decide that it is economically rational to abide by DNT headers, whereas smaller sites (perhaps with less to lose/less knowledge of DNT) may continue to track regardless of what a browser declares to the web server. If we’re lucky, these large players will include analytics engine providers as well as advertiser networks.

Now, does this mean that DNT will necessarily succeed? No, not at all. The process is absolutely mired in confusion and problems – advertisers are trying to water down what DNT ‘means’, and some browser manufacturers are making things harder by trying to be ‘pro-privacy’ and turning DNT on by default in their browsers. Moreover, past efforts to technically signal users’ privacy preferences have failed (e.g. P3P), and chances are good that DNT will fail as well. However, simply because there are technical weaknesses associated with the standard does not mean that the protocol, more broadly, will fail: what is coded into standards can facilitate subsequent legal and regulatory defences of users’ privacy, and those defences may significantly improve users’ privacy online.