Categories
Writing

Google’s ‘Friendly Tracking’: Fitfully Creepy?

Kashmir Hill wrote an article last week about how Google Now is informing some Nexus owners of how active they have been over the past week. She rightly notes that this really just makes transparent the tracking that smartphones do all the time, while putting it to (arguably) good and helpful use. That said, Google’s actions raise a series of interesting issues and questions.

To begin, Google’s actions are putting a ‘friendly face’ on locational tracking. Their presentation of this data also reveals some of the ways that Google can – and apparently is – using locational data: for calculating not just distance but, based on the rate of movement between locations, the means by which users are getting from point A to B. This isn’t surprising, given that Google has had to develop algorithms to determine whether subscribers’ phones are moving in cars (in fast or slow traffic) for some of its traffic alert systems. Determining whether you’re walking or biking instead of driving is presumably just a happy outcome of that algorithmic determination. That said: is this mode of analyzing movement and location necessarily something that users want Google to be processing? Can they genuinely be expected to have consented to this surveillance – beyond what is buried in jargon-ridden Terms of Service and Privacy Policies – and, moreover, can Now users get both the raw data and the categories into which their locational data has been ‘sorted’ by Google? Can they have both sets of data fully, and permanently, expunged from Google’s databases?
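The basic inference described above – distance over time yielding a guess at transport mode – can be sketched in a few lines. To be clear, Google has not published its actual classifier; the speed thresholds below are hypothetical illustrations, and a real system would smooth over many GPS fixes rather than compare just two.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    # Great-circle distance between two GPS fixes, in kilometres.
    earth_radius_km = 6371.0
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    d_phi = math.radians(lat2 - lat1)
    d_lam = math.radians(lon2 - lon1)
    a = (math.sin(d_phi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(d_lam / 2) ** 2)
    return 2 * earth_radius_km * math.asin(math.sqrt(a))

def classify_movement(lat1, lon1, lat2, lon2, seconds):
    # Toy speed thresholds (km/h) for illustration only; Google's real
    # algorithm is unpublished and certainly more sophisticated.
    speed_kmh = haversine_km(lat1, lon1, lat2, lon2) / (seconds / 3600.0)
    if speed_kmh < 7:
        return "walking"
    if speed_kmh < 25:
        return "cycling"
    return "driving"
```

The point is how little data this requires: two timestamped coordinates are enough to start categorizing a person’s behaviour, which is exactly why the consent questions above matter.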

Friendliness – or not, if you see this mode of tracking and notification as problematic – aside, I think that Google’s alerts speak to the important role that ambient technology can play in encouraging public fitness. In the interests of disclosure, I’ve used a non-GPS-based system to track the relative levels of my activity for the past six or seven months. It’s been the single best $100 that I’ve spent in the past five years and has led to very important, and positive, changes in my personal health. I specifically chose a non-GPS system because I worry about the implications of linking health/fitness information with where individuals physically move: I see such data as a potential gold mine for health insurers and employers. This is where my primary concerns lie: how can individuals be assured that GPS-related fitness information won’t be made available to health insurers who are setting Android users’ health premiums? How can they prevent the information from leaking to employers, or anyone else that might have an interest in this data?

Past this issue of data flow control, I actually think that making basic fitness information very, very clear to people is a good idea. A comfortable one? No, not necessarily. No one really wants to see how little they may have been active. But I’m not certain that this mode of fitness analysis is necessarily creepy; it can definitely be unpleasant, however.

Of course individuals need to be able to opt out of this kind of tracking if they’d like. Really, it should be opt-in (from a privacy perspective), though from a public health perspective I can’t help but wonder if it shouldn’t be opt-out. This is an area where there are competing public goods, and unlike a debate around security and privacy (which tends to feature pretty drawn-out, well-entrenched battle lines) I’m not sure we’ve had a good discussion about the nature of locational tracking as it relates to basic facets of public fitness and, by extension, public health.

In the end, this is actually a tracking technology that I’m largely on the fence about, and my core reasons for having problems with it are: (a) I don’t think people had any real idea that they had opted in to the fitness analysis; (b) I don’t trust third parties not to get access to this data for purposes at odds with the data subject’s own interests. If both (a) and (b) could be resolved, however, I think I’d have a much harder time disagreeing with such ‘fitness alerts’ being integrated with smartphones, given the significant problems of obesity amongst Western citizens.

What are your thoughts on this topic?

Categories
Links Writing

App Developers Face Fines for Lacking Privacy Policies

To be clear and up front: privacy policies suck. I’m currently analyzing the policies of major social networks, and if the policies were merely horrific then they’d be massively better than they actually are today.

That said, a privacy policy at least indicates that an organization took the time to copy someone else’s policy. For the briefest of moments there was some (however marginal) contemplation of how the organization’s actions related to privacy. While most companies will just hire a lawyer to slap legalese on their websites, a few will actually think about their data collection and its implications for individuals’ privacy. That’s really all you can generally hope for privacy policies to accomplish, unless the company out-and-out lies in its policy. If it does lie, then you can get the FTC involved.

The potential for ‘enjoying’ a $2,500 fine per download if a company lacks a privacy policy is a massive stick and, hopefully, will get developers to at least consider how their collection of data implicates users’ privacy. The California approach is not the solution to the problem of people’s data being collected without their genuine consent but at least it’s a start.

Categories
Writing

I need to create responses to the above security questions before I can purchase items through Apple’s digital stores. The problem: I actually don’t know the (legitimate/real) answers to any of the questions.

Admittedly, the best security procedure is to produce garbage/unrelated responses to whatever authentication questions vendors ask. This said, it’s a bit insane that I have to do this for the questions Apple has provided. Now, is this a problem that most people can overcome? Of course. They just write in answers and (somewhere) they write down their responses. I actually could use 1Password for this, a terrific password and identity manager that I highly recommend. This said, I’m not going to bother. Purchasing the $20 piece of software just isn’t worth the effort for me: in effect, Apple has succeeded in dissuading me from making an impulse purchase. That’s really not great for the business of app developers (Apple, really, doesn’t care that much, given the relative amount that the app store contributes to their overall yearly profits).

You might wonder why these questions are being asked. I suspect they’re largely in response to the Mat Honan hack. In short, a Wired reporter’s Apple, Amazon, Twitter, and Google accounts were hacked so a third-party could masquerade as Mat on Twitter. This led to a ridiculous level of criticism in the press concerning how Apple authenticated users’ identities. I have no doubt that these questions – again, pictured above – are largely meant to better authenticate users and thus avoid identity fraud.

The problem of authentication fraud can be devilishly hard for companies to address. In the case of Apple, there is no option for the user to generate their own questions and responses. This might be seen as good security amongst ‘professionals’ – it prevents really, really crappy questions and easily found responses – but it creates an incredibly poor user experience. While writing down passwords isn’t the horrific nightmare scenario that some security analysts declare, expecting people to find those responses when they’re in trouble – such as when their accounts have been hacked – will meet with mixed results at best. Further, given how other companies tend to follow Apple’s lead(s), it’s only a matter of time until more and more (less security conscious) companies adopt similar or identical security questions/answers. Such adoptions will limit the relative novelty of Apple’s authentication questions and thus reduce their capability to genuinely authenticate users’ identities. Consequently, such questions (in the short and long terms) will likely just leave Apple’s customers frustrated.

Ultimately, this kind of authentication really is less than ideal; more nuanced and (to the user) transparent analytics protocols to detect aberrant behaviours, and then recover accounts, would be far, far superior to what Apple is presently rolling out. Hopefully it doesn’t take further authentication failures on Apple’s part for the company to realize the error of its ways and correct it.

Categories
Links

Dispelling Some Mistruths Surrounding Lawful Access

David Fraser has a terrific breakdown of the Canadian Association of Chiefs of Police’s recent argument for lawful access legislation. If you’re Canadian you should definitely check out what he has to say.

Categories
Links Writing

Question to SCOTUS: Can we even bring legal action over warrantless spying?

The EFF continues its long slog to challenge the US government’s warrantless wiretapping. At this point a series of cases have been dismissed, though the Supreme Court is now hearing a case to ascertain whether those who have been affected by the dragnet surveillance – lawyers, journalists, and human rights advocates – can challenge the statute, given that it “prevents them from doing their job without taking substantial measures when communicating to overseas witnesses, sources and clients.”

This is an incredibly serious case. The outcome will not decide the legality of the statute itself but just whether it can be challenged. By anyone. A dismissal of the case – that is, a decision declaring that no one clearly has standing to challenge the statute – would prevent the existing intelligence operations from ever being challenged so long as the government avoids bringing warrantlessly-accessed data into a trial as evidence.

Watch this case; if it goes sideways, then the American government will have (effectively) been given license by the highest court in the land to surveil Americans without warrant, and without an effective means for Americans to prevent the surveillance.

Categories
Quotations

2012.10.30

It’s very complicated. It’s very cumbersome. There’s a lot of numbers involved with it.

Gov. Nikki Haley’s reason for why social security numbers stolen by a hacker weren’t encrypted
Categories
Links

While at first blush Lincoln Alexander has little to do with technology, the words that we exchanged when I received my first degree from Guelph continue to shape my engagement with technology. He also, in just a few sentences, gave me some of the best professional advice I’ve ever received in my life. Though our exchange at convocation wasn’t anywhere close to my first time speaking with Lincoln, nor would it be the last, it was the deepest and most significant. Alastair’s ‘goodbye’ captures my thoughts about Lincoln in as sincere a way as I’ve ever seen; I highly recommend watching Alastair’s address.

Categories
Videos

The many UI nightmares associated with Windows 8

Categories
Links

iMessage and ‘Secure’ Communications

Matthew Green has a good piece that discusses some of the security concerns around iMessage. Specifically he speaks to how, despite Apple’s assurances that it employs “secure end-to-end encryption,” the company still hasn’t properly explained how its encryption processes are established or deployed. Green does a good job explaining these concerns for a very non-technical audience. Highly recommended, especially if you happen to be using iMessage.

Categories
Links

When It Comes to Human Rights, There Are No Online Security Shortcuts

Patrick Ball has a good and highly accessible article over on Wired about why certain means of securing communications are problematic. It’s highly recommended. Rather than leave you with the overview of “this is what is said and why it’s important,” let me leave you with a key quotation from the article that (to my mind) nicely speaks to the author’s general mindset: “Good security is about not trusting people. It’s about studying math and software and assuring that the program cannot be turned to bad intent.”