Categories
Aside Links

This Mathematician Says Big Data Punishes Poor People

O’Neil sees plenty of parallels between the usage of Big Data today and the predatory lending practices of the subprime crisis. In both cases, the effects are hard to track, even for insiders. Like the dark financial arts employed in the run up to the 2008 financial crisis, the Big Data algorithms that sort us into piles of “worthy” and “unworthy” are mostly opaque and unregulated, not to mention generated (and used) by large multinational firms with huge lobbying power to keep it that way. “The discriminatory and even predatory way in which algorithms are being used in everything from our school system to the criminal justice system is really a silent financial crisis,” says O’Neil.

The effects are just as pernicious. Drawing on her deep technical understanding of modeling, she shows how the algorithms used to, say, rank teacher performance rest on exactly the sort of shallow and volatile data sets that informed the faulty mortgage models in the run-up to 2008. Her work makes particularly disturbing points about how being on the wrong side of an algorithmic decision can snowball in incredibly destructive ways. A young black man, for example, who lives in an area that crime-fighting algorithms flag for extra policing because of its higher violent-crime rate will necessarily be more likely to be stopped for any petty violation. Each stop adds to a digital profile that can subsequently limit his credit, his job prospects, and so on. Meanwhile, neighborhoods where white-collar crime is more common are not targeted in this way.
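The snowball dynamic O'Neil describes can be made concrete with a toy simulation. This is an illustrative sketch, not any real predictive-policing system: two districts have the same underlying rate of petty offences, but recorded incidents are proportional to patrol presence, and each round the district with the most *recorded* incidents receives the extra patrol.

```python
import random

def simulate_feedback_loop(true_rates, patrols, rounds=20, seed=0):
    """Toy model of a predictive-policing feedback loop.

    Recorded incidents scale with patrol presence (more officers
    observe more petty violations), and patrol allocation follows
    recorded incidents, so an initial asymmetry amplifies itself.
    """
    rng = random.Random(seed)
    recorded = [0] * len(true_rates)
    for _ in range(rounds):
        for d in range(len(true_rates)):
            # Each patrol makes 10 observation opportunities per round;
            # the underlying offence rate is the same in both districts.
            for _ in range(patrols[d] * 10):
                if rng.random() < true_rates[d]:
                    recorded[d] += 1
        # Reallocate: the "hotter" district (by recorded incidents,
        # not by true rate) gets an additional patrol.
        hot = max(range(len(recorded)), key=lambda d: recorded[d])
        patrols[hot] += 1
    return recorded, patrols

# Identical underlying rates, but district 0 starts with one extra patrol.
recorded, patrols = simulate_feedback_loop([0.1, 0.1], [2, 1])
print("recorded incidents:", recorded)
print("final patrols:", patrols)
```

Because the loop rewards recorded incidents rather than true offence rates, nearly all of the added patrols end up concentrated in the district that started with one more officer, and its recorded-crime count grows accordingly.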

In higher education, the use of algorithmic models that rank colleges has led to an educational arms race in which schools offer more and more merit-based rather than need-based aid to students who will make their numbers (and thus their rankings) look better. At the same time, for-profit universities can troll for data on economically or socially vulnerable would-be students and find their “pain points,” as a recruiting manual for one for-profit university, Vatterott, describes it, in any number of online questionnaires or surveys they may have unwittingly filled out. The schools can then use this information to funnel ads to welfare mothers, the recently divorced and out of work, those who’ve been incarcerated, or even those who’ve suffered an injury or a death in the family.

The use of Big Data to inform all aspects of our lives, with and without our knowledge, matters not just because it dictates the life chances presented to or denied us. It also matters because the artificial intelligence systems now being developed and deployed learn from the data that is collected. And those AI systems can themselves be biased and closed to third-party audit.

Corporations increasingly substitute for core state institutions. As they collect and analyze data in bulk, and conceal their methods while presenting data on behalf of states (or in place of former state institutions), the public is left vulnerable not just to corporate malice but to corporate indifference. Worse, this is a kind of indifference that is difficult to challenge in the absence of laws compelling corporate transparency.

Categories
Humour

You are Free*

From mainstreamrevolution 


Categories
Links

An End to Privacy Theater: Exposing and Discouraging Corporate Disclosure of User Data to the Government

You should go read Chris’ paper, available at SSRN. Abstract below:

Today, when consumers evaluate potential telecommunications, Internet service, or application providers, they are likely to consider several differentiating factors: the cost of service, the features offered, and the providers’ reputation for network quality and customer service. The firms’ divergent approaches to privacy, and in particular their policies regarding law enforcement and intelligence agencies’ access to their customers’ private data, are not considered by consumers during the purchasing process, perhaps because it is practically impossible for anyone to discover this information.

A naïve reader might simply assume that the law gives companies very little wiggle room: when they are required to provide data, they must do so. This is true. However, companies have a huge amount of flexibility in the way they design their networks, in the amount of data they retain by default, in the exigent circumstances under which they share data without a court order, and in the degree to which they fight unreasonable requests. As such, there are substantial differences in the privacy practices of the major players in the telecommunications and Internet applications market. Some firms retain identifying data for years, while others retain no data at all; some voluntarily provide government agencies access to user data (one carrier even argued in court that its First Amendment free speech rights guarantee it the right to do so), while other companies refuse to disclose data without a court order; some companies charge government agencies when they request user data, while others disclose it for free. As such, a consumer’s decision to use a particular carrier or provider can significantly impact their privacy, and in some cases, their freedom.

Many companies profess their commitment to protecting their customers’ privacy, with some even arguing that they compete on their respective privacy practices. However, none seem to be willing to disclose, let alone compete on the extent to which they assist or resist government agencies’ surveillance activities. Because information about each firm’s practices is not publicly known, consumers cannot vote with their dollars, and pick service providers that best protect their privacy.

In this article, I focus on this lack of information and on the policy changes necessary to create market pressure for companies to put their customers’ privacy first. I outline the numerous ways in which companies currently assist the government, often going out of their way to provide easy access to their customers’ private communications and documents. I also highlight several ways in which some companies have opted to protect user privacy, and the specific product design decisions that firms can make that either protect their customers’ private data by default, or make it trivial for the government to engage in large scale surveillance. Finally, I make specific policy recommendations that, if implemented, will lead to the public disclosure of these privacy differences between companies, and hopefully, create further market incentives for firms to embrace privacy by design.