Facebook Censorship

I’ve tried to think of something comprehensive to say about the Facebook censorship rules for a few days now. I still don’t have something that really captures how absurd and offensive many of the items listed are. So, rather than give a holistic analysis of the document, here are a few thoughts:

Sex and Nudity

  • Point (1) indicates that permitting foreplay images between members of the same gender is somehow exceptional, given the statement “Foreplay allowed (Kissing, groping, etc.) even for same sex (man-man/woman-woman).” That this needs to be clearly stated suggests a basic level of discomfort with same-sex relationships.
  • Point (12) seems extremely hard to police, with enforcement contingent on an individual employee’s awareness of sexual fetishes. Moreover, given that a fetish is often defined by the use of inanimate objects as a stimulus for sexual enjoyment/arousal, a high level of subjectivity will almost necessarily enter into monitoring for the depiction of sexual fetishes “in any form.”

Hate Content

  • The note that “Humor overrules hate speech UNLESS slur words are present or the humor is not evident” is concerning because it implies that, in some circumstances, Facebook recognizes hate speech as somehow appropriate. I would suggest that the capacity for one person to detect humour is a particularly poor (and, arguably, inappropriate) evaluation metric.

Graphic Content

  • Point (1) seems immediately hard to govern, especially given that many Facebook members will support state-sanctioned violence towards targeted individuals. Example: would graphic comments supporting American efforts to torture Osama bin Laden be inappropriate? Is it OK to call for violence towards ‘bad’ people and not towards ‘good’ ones?
  • Point (6) prohibits the exhibition of what might be termed ‘grisly’ images that clearly show the penetration of skin. Blood or other aspects of a violent act are permitted, but the barrier of the skin is seen as special. This is suggestive of the ‘kinds’ of violence that Facebook recognizes as more or less appropriate for public viewing while imposing a particular cultural norm on a global network.
  • There is “No exception for news or awareness related content.” Thus, any news that is shared by Facebook members must conform to a specific norm of ‘appropriateness’, and failure to conform results in the removal of the content. Such an attitude speaks poorly of the company’s willingness to act as a site for individuals to communicate fully and openly: Facebook is declaring that its monetization depends, in part, on everyone being happy (or at least not shocked), and it thus prohibits certain modes of expression.

Credible Threats

  • Point (3), that any threat to a head of state should be escalated regardless of credibility, is problematic for three reasons. First: it will capture a vast number of users in a dragnet, and it is unclear just how little would place a user within this net (e.g. would “I fucking hate X and wish we’d just kill X” qualify?). Second: it stinks of an effort to pass responsibility to another party, so that if a particular message is ever linked to an attack then Facebook would be only minimally responsible. Third: the number of potential threats can outpace the capacity of professional security staff to distinguish real threats from false ones. Dragnet surveillance for this kind of behaviour is a poor means of identifying actual threats.

Those are some of my thoughts about this particular document. There are others that are still crystallizing, and if and when I develop a fuller analysis of the document I’ll be sure to post it.