Adding Context to Facebook’s CSAM Reporting

In early 2021, John Buckley, Malia Andrus, and Chris Williams published an article entitled "Understanding the intentions of Child Sexual Abuse Material (CSAM) sharers" on Meta's research website. They relied on information that Facebook/Meta had submitted to the National Center for Missing & Exploited Children (NCMEC) to better understand why the individuals it reported had likely shared illegal content.

The issue of CSAM on Facebook's networks rose to prominence following a 2019 report in the New York Times. That piece indicated that Facebook was responsible for reporting the vast majority of the 45 million online photos and videos of children being sexually abused. Ever since, Facebook has sought to contextualize the information it discloses to NCMEC and to explain the efforts it has put in place to prevent CSAM from appearing on its services.

So what was the key finding from the research?

"We evaluated 150 accounts that we reported to NCMEC for uploading CSAM in July and August of 2020 and January 2021, and we estimate that more than 75% of these did not exhibit malicious intent (i.e. did not intend to harm a child), but appeared to share for other reasons, such as outrage or poor humor. While this study represents our best understanding, these findings should not be considered a precise measure of the child safety ecosystem."

This finding is significant: it suggests that the vast majority of the content reported by Facebook, while illegal, is not being shared deliberately for malicious purposes. Even if we assume the estimate should be adjusted substantially, say that 50% of the sampled individuals were malicious rather than the roughly 25% implied by the study, we are still left with a significant finding.
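
To make the arithmetic concrete, here is a minimal back-of-envelope sketch in Python (an illustration only, not part of Meta's analysis) of how the 150 sampled accounts split under the reported 75% non-malicious estimate and under the more conservative 50% assumption:

    # Back-of-envelope illustration of the sample figures discussed above.
    # The 150-account sample and the 75% estimate come from Meta's write-up;
    # the 50% figure is the conservative adjustment assumed in the text.
    sample_size = 150

    for non_malicious_rate in (0.75, 0.50):
        non_malicious = round(sample_size * non_malicious_rate)
        malicious = sample_size - non_malicious
        print(f"{non_malicious_rate:.0%} non-malicious: "
              f"~{non_malicious} of {sample_size} accounts without malicious intent, "
              f"~{malicious} with apparent malicious intent.")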

There are, of course, limitations to the research. First, it excludes all end-to-end encrypted messages, so there is some volume of content that cannot be detected using these methods. Second, it remains unclear how the 150 accounts were selected for analysis and, consequently, how scientifically robust that sampling was. Third, and relatedly, there is the question of whether the selected accounts are representative of the broader pool of accounts associated with distributing CSAM.

Nevertheless, this seeming sleeper hit of a study has significant implications insofar as it would shrink the number of problematic accounts and individuals that are deliberately disclosing CSAM to other parties. Clearly more work along these lines is required, ideally across Internet platforms, in order to add further context and detail to the extent of the CSAM problem and, subsequently, to define which policy solutions are necessary and proportionate.
