As part of a BBC investigative report on Facebook's ongoing efforts to combat sexual images of children on its platform, the broadcaster reported 100 photos to Facebook that appeared to clearly violate the social network's "community standards": images of children under 16 in sexual poses alongside obscene comments, pages of Facebook groups used by pedophiles to share stolen images of children, and so forth. The BBC also found five convicted pedophiles with Facebook profiles, in violation of Facebook's rules barring convicted sex offenders from having accounts. After the BBC used the "report" button to alert the company, Facebook removed just 18 of the images, the broadcaster says. Automated replies from the social network said that the other 82 images didn't breach community standards.

That was cause for alarm at the National Society for the Prevention of Cruelty to Children. "Facebook's failure to remove illegal content from its website is appalling and violates the agreements they have in place to protect children," a spokesperson told the BBC. "It also raises the question of what content they consider to be inappropriate and dangerous to children."

But what happened next might be even more troubling. When BBC reporters asked Facebook for an interview about its moderation systems, the social network's director of policy, Simon Milner, agreed to an interview — but first requested that the BBC provide examples of the material it had reported that wasn't removed by the moderators. The BBC complied... and Facebook reported the BBC to the UK's National Crime Agency.

"We have carefully reviewed the content referred to us and have now removed all items that were illegal or against our standards," Facebook wrote in a statement to the BBC. "This content is no longer on our platform. We take this matter extremely seriously and we continue to improve our reporting and take-down measures... It is against the law for anyone to distribute images of child exploitation. When the BBC sent us such images we followed our industry's standard practice and reported them to Ceop [Child Exploitation & Online Protection Centre."

"The fact that Facebook sent images that had been sent to them, that appear on their site, for their response about how Facebook deals with inappropriate images... [the] fact that they sent those on to the police seemed to me to be extraordinary," says David Jordan, the BBC's director of editorial policy. "One can only assume that the Facebook executives were unwilling or certainly reluctant to engage in an interview or a debate about why these images are available on the Facebook site."

Ironically, Facebook has been criticized for over-censoring images in the past: Last September, the company removed a famous Vietnam War photo of a naked girl fleeing a napalm attack. The photograph, titled "The Terror of War" and captured by Pulitzer Prize-winning photographer Nick Ut, had been included on Facebook in a list of "seven photographs that changed the history of warfare" compiled by Norway's largest newspaper, Aftenposten.

"Listen, Mark. This is serious," the papers editor-in-chief wrote in a front page open letter to Facebook CEO Mark Zuckerberg. "First you create rules that don’t distinguish between child pornography and famous war photographs. Then you practice these rules without allowing space for good judgment."

Later, in early January, Facebook censored a photo of a 450-year-old Italian statue of the god Neptune, naked and holding a trident.
