Following a massive fuss about insensitive 'joke' pages on Facebook, Zuckerberg's gang has given in to the pressure and removed groups that make light of rape.
Initially, Facebook refused to take down the pages, saying in a statement that the poor-taste jokes were akin to trying to get laughs from being offensive down at the pub. "Just as telling a rude joke won't get you thrown out of your local pub, it won't get you thrown off Facebook," it said.
However, this irked victim support groups and women's rights advocates, particularly because the pages included one titled "You know she's playing hard to get when you're chasing her down an alleyway". Critics argued that far from being a joke, the pages encouraged an atmosphere of victimisation and made light of a very serious and traumatising crime - with some commentators suggesting they were even "pro-rape".
Now Facebook says that "there is no place on Facebook for content that is hateful, threatening, or incites violence." A spokesperson told the BBC "we take reports of questionable and offensive content very seriously".
The spokesperson continued: "However, we also want Facebook to be a place where people can openly discuss issues and express their views, while respecting the rights and feelings of others."
It's a difficult subject. These playground pages of offensive (and offensively unfunny) Facebook jokes catered to the immature sense of humour most commonly found between years 8 and 10 in the UK's high schools. There are two sides to the story: the pages are undoubtedly offensive, and intended as such, but their removal is another victory for censorship.
Facebook has introduced new administrator tags so page owners can clearly mark their groups as jokes. That means the pages are more likely to stay online, as long as they're tagged as humorous or satire.
The bigger question at hand isn't why Facebook was dilly-dallying over removing the bad jokes of a 14-year-old, 'liked' mostly by other 14-year-olds. Frankly, it's unlikely Facebook was harbouring pro-rape gangs.
The question should be about Facebook's power to censor in the first place.
Facebook-watchers will remember the Boycott BP group last year. The page quickly gained hundreds of thousands of members, who then started disappearing almost at random through account deactivations. Eventually the page disappeared altogether. Later, Facebook claimed it had deleted the 800,000-strong page "in error".
Facebook's removal of pages isn't governed by its terms and conditions, or by any moral code.
Its policy is based on politics, profit and private interest.
Even in the run-up to the UK's last general election, there were reports of Facebook killing anti-Conservative pages because they had "breached" its T&Cs.
Really, it could be argued that the rubbish and offensive rape 'jokes' on Facebook are a problem with a society that teaches women not to get raped rather than teaching men not to rape. Not Facebook's problem.
What Facebook should make transparent is its policy regarding its wide-reaching terms and conditions, and why it seems to selectively exercise its might in shutting down individual pages without explanation.