Meta announced in January that it would end some content moderation efforts, loosen its rules, and put more emphasis on supporting “free expression.” The shifts resulted in fewer posts being removed from Facebook and Instagram, the company disclosed Thursday in its quarterly Community Standards Enforcement Report. Meta said that its new policies had helped reduce erroneous content removals in the US by half without broadly exposing users to more offensive content than before the changes.
The new report, which was referenced in an update to a January blog post by Meta global affairs chief Joel Kaplan, shows that Meta removed nearly one-third less content globally from Facebook and Instagram for violating its rules from January to March of this year than it did in the previous quarter: about 1.6 billion items, down from just under 2.4 billion, according to an analysis by WIRED. In the preceding quarters, the tech giant’s total quarterly removals had risen or stayed flat.
Across Instagram and Facebook, Meta reported removing about 50 percent fewer posts for violating its spam rules, nearly 36 percent fewer for child endangerment, and almost 29 percent fewer for hateful conduct. Removals increased in only one major rules category out of the 11 that Meta lists: suicide and self-harm content.