Two court cases have shown how companies can be forced to take responsibility for their impact on public health

Debate about online harms has tended to focus on abusive and hateful content. But the form in which content is delivered is at least as important. That point is central to this week’s momentous decisions against Meta and YouTube by two US juries. It will take more than these cases to loosen big tech’s tight grip on much of the world’s attention. But the fact that both companies were found liable in California, for deliberately designing addictive products that harmed a child, is a massive win for the coalition of campaigners aiming to use the US courts to force the platforms to change their products.

The second case, brought against Meta in New Mexico, found it liable over the use of Facebook and Instagram for child sex trafficking; a Guardian investigation was cited in the complaint. The jury ordered the company to pay $375m in civil liabilities, and the state’s attorney general is also seeking platform changes and financial penalties.

Both verdicts are expected to be appealed. But juries’ willingness to accept evidence of the damage caused by these businesses, much of it derived from internal documents, reveals shifting attitudes. Documents exposing executives’ shockingly cavalier approach to young people’s safety are now in the public square, and will help the industry’s critics in future. One email from a Meta employee said “targetting [sic] 11 year olds feels like tobacco companies a couple decades ago”.