A former employee of Meta, the owner of social media platforms Facebook, Instagram, WhatsApp, and Threads, testified as part of a major lawsuit that the tech company had a policy allowing 17 strikes before it suspended accounts engaged in the “trafficking of humans for sex.”

Vaishnavi Jayakumar, former head of safety and well-being for Instagram, also testified that in March 2020, Meta did not have a specific way for people to report child sexual abuse material (CSAM) on Instagram, according to federal court documents filed Friday, Nov. 21, in the Northern District of California.

“It was very surprising to me,” Jayakumar said, adding that she tried to raise this issue “multiple times,” but was told it would require too much work to build.

Jayakumar's concerns heightened when she learned of what she called the "17x" policy at Meta.

“That means that you could incur 16 violations for prostitution and sexual solicitation, and upon the 17th violation, your account would be suspended,” she said, according to the court documents. This, “by any measure across the industry, is a very, very high strike threshold.”