Cuts to trust and safety team are part of a switch towards artificial intelligence by the social media app
TikTok has put hundreds of UK content moderators’ jobs at risk, even as tighter rules come into effect to stop the spread of harmful material online.
The viral video app said several hundred jobs in its trust and safety team could be affected in the UK, as well as in south and south-east Asia, as part of a global reorganisation.
The affected teams' work will be reallocated to other European offices and third-party providers, with some trust and safety jobs remaining in the UK, the company said.
It is part of a wider move at TikTok to rely on artificial intelligence for moderation. More than 85% of the content removed for violating its community guidelines is identified and taken down by automation, according to the platform.