Facebook parent Meta (META) says it will require advertisers to disclose when they use AI to alter content in political ads.
According to Meta head of global affairs Nick Clegg, the new policy applies to Facebook and Instagram ads that use AI to make it appear as though a person is “saying or doing something they did not say or do,” to create “a realistic-looking person that does not exist or a realistic-looking event that did not happen,” or to “alter footage of a real event that happened.”
Advertisers will also need to disclose when they use AI to depict “a realistic event that allegedly occurred, but that is not a true image, video, or audio recording of the event.”
When advertisers make such a disclosure, Meta will attach a notice to the ad stating that it was digitally altered. The new policy takes effect in the new year.
Photo: Meta CEO Mark Zuckerberg delivers a speech, as the letters AI for artificial intelligence appear on screen, at the Meta Connect event at the company's headquarters in Menlo Park, California, U.S., September 27, 2023. (REUTERS/Carlos Barria)
