Meta will require advertisers to disclose whether the ads they submit to its platforms have been digitally altered, including through the use of AI tools, if they're political or social in nature. Ads that have been digitally altered will be marked as such on Meta's platforms, in the same way some advertisements come with a "Paid for" disclaimer. The company will start enforcing the rule in the new year, just as the campaign period for what's expected to be a brutal and divisive 2024 US presidential election heats up.
In a blog post, Meta explained that advertisers have to disclose in the ad submission flow if a social issue, electoral or political ad contains photorealistic images or videos, or realistic-sounding audio, that was altered to make a real person appear to say or do something they didn't actually say or do. They're also required to tell Meta if they're submitting an ad that depicts a realistic-looking person who doesn't exist, a realistic-looking event that didn't happen, or altered footage of a real event that did occur. If they submit a fake image, video or audio recording of an event that allegedly took place — say, something they created with the help of AI image generators — they have to notify Meta as well. Advertisers don't need to disclose changes that are inconsequential to the ad's claims, such as size adjustments, cropping, color correction or sharpening.
