In 2024, Meta will begin requiring advertisers running political or issue ads on its platforms to disclose when their ads are "digitally created or altered" through the use of AI.
Facebook and Instagram ads about elections, politics and social issues will soon require this extra step, which advertisers will complete when they submit new ads.
Advertisers will need to make the disclosures when an ad "contains a photorealistic image or video, or realistic sounding audio" that falls into a handful of categories.
Meta's new rules are designed to rein in deepfakes — digitally manipulated media designed to be misleading. The company will require disclosures on ads that were either created or manipulated to show a person doing or saying something they didn't.
The other cases requiring disclosure include ads depicting photorealistic people who don't exist, realistic-looking events that never happened (including altered imagery from real events), and ads showing a "realistic event that allegedly occurred" but that are "not a true image, video, or audio recording of the event."