As noted in our previous report, Meta launched an AI disclosure policy in 2024 to help people understand when a social issue, election, or political advertisement on Facebook has been digitally created or altered, including through the use of AI.
Under this policy, advertisers must disclose whenever a social issue, electoral, or political ad contains a photorealistic image or video, or realistic-sounding audio, that was digitally created or altered to:
- Depict a real person as saying or doing something they did not say or do; or
- Depict a realistic-looking person that does not exist or a realistic-looking event that did not happen, or alter footage of a real event that happened; or
- Depict a realistic event that allegedly occurred, but that is not a true image, video, or audio recording of the event.