Meta

Report March 2025


Executive summary

We are pleased to share our fifth report under the 2022 EU Code of Practice on Disinformation, which also draws from our work with the Code’s Taskforce.

The aim of this report is to provide the latest updates, for July to December 2024, on how Meta approaches misinformation and disinformation in the European Union. Where relevant, we have also included pertinent updates that occurred after the reporting period. Highlights include:

  • Elections: We have aligned this report with Meta's post-elections report, which covers the Legislative Elections in France. We also included information about the Presidential and Parliamentary Elections in Romania in the National Elections chapter, which provides an overview of our work, including information on our core policies, processes, and implementation.

  • Media literacy

    • National Elections: In preparation for the French legislative elections, Meta invested in media literacy by launching a campaign on its platforms, Facebook and Instagram. This initiative aimed to raise awareness among French users about the tools and processes Meta employs to combat misinformation, prevent electoral interference, and protect electoral candidates. Running from 20 June to 4 July 2024, just before the second round of elections, the campaign reached 2.1 million users in France and generated 10.6 million impressions. Additionally, Meta collaborated with the European Fact-Checking Standards Network (EFCSN) and the European Disability Forum (EDF) to educate users on identifying AI-generated and digitally altered media.

    • Fraud and Scams: Meta launched a campaign to raise awareness of fraud and scams. The campaign ran in several EU markets, including France, Germany, Poland, Romania, Belgium, and Spain, and used a range of relevant channels, including Meta's platforms (Facebook and Instagram) and other third-party platforms. The campaign featured ads from Facebook, Instagram, and WhatsApp, emphasising our commitment to user safety.

  • CIB trends and Doppelganger: As a result of our ongoing aggressive enforcement against recidivist efforts by Doppelganger, its operators have been forced to keep adapting and making tactical changes in an attempt to evade takedowns, as indicated in our Quarterly Adversarial Threat report for Q3 2024. These changes have led to the degradation of the quality of the operation’s efforts.

  • Researcher data access: As part of our ongoing efforts to enhance the Meta Content Library tool and incorporate feedback from researchers, we've made searching more efficient by adding exact-phrase matching. Researchers can now also share editable content producer lists with their peers, enabling quick filtering of public data from specific content producers on Facebook and Instagram.


  • Labelling AI-generated images for increased transparency: In H2 2024, we rolled out a change to the “AI info” labels on our platforms so they better reflect the extent of AI used in content. Our intent is to help people know when they see content that was made with AI, and we continue to work with companies across the industry to improve our labelling process so that labels on our platforms are more in line with people's expectations.


Here are a few of the figures which can be found throughout the report:

  • From 01/07/2024 to 31/12/2024, we removed over 5.1 million ads from Facebook and Instagram in EU member states, of which over 87,000 were removed for violating our misinformation policy.

  • From 01/07/2024 to 31/12/2024, we labelled over 810,000 ads on both Facebook and Instagram with “paid for by” disclaimers in the EU.

  • We removed 2 networks in Q3 2024 and 1 network in Q4 2024 for violating our Coordinated Inauthentic Behaviour (CIB) policy; each network targeted, or potentially targeted, one or more European countries. We also took steps to remove fake accounts, prioritising the removal of fake accounts that seek to cause harm. In Q3 2024 we took action against 1.1 billion fake accounts, and in Q4 2024 we took action against 1.4 billion fake accounts on Facebook globally. We estimate that fake accounts represented approximately 3% of our worldwide monthly active users (MAU) on Facebook during both Q3 and Q4 2024.

  • In July-December 2024, we worked through our global fact-checking programme, so that our independent fact-checking partners could continue to quickly review and rate false content on our apps. We've partnered with 29 fact-checking organisations covering 23 different languages in the EU. On average, 46% of people on Instagram and 47% of people on Facebook in the EU who start to share fact-checked content do not complete this action after receiving a warning that the content has been fact-checked.

  • Between 01/07/2024 and 31/12/2024, over 150,000 distinct fact-checking articles on Facebook were used to both label and reduce the virality of over 27 million pieces of content in the EU. On Instagram, over 43,000 distinct articles were used to both label and reduce the virality of over 1 million pieces of content in the EU.


As currently drafted, this report addresses the practices implemented for Facebook, Instagram, Messenger, and WhatsApp within the EU during the reporting period of H2 2024. In alignment with Meta's public announcements on 7 January 2025, we will continue to evaluate the applicability of these practices to Meta products. We will also regularly review the appropriateness of making adjustments in response to changes in our practices, such as the deployment of Community Notes.



Elections 2024
[Note: Signatories are requested to provide information relevant to their particular response to the threats and challenges they observed on their service(s). They ensure that the information below provides an accurate and complete report of their relevant actions. As operational responses to crisis/election situations can vary from service to service, an absence of information should not be considered a priori a shortfall in the way a particular service has responded. Impact metrics are accurate to the best of signatories’ abilities to measure them].
Threats observed or anticipated

Over many years, Meta has developed a comprehensive approach for elections on its platforms. While each election is unique, we have used our experience working on more than 200 elections around the world to build a robust election program that includes mature processes, tools, and policies to protect speech on our platform and safeguard the integrity of the elections. We continuously improve these measures to make sure they remain responsive to risks as they emerge, and we have reinforced these efforts in light of the regulatory framework set out under the Digital Services Act, the Election Guidelines, and our commitments under this Code.


We outlined our comprehensive approach for elections, and its particular relevance to the 2024 European Parliament (“EP”) elections, in our public post-elections report for the EP elections, available on our Transparency Center. This work continued in earnest for the snap legislative elections in France, which were called on 9 June 2024 following the results of the EP elections and occurred shortly thereafter. Similar efforts were made for the Presidential and Parliamentary elections in Romania, held on 24 November 2024 and 1 December 2024, respectively.


Meta’s approach to elections is organised around the following pillars:

  1. Utilising and deploying our policies, and our overall content moderation efforts, to remove policy-violating content and help keep people safe on our platforms
  2. Our election risk management processes
  3. Cooperation with external stakeholders
  4. Tools to support civic engagement
  5. Preventing interference and disinformation
  6. Reducing the spread of misinformation
  7. Safeguards and transparency efforts related to political advertising
  8. Responsible approach to Generative AI



Below we provide a summarised overview of our support for the legislative elections in France and the impact of our efforts during this period, focusing on two key aspects:

  • Cooperation with external stakeholders in advance of the elections:
    • Working Group on Elections & Rapid Response System
    • Engagement with national authorities

  • Our work in the Generative AI space

Mitigations in place

Cooperation with External Stakeholders

Meta engages with a full range of external stakeholders to inform our processes and procedures as part of day-to-day business, and this practice continued during our election preparation. Meta values the networks and channels we have with our external stakeholders to work together in identifying risks on our platforms, and as such, we have welcomed many of the Election Guidelines recommending cooperation and points of contact with national authorities, civil society organisations, and others.


France: Pre-Election Engagements with National Authorities and Civil Society:
As part of the Working Group, Meta participated in the various sessions organised ahead of the legislative elections in France to discuss election readiness with the signatories of the EU CoP on Disinformation, including fact checkers and civil society organisations. In these engagements, along with other signatory platforms, we presented the efforts and tools we were deploying to fight against misinformation and foreign interference, and to provide more transparency on political ads. In addition, we shared information on our civic products aimed at informing users. Meta also responded to questions from the different participants on escalation channels and approaches.

Digital Service Coordinator (“DSC”) - Arcom:
Meta conducted outreach and delivered comprehensive training to the Autorité de régulation de la communication audiovisuelle et numérique (Arcom), France’s appointed DSC. Arcom, like other onboarded DSCs, has access to Meta’s government reporting channels.

We provided step-by-step guidance to help Arcom navigate the “Single Point of Contact” (SPOC) Form for EU Member States’ authorities, the EU Commission, and the EU Board for Digital Services, as well as the onboarding process, where required, in order to access the relevant contact forms. During the electoral period, we received no reports from Arcom through this dedicated reporting channel. 

We have a long-standing relationship with Arcom and are in regular touch on various topics. We maintained continuous communication and engagement ahead of the EP elections through regular check-ins on election preparedness. In addition, we joined the industry roundtable hosted by Arcom on 2 May 2024 in their headquarters, along with VIGINUM (France’s agency in charge of tackling online foreign interference) and other tech platforms to present our work on election integrity, with a particular focus on misinformation and foreign interference. 

Meta also participated in a roundtable co-organised by Arcom, the European Commission, and VIGINUM on 24 June 2024 ahead of the election, bringing together industry partners to discuss elections preparations and mitigations to address systemic risks around the French snap elections. Meta continued direct engagements with Arcom throughout the electoral period.

VIGINUM
In addition to our engagements with VIGINUM at the roundtables discussed above, we held an engagement with them on 21 May 2024 to discuss our investments to prevent foreign interference, protect the elections, and establish the appropriate communication channels between our teams to ensure we could identify and tackle potential operations efficiently.

Political Parties:
Ahead of the EP elections, Meta organised training sessions and office hours on our policies and products with French government organisations, political parties, and civil society organisations. Political parties were provided an email alias to contact for any urgent escalations around the election. We additionally launched an EU Election Center (https://www.facebook.com/government-nonprofits/eu) in all 24 EU official languages, including French, to support our government partners. For the legislative elections in France, these same resources were available and further office hours were offered to ensure provision of best practices and support.


Romania
As part of our election preparation efforts, Meta has engaged with a full range of Romanian stakeholders to inform our processes and procedures and to hear their concerns. Engagements with government and non-government partners started ahead of the 2024 EP Elections and are ongoing.

  • Romanian government stakeholders: We are in regular contact with ANCOM (the Romanian Digital Service Coordinator), the Ministry of Digitalisation, the Electoral Body, and the Romanian Cybersecurity agency on elections-related topics. All of them are onboarded to our direct escalation channels, where they have been reporting content to us.
  • Election Engagements with the European Commission, National Authorities and Civil Society: Similar to what we did in France, Meta participated in the various sessions organised ahead of and after the 2024 elections to discuss election readiness with the signatories of the EU CoP on Disinformation, including fact checkers and civil society organisations. In these engagements, along with other signatory platforms, we presented the efforts and tools we were deploying to fight against misinformation and foreign interference, and to provide more transparency on political ads. In addition, we shared information on our civic products aimed at informing users. Meta also responded to questions from the different participants on escalation channels and approaches.


Working Group on Elections & Rapid Response System

Meta is also an active member of the EU Code of Practice (“CoP”) on Disinformation Taskforce’s Working Group on Elections and took part in its Rapid Response System. This was first piloted for the European Parliamentary elections and the CoP Taskforce decided to have it in place for the legislative elections in France as well.

France
In these Working Group sessions, organised ahead of the legislative elections in France, Meta and other signatory platforms presented the efforts and tools deployed to fight misinformation and foreign interference and to provide more transparency on political ads, and responded to participants’ questions on escalation channels and approaches.


Romania
Rapid Response System:
Meta participated in the Rapid Response System and has been in regular touch with civil society organisations from Romania through various meetings and roundtables organised by the Disinfo working group. Meta created a direct escalation channel for five Romanian partners to report Community Standards violations and unlawful content.


Political parties:
Meta started engaging with Romanian political parties ahead of the European Parliamentary Elections. Ahead of the 2024 Presidential and Parliamentary elections, Meta organised online training sessions on our policies and products, and on how to contact Meta in case of an escalation.



Responsible Approach to Generative AI


Meta’s approach to responsible AI is another way that we are safeguarding the integrity of elections globally, including for the EU national elections.

Community Standards, Fact-Checking, and AI Labelling:


Meta’s Community Standards and Advertising Standards apply to all content, including content generated by AI. AI-generated content is also eligible to be reviewed and rated by Meta’s third-party fact-checking partners, whose rating options allow them to address various ways in which media content may mislead people, including but not limited to media that is created or edited by AI. 


Meta labels photorealistic images created using Meta AI, as well as AI-generated images from Google, OpenAI, Microsoft, Adobe, Midjourney, and Shutterstock that users post to Facebook and Instagram.


Meta has begun labelling a wider range of video, audio, and image content when we detect industry-standard AI image indicators or when users disclose that they’re uploading AI-generated content. Meta requires people to use this disclosure and label tool when they post organic content with a photorealistic video or realistic-sounding audio that was digitally created or altered, and may apply penalties if they fail to do so. If Meta determines that digitally created or altered image, video, or audio content creates a particularly high risk of materially deceiving the public on a matter of importance, we may add a more prominent label, so that people have more information and context.


Political Ads and Meta’s AI Disclosure Policy:

Meta announced in November 2023 a disclosure policy to help people understand when a SIEP ad (as described in Section 6) on Facebook or Instagram has been digitally created or altered, including through the use of AI. This policy went into effect in January 2024 and was active during the legislative elections in France. 

Advertisers have to disclose whenever a SIEP ad contains a photorealistic image or video, or realistic sounding audio, that was digitally created or altered to:

  • Depict a real person as saying or doing something they did not say or do; or
  • Depict a realistic-looking person that does not exist or a realistic-looking event that did not happen, or alter footage of a real event that happened; or
  • Depict a realistic event that allegedly occurred, but that is not a true image, video or audio recording of the event.


If advertisers do not disclose these specified scenarios, the ad may be disapproved. Repeated failure to disclose may result in further penalties to the account.
AI Content Around the French Elections:
As a result of our policies and measures relating to AI-generated content, between 1 June and 21 July 2024, over 50 SIEP ads created by users in France across Facebook and Instagram were labelled with the “digitally created” AI disclaimer as a result of self-disclosure, providing enhanced transparency to users.
SIEP Ads and Enforcement Around the French Elections:
The table below shows the number of ads accepted and run with SIEP disclaimers, as well as the number of ads removed for non-compliance with Meta’s SIEP policy, between 1 June and 21 July 2024, where the inferred advertiser location at the time of enforcement was France. This reflects the application of the above-mentioned policies and measures.

Number of SIEP ads accepted & labelled on Facebook and Instagram combined | Over 10,000
Number of SIEP ads removed for not complying with our SIEP ads policy on Facebook and Instagram combined | Over 20,000

Continuing to Foster AI Transparency through Industry Collaboration:

Meta has also been working with other companies in the tech industry on common standards and guidelines. Meta Platforms, Inc. is a member of the Partnership on AI, for example, and signed onto the tech accord designed to combat the spread of deceptive AI content in 2024 elections globally. Meta receives information from Meta Platforms, Inc. in the course of these initiatives and benefits from these partnerships when addressing the risks of manipulated media.
Policies and Terms and Conditions
All the measures outlined in this report were in place ahead of the European Parliament elections, as well as the national elections. In addition, we made the policy change outlined below.

Policy
Prohibited Ads Policy

Changes (such as newly introduced policies, edits, adaptation in scope or implementation)
We've established measures under which ads related to voting around elections (including primary, general, special, and run-off elections) are subject to additional prohibitions and will be rejected if in violation of our policies. This policy applies to the Member States of the EU.

Rationale

Ads targeting the EU with the following content aren't allowed:

  • Ads that discourage people from voting in an election. This includes ads that portray voting as useless/meaningless and/or advise people not to vote.
  • Ads that call into question the legitimacy of an upcoming or ongoing election.
  • Ads with premature claims of election victory.


This prohibition includes ads that call into question the legitimacy of the methods and processes of elections, as well as their outcomes.