Meta

Report March 2025

Executive summary

We are pleased to share our fifth report under the 2022 EU Code of Practice on Disinformation, which also draws from our work with the Code’s Taskforce.

The aim of this report is to provide the latest updates, for July to December 2024, on how Meta approaches misinformation and disinformation in the European Union. We have also included pertinent updates that occurred after the reporting period, where relevant. Highlights include:

  • Elections: We have aligned this report with Meta’s post-elections report, which covers the Legislative Elections in France. We have also included information about the Presidential and Parliamentary Elections in Romania in the National Elections chapter, which provides an overview of our work, including our core policies, processes, and implementation.

  • Media literacy

    • National Elections: In preparation for the French legislative elections, Meta invested in media literacy by launching a campaign on its platforms, Facebook and Instagram. This initiative aimed to raise awareness among French users about the tools and processes Meta employs to combat misinformation, prevent electoral interference, and protect electoral candidates. Running from 20 June to 4 July 2024, just before the second round of elections, the campaign reached 2.1 million users in France and generated 10.6 million impressions. Additionally, Meta collaborated with the European Fact-Checking Standards Network (EFCSN) and the European Disability Forum (EDF) to educate users on identifying AI-generated and digitally altered media.

    • Fraud and Scams: Meta launched a campaign to raise awareness of fraud and scams. The campaign ran in several EU markets, including France, Germany, Poland, Romania, Belgium, and Spain, and used a range of relevant channels, including Meta’s own platforms (Facebook and Instagram) and third-party platforms. The campaign featured ads from Facebook, Instagram, and WhatsApp, emphasizing our commitment to user safety.

  • CIB trends and Doppelganger: As a result of our ongoing aggressive enforcement against recidivist efforts by Doppelganger, its operators have been forced to keep adapting and making tactical changes in an attempt to evade takedowns, as detailed in our Quarterly Adversarial Threat Report for Q3 2024. These changes have degraded the quality of the operation’s efforts.

  • Researcher data access: As part of our ongoing efforts to enhance the Meta Content Library tool and incorporate feedback from researchers, we've made searching more efficient by adding exact phrase matching, and researchers can now share editable content producer lists with their peers, enabling quick filtering of public data from specific content producers on Facebook and Instagram.


  • Labelling AI-generated images for increased transparency: In H2 2024, we rolled out a change to the “AI info” labels on our platforms so that they better reflect the extent of AI used in content. Our intent is to help people know when they see content that was made with AI, and we continue to work with companies across the industry to improve our labelling process so that labels on our platforms are more in line with people’s expectations.


Here are a few of the figures which can be found throughout the report:

  • From 01/07/2024 to 31/12/2024, we removed over 5.1 million ads from Facebook and Instagram in EU member states, of which over 87,000 were removed for violating our misinformation policy.

  • From 01/07/2024 to 31/12/2024, we labelled over 810,000 ads on both Facebook and Instagram with “paid for by” disclaimers in the EU.

  • We removed 2 networks in Q3 2024 and 1 network in Q4 2024 for violating our Coordinated Inauthentic Behaviour (CIB) policy; these networks actually or potentially targeted one or more European countries. We also took steps to remove fake accounts, prioritising the removal of fake accounts that seek to cause harm. In Q3 2024, we took action against 1.1 billion fake accounts, and in Q4 2024, we took action against 1.4 billion fake accounts on Facebook globally. We estimate that fake accounts represented approximately 3% of our worldwide monthly active users (MAU) on Facebook during both Q3 2024 and Q4 2024.

  • In July-December 2024, we worked through our global fact-checking programme, so that our independent fact-checking partners could continue to quickly review and rate false content on our apps. We’ve partnered with 29 fact-checking organisations covering 23 different languages in the EU. On average, 46% of people on Instagram and 47% of people on Facebook in the EU who start to share fact-checked content do not complete this action after receiving a warning that the content has been fact-checked.

  • Between 01/07/2024 and 31/12/2024, over 150,000 distinct fact-checking articles on Facebook in the EU were used to both label and reduce the virality of over 27 million pieces of content. On Instagram, over 43,000 distinct articles in the EU were used to both label and reduce the virality of over 1 million pieces of content.


As currently drafted, this report addresses the practices implemented for Facebook, Instagram, Messenger, and WhatsApp within the EU during the reporting period of H2 2024. In line with Meta's public announcements on 7 January 2025, we will continue to evaluate the applicability of these practices to Meta products, and we will keep under review whether adjustments are appropriate in light of changes to our practices, such as the deployment of Community Notes.



Commitment 30
Relevant Signatories commit to establish a framework for transparent, structured, open, financially sustainable, and non-discriminatory cooperation between them and the EU fact-checking community regarding resources and support made available to fact-checkers.
We signed up to the following measures of this commitment
Measure 30.1 Measure 30.2 Measure 30.3 Measure 30.4
In line with this commitment, did you deploy new implementation measures (e.g. changes to your terms of service, new tools, new policies, etc)?
Yes
If yes, list these implementation measures here
In the first half of 2024, Meta provided all third-party fact-checkers (3PFCs) participating in our fact-checking programs with access to the Meta Content Library (MCL). This initiative aimed to enhance the fact-checking workflow and provide users with a more comprehensive toolset.

Throughout the second half of 2024, Meta continued to release new features and improvements to the MCL, including collaborative dashboards, text-in-image search, and an expanded data scope. These enhancements are designed to support our users and promote best practices in fact-checking.

To facilitate a seamless transition of our 3PFCs to the MCL, we initiated a proactive outreach and education program. This comprehensive program included a targeted e-Newsletter series, training calls, and live tutorials. 

The education program has yielded encouraging results, with notable increases in usage by 3PFCs. We will continue to monitor the impact of our initiatives and make adjustments as needed to ensure that our users have the support and resources they need to effectively utilize our tools and contribute to a safer and more informed online community.

As part of our stakeholder engagement initiatives, Meta participated in the EFCSN Conference in Brussels, where we were joined by over 40 of our third-party fact-checking (3PFC) partners from the European Fact-Checking Program. During the conference, we also conducted 20 strategic partner meetings to further strengthen our collaborations and advance our shared goals.
Do you plan to put further implementation measures in place in the next 6 months to substantially improve the maturity of the implementation of this commitment?
No
If yes, which further implementation measures do you plan to put in place in the next 6 months?
As currently drafted, this chapter covers the current practices for Facebook and Instagram in the EU. In keeping with Meta’s public announcements on 7 January 2025, we will continue to assess the applicability of this chapter to Facebook and Instagram and we will keep under review whether it is appropriate to make alterations in light of changes in our practices, such as the deployment of Community Notes.
Measure 30.1
Relevant Signatories will set up agreements between them and independent fact-checking organisations (as defined in whereas (e)) to achieve fact-checking coverage in all Member States. These agreements should meet high ethical and professional standards and be based on transparent, open, consistent and non-discriminatory conditions and will ensure the independence of fact-checkers.
QRE 30.1.1
Relevant Signatories will report on and explain the nature of their agreements with fact-checking organisations; their expected results; relevant quantitative information (for instance: contents fact-checked, increased coverage, changes in integration of fact-checking as depends on the agreements and to be further discussed within the Task-force); and such as relevant common standards and conditions for these agreements.
As mentioned in our baseline report, Meta’s fact-checking partners all go through a rigorous certification process with the International Fact-Checking Network (IFCN). As a subsidiary of the journalism research organisation the Poynter Institute, the IFCN is dedicated to bringing fact-checkers together worldwide.
All fact-checking partners follow IFCN’s Code of Principles, a series of commitments they must adhere to in order to promote excellence in fact-checking. 

The detail of our partnership with fact-checkers (i.e., how they rate content and what actions we take as a result) is outlined in QRE 21.1.1 and here.
QRE 30.1.3
Relevant Signatories will report on resources allocated where relevant in each of their services to achieve fact-checking coverage in each Member State and to support fact-checking organisations' work to combat Disinformation online at the Member State level.
As mentioned in our baseline report, the list of fact-checkers with whom we partner across the EU is in QRE 30.1.2. 
SLI 30.1.1
Relevant Signatories will report on Member States and languages covered by agreements with the fact-checking organisations, including the total number of agreements with fact-checking organisations, per language and, where relevant, per service.
Number of individual agreements we have with fact-checking organisations. Each agreement covers both Facebook and Instagram.