Meta

Report March 2026

Executive summary

We are pleased to share our seventh report under the 2022 Code of Conduct on Disinformation, which also draws from our work with the Code’s Taskforce. In accordance with the subscription form submitted by Meta Platforms Ireland Limited (Meta) in January 2025, this report is being submitted by Meta in respect of the Facebook, Messenger, and Instagram services and on behalf of WhatsApp Ireland Limited in respect of the WhatsApp messaging service. 

The aim of this report is to provide an update on how Meta approached misinformation and disinformation in the European Union (the EU) and, where relevant, Norway, Liechtenstein and Iceland (together, the EEA) between July and December 2025. Where relevant, we have also included pertinent updates that occurred after the reporting period. Highlights include: 

  • Elections: The National Elections chapter provides an overview of our work on elections within the EEA, detailing our core policies, processes, and implementation strategies. It outlines our comprehensive approach to those elections, which continued for elections held in the second half of 2025. The election responses covered in this report include the elections in Norway, the Czech Republic, Ireland and the Netherlands.

  • Expanding GenAI Transparency for Meta’s Ads Products: We began gradually rolling out “AI Info” labels on ad creative videos using a risk-based framework. When a video is created or significantly edited with our generative AI creative features in our advertiser marketing tools, a label will appear in the three-dot menu or next to the “Sponsored” label. We will continue to evolve our approach to labeling AI-generated content in partnership with experts, advertisers, policy stakeholders and industry partners as people’s expectations and the technology change.

  • Media literacy: Meta published its first Media Literacy Annual Plan on 21 July 2025, which set out our current approach to media literacy in the EU and the products and features we make available to users of Facebook and Instagram. It also provided details on specific media literacy initiatives run by Meta, including our work on digital citizenship, our media literacy lessons in Get Digital, We Think Digital and Soy Digital, and our election literacy programs.

  • Coordinated Inauthentic Behaviour trends: We are sharing insights into a covert influence operation targeting Poland and Belarus that we disrupted in the second half of 2025. We detected and removed this campaign before it was able to build authentic audiences on our apps.

Here are a few of the figures which can be found throughout the report:

  • From 01/07/2025 to 31/12/2025, we removed over 11,000,000 ads from Facebook and Instagram, of which over 6,000,000 were removed for violating our misinformation policy.

  • From 01/07/2025 to 31/12/2025, we labelled over 810,000 ads across Facebook and Instagram with “paid for by” disclaimers.

  • We removed 1 network for violating our Coordinated Inauthentic Behaviour (CIB) policy which targeted one or more countries in the EEA (effectively or potentially). We also took steps to remove fake accounts, prioritising the removal of fake accounts that seek to cause harm. In Q3 2025, we took action against 692M fake accounts and in Q4 2025, we took action against 1.1B fake accounts on Facebook globally. We estimate that fake accounts represented approximately 4% of our worldwide daily active people (DAP) on Facebook during Q3 2025 and 5% during Q4 2025.

This report addresses the practices implemented for Facebook, Instagram, Messenger, and WhatsApp within the EEA during the reporting period of H2 2025. In alignment with Meta's public announcements on 7 January 2025, we continue to evaluate the applicability of these practices to Meta products. We also regularly review the appropriateness of making adjustments in response to changes in our practices, such as the deployment of Community Notes.

Commitment 37
Signatories commit to participate in the permanent Task-force. The Task-force includes the Signatories of the Code and representatives from EDMO and ERGA. It is chaired by the European Commission, and includes representatives of the European External Action Service (EEAS). The Task-force can also invite relevant experts as observers to support its work. Decisions of the Task-force are made by consensus.
We signed up to the following measures of this commitment:
Measure 37.1 Measure 37.2 Measure 37.3 Measure 37.4 Measure 37.5 Measure 37.6
In line with this commitment, did you deploy new implementation measures (e.g. changes to your terms of service, new tools, new policies, etc)?
No
If yes, list these implementation measures here
There have been no significant updates since the last submitted report.
Do you plan to put further implementation measures in place in the next 6 months to substantially improve the maturity of the implementation of this commitment?
No
If yes, which further implementation measures do you plan to put in place in the next 6 months?
N/A
Measure 37.1
Signatories will participate in the Task-force and contribute to its work. Signatories, in particular smaller or emerging services will contribute to the work of the Task-force proportionate to their resources, size and risk profile. Smaller or emerging services can also agree to pool their resources together and represent each other in the Task-force. The Task-force will meet in plenary sessions as necessary and at least every 6 months, and, where relevant, in subgroups dedicated to specific issues or workstreams.
Facebook, Instagram, WhatsApp, Messenger