Meta

Report March 2026

Executive summary

We are pleased to share our seventh report under the 2022 Code of Conduct on Disinformation, which also draws from our work with the Code’s Taskforce. In accordance with the subscription form submitted by Meta Platforms Ireland Limited (Meta) in January 2025, this report is being submitted by Meta in respect of the Facebook, Messenger, and Instagram services and on behalf of WhatsApp Ireland Limited in respect of the WhatsApp messaging service. 

The aim of this report is to provide an update on how Meta approached misinformation and disinformation in the European Union (the EU) and, where relevant, Norway, Liechtenstein and Iceland (together, the EEA) between July and December 2025. Where relevant, we have also included pertinent updates that occurred after the reporting period. Highlights include: 

  • Elections: The National Elections chapter provides an overview of our work on elections within the EEA, detailing our core policies, processes, and implementation strategies. It outlines the comprehensive approach we continued to apply to elections held in the second half of 2025. The election responses covered in this report include the elections in Norway, the Czech Republic, Ireland and the Netherlands.

  • Expanding GenAI Transparency for Meta’s Ads Products: We began gradually rolling out “AI Info” labels on ad creative videos using a risk-based framework. When a video is created or significantly edited with our generative AI creative features in our advertiser marketing tools, a label will appear in the three-dot menu or next to the “Sponsored” label. We will continue to evolve our approach to labeling AI-generated content in partnership with experts, advertisers, policy stakeholders and industry partners as people’s expectations and the technology change.

  • Media literacy: Meta published its first Media Literacy Annual Plan on 21 July 2025, which set out our current approach to media literacy in the EU and the products and features we make available to users of Facebook and Instagram. It also provided details on specific media literacy initiatives run by Meta, including our work on digital citizenship, our media literacy lessons in Get Digital, We Think Digital and Soy Digital, and our election literacy programs.

  • Coordinated Inauthentic Behaviour trends: We are sharing insights into a covert influence operation in Poland and Belarus that we disrupted in the second half of 2025. We detected and removed this campaign before it was able to build an authentic audience on our apps.

Here are a few of the figures which can be found throughout the report:

  • From 01/07/2025 to 31/12/2025, we removed over 11,000,000 ads from Facebook and Instagram, of which over 6,000,000 were removed for violating our misinformation policy.

  • From 01/07/2025 to 31/12/2025, we labelled over 810,000 ads on both Facebook and Instagram with “paid for by” disclaimers.

  • We removed 1 network for violating our Coordinated Inauthentic Behaviour (CIB) policy that targeted (effectively or potentially) one or more countries in the EEA. We also took steps to remove fake accounts, prioritising the removal of fake accounts that seek to cause harm. In Q3 2025, we took action against 692M fake accounts and in Q4 2025, we took action against 1.1B fake accounts on Facebook globally. We estimate that fake accounts represented approximately 4% of our worldwide daily active people (DAP) on Facebook during Q3 2025 and 5% during Q4 2025.

This report addresses the practices implemented for Facebook, Instagram, Messenger, and WhatsApp within the EEA during the reporting period of H2 2025. In alignment with Meta's public announcements on 7 January 2025, we continue to evaluate the applicability of these practices to Meta products. We also regularly review the appropriateness of making adjustments in response to changes in our practices, such as the deployment of Community Notes.

Elections 2025
[Note: Signatories are requested to provide information relevant to their particular response to the threats and challenges they observed on their service(s). They ensure that the information below provides an accurate and complete report of their relevant actions. As operational responses to crisis/election situations can vary from service to service, an absence of information should not be considered a priori a shortfall in the way a particular service has responded. Impact metrics are accurate to the best of signatories’ abilities to measure them].
Threats observed or anticipated
Meta is committed to providing reliable election information while combating misinformation across languages on our platforms. Our policies and safeguards for elections have been developed over many years and informed by our experience of working on more than 200 elections around the world. That experience has resulted in a robust election program, which uses mature policies, processes, and tools to both protect speech on our platforms and safeguard the integrity of elections. We continuously improve these measures to ensure they remain appropriate and responsive to emerging risks, and we have reinforced these efforts in light of the regulatory framework set out under the Digital Services Act, the Election Guidelines, and our commitments under this Code.

  1. Community Standards and Guidelines Relevant to Elections:

Our Community Standards set out strict rules for what content can and cannot be posted on our platforms. These policies cover voter interference, voter fraud, electoral violence, and misinformation, among other categories such as hateful conduct, coordinating harm and promoting crime, and bullying and harassment. Our policies have been refined over many years, in partnership with academics, civil society, and third-party fact-checkers, to find the appropriate balance between protecting people and protecting freedom of expression and information. These policies are regularly reviewed and made available to the public through our Transparency Centre.

Our comprehensive approach to elections continued for European elections held between 1 July - 31 December 2025. The election responses covered in this report include:

  1. Norway (Parliamentary) election, 9 September 2025
  2. Czech Republic (Legislative) election, 3 - 4 October 2025
  3. Ireland (Presidential) election, 24 October 2025 
  4. Netherlands, General election for the House of Representatives, 29 October 2025


  2. Our Election Risk Management Processes:

We have a dedicated team responsible for driving Meta’s cross-company election integrity efforts, leveraging experts from a full range of business functions to foster a holistic approach to tackling election-related risks. Those functions include colleagues in Meta’s intelligence, data science, product and engineering, research, operations, content and public policy, and legal teams. 

Over the years, Meta has developed a comprehensive approach to mitigate relevant user risks and respect the integrity of elections during an election period. This approach has been iterated on and matured over the course of hundreds of elections. We have processes, tools and policies in place all year round to address harmful or illegal content while protecting legitimate speech on our platforms, and these have been further reinforced in light of the regulatory framework under the DSA, including the Communication from the Commission (C/2024/3014) on Commission Guidelines on the mitigation of systemic risks for electoral processes (the "Election Guidelines"). 

During the reporting period, we continued to work closely with a full range of external stakeholders to inform our processes and procedures ahead of elections. This included collaboration with Member State Digital Service Coordinators (DSCs), national authorities and electoral bodies, as well as participation in the EU Code of Practice ("CoP") Rapid Response System. As part of the Rapid Response System framework, we onboarded designated civil society organisations and fact-checkers to our direct escalation channels to report time-sensitive content, accounts or trends that could threaten the integrity of the electoral process. 



Mitigations in place
Overview of Cooperation with External Stakeholders and Election Integrity Efforts
Meta engages with a full range of external stakeholders to inform our processes and procedures as part of our day-to-day business, and this practice continued during our election preparation and integrity efforts for Norway, the Czech Republic, Ireland and the Netherlands. Meta values the networks and channels we have with our external stakeholders to work together in identifying risks on our platforms, and as such, we have welcomed many of the Election Guidelines' recommendations on cooperation and points of contact with national authorities, civil society organisations, and others.

Norway Parliamentary Election

External engagement and election preparation efforts began early, including engagements with the national security authority (Nasjonal sikkerhetsmyndighet), the Organization for Security and Co-operation in Europe (OSCE) and the Ministry of Digitalisation and Public Governance. We also conducted training in the Norwegian Parliament for political parties in May 2025 to provide further information on our policies and reporting channels. 

Voter Information Units and Election Day Information Features

We remain focused on providing users with reliable election information while combating misinformation across languages. That is why we continue to connect people with details about the election for their Member State through in-app notifications, where legally permitted. We proactively point users to reliable information on the electoral process through in-app 'Voter Information Units' (VIU) and 'Election Day Information' reminders (EDR).

Facebook
  • VIU Reach: Over 2.8 million
  • EDR Reach: Over 2.0 million

Instagram
  • VIU Reach: Over 1.9 million
  • EDR Reach: Over 1.4 million

Czech Republic - Legislative Election

External engagement and election preparation efforts began early, including several engagements with government stakeholders, including the Ministry of Internal Affairs and the Ministry of Foreign Affairs. Meta also participated in roundtables organised by the Digital Service Coordinator (DSC) with representatives of the European Commission, the Czech government, civil society organisations and law enforcement agencies. We also onboarded the Czech Telecommunication Office to our direct regulatory reporting channel and provided on-the-ground training to Czech authorities on our policies and reporting channels. 

As an active member of the EU Code of Practice on Disinformation Taskforce’s Working Group on Elections, we took part in its Rapid Response System (RRS). Through this, we were regularly in touch with civil society organisations and partners including: Central European Digital Media Observatory, Globsec, Demagog.cz and Alliance4Europe. 

Meta also conducted comprehensive outreach to all political parties in advance of the election to ensure all candidates' teams were aware of critical resources, policies and escalation channels for contacting Meta. 

Overview of partners and notifications received during the Rapid Response Implementation period (8 September to 13 October 2025):

  • Number of onboarded non-platform signatories to our direct reporting channels: 4.
  • Number of reports received during the election period: 6.

Voter Information Units and Election Day Information Features

Facebook
  • VIU Reach: Over 3.3 million
  • EDR Reach: Over 3.1 million

Instagram
  • VIU Reach: Over 2.8 million
  • EDR Reach: Over 2.6 million

Ireland Presidential Election

External engagement and election preparation efforts began early, including a roundtable hosted by Coimisiún na Meán (CnaM) in September 2025. This included a range of partners, such as representatives from the European Commission, the European Digital Media Observatory (EDMO) and An Garda Síochána. We were also regularly in touch with civil society organisations and partners, including Democracy Reporting International and Ireland's Electoral Commission (An Coimisiún Toghcháin), which we onboarded to our direct regulatory reporting channel.

Overview of partners and notifications received during the Rapid Response Implementation period (29 September to 3 November 2025):

  • Number of onboarded non-platform signatories to our direct reporting channels: 2.
  • Number of reports received during the election period: 59.

Voter Information Units and Election Day Information Features

Facebook
  • VIU Reach: Over 2.0 million
  • EDR Reach: Over 1.5 million

Instagram
  • VIU Reach: Over 2.2 million
  • EDR Reach: Over 1.5 million

Netherlands - General election for the House of Representatives

Overview of partners and notifications received during the Rapid Response Implementation period (1 October to 5 November 2025):

  • Number of onboarded non-platform signatories to our direct reporting channels: 4.
  • Number of reports received during the election period: 1.

External engagement and election preparation efforts began early, including meetings with the Rijksvoorlichtingsdienst and roundtables with the Authority for Consumers and Markets. We also continued our collaboration with the local, independent fact-checking organisations dpa-Faktencheck and AFP as part of our election integrity efforts. 

As an active member of the EU Code of Practice on Disinformation Taskforce’s Working Group on Elections, we took part in its Rapid Response System (RRS). Through this, we onboarded the Authority for Consumers and Markets (the designated Digital Service Coordinator) to our direct regulatory reporting channel. We also worked closely with the European Commission and non-platform signatories (civil society organisations and fact-checkers) to share election-related trends, and onboarded them to a direct escalation channel so they could report content posing serious or systemic concerns to the integrity of the electoral process and support its prompt review.

Voter Information Units and Election Day Information Features

Facebook
  • VIU Reach: Over 5.2 million
  • EDR Reach: Over 4.4 million

Instagram
  • VIU Reach: Over 6.9 million
  • EDR Reach: Over 5.9 million

Responsible Approach to GenAI

Meta’s approach to responsible AI is another way that we are safeguarding the integrity of elections globally, including for the EU national elections.

Community Standards, Fact-Checking, and AI Labelling:

Meta’s Community Standards and Advertising Standards apply to all content, including content generated by AI. AI-generated content is also eligible to be reviewed and rated by Meta’s third-party fact-checking partners, whose rating options allow them to address various ways in which media content may mislead people, including but not limited to media that is created or edited by AI. 

Meta labels photorealistic images created using Meta AI, as well as AI-generated images from certain content creation tools.

Meta has begun labelling a wider range of video, audio, and image content when we detect industry-standard AI image indicators or when users disclose that they are uploading AI-generated content. Meta requires people to use this disclosure and label tool when they post organic content with a photorealistic video or realistic-sounding audio that was digitally created or altered, and may apply penalties if they fail to do so. If Meta determines that digitally created or altered image, video, or audio content creates a particularly high risk of materially deceiving the public on a matter of importance, we may add a more prominent label, so that people have more information and context.

Continuing to Foster AI Transparency through Industry Collaboration:

Meta has also been working with other companies in the tech industry on common standards and guidelines. Meta Platforms, Inc. is a member of the Partnership on AI, for example, and signed onto the tech accord designed to combat the spread of deceptive AI content in 2024 elections globally. Meta receives information from Meta Platforms, Inc. on the progress of these initiatives, and benefits from these partnerships when addressing the risks of manipulated media. 

Scrutiny of Ads Placements
[Note: Outline approaches pertinent to this chapter, highlighting similarities/commonalities and differences with regular enforcement.]
The measures outlined in Chapters 1 to 3 were in place for the elections covered in this report. They were complemented by the prohibited ads policy outlined above. Most pertinently, under these policies, content that has been fact-checked cannot be used in an ad under our Advertising Standards.