Meta

Report September 2025

Executive summary

We are pleased to share our sixth report under the 2022 EU Code of Conduct on Disinformation, which also draws from our work with the Code’s Taskforce. In accordance with the subscription form submitted by Meta Platforms Ireland Limited (Meta) in January 2025, this report is being submitted by Meta in respect of the Facebook, Messenger, and Instagram services and on behalf of WhatsApp Ireland Limited in respect of the WhatsApp messaging service. 

The aim of this report is to provide an update on how Meta approached misinformation and disinformation in the European Union between January and June 2025. Where relevant, we have also included pertinent updates that occurred after the reporting period. Highlights include:

  • Elections: The National Elections chapter provides an overview of our work on elections within the EU, detailing our core policies, processes, and implementation strategies. It outlines our comprehensive approach to elections, which continued for European elections held in the first half of 2025. The election responses covered in this report include the parliamentary elections in Germany, the presidential and presidential runoff elections in Romania, the parliamentary elections in Portugal, and the presidential elections in Poland.

  • Expanding GenAI Transparency for Meta’s Ads Products: We began gradually rolling out “AI Info” labels on ad creative videos using a risk-based framework. When a video is created or significantly edited with our generative AI creative features in our advertiser marketing tools, a label will appear in the three-dot menu or next to the “Sponsored” label. We plan to share more information on our approach to labeling ad images made or edited with non-Meta generative AI tools. We will continue to evolve our approach to labeling AI-generated content in partnership with experts, advertisers, policy stakeholders and industry partners as people’s expectations and the technology change.

  • Media literacy: Meta published its first Media Literacy Annual Plan on 21 July 2025, which set out its current approach to media literacy and the products and features we make available to users of Facebook and Instagram. It also provided details on specific media literacy initiatives run by Meta, including its work on digital citizenship, its media literacy lessons in Get Digital, We Think Digital and Soy Digital, and its election literacy programs.

  • Coordinated Inauthentic Behaviour trends: We are sharing insights into a covert influence operation that we disrupted in Romania at the beginning of 2025. We detected and removed this campaign before it was able to build authentic audiences on our apps.

Here are a few of the figures which can be found throughout the report:

  • From 01/01/2025 to 30/06/2025, we removed over 5 million ads from Facebook and Instagram in EU Member States, of which over 83,000 were removed for violating our misinformation policy.


  • From 01/01/2025 to 30/06/2025, we labelled over 1.2 million ads on both Facebook and Instagram with “paid for by” disclaimers in the EU.

  • We removed 1 network for violating our Coordinated Inauthentic Behaviour (CIB) policy which targeted one or more European countries (effectively or potentially). We also took steps to remove fake accounts, prioritising the removal of fake accounts that seek to cause harm. In Q1 2025, we took action against 1 billion fake accounts and in Q2 2025, we took action against 687 million fake accounts on Facebook globally. We estimate that fake accounts represented approximately 3% of our worldwide monthly active users (MAU) on Facebook during Q1 2025 and 4% during Q2 2025.


This report addresses the practices implemented for Facebook, Instagram, Messenger, and WhatsApp within the EU during the reporting period of H1 2025. In alignment with Meta's public announcements on 7 January 2025, we will continue to evaluate the applicability of these practices to Meta products. We will also regularly review the appropriateness of making adjustments in response to changes in our practices, such as the deployment of Community Notes.




Elections 2025
[Note: Signatories are requested to provide information relevant to their particular response to the threats and challenges they observed on their service(s). They ensure that the information below provides an accurate and complete report of their relevant actions. As operational responses to crisis/election situations can vary from service to service, an absence of information should not be considered a priori a shortfall in the way a particular service has responded. Impact metrics are accurate to the best of signatories’ abilities to measure them].
Threats observed or anticipated
Meta is committed to providing reliable election information while combating misinformation across languages on our platforms. Our policies and safeguards for elections have been developed over many years and informed by our experiences of working on more than 200 elections around the world. Those experiences have resulted in the development of a robust election program, which uses mature policies, processes, and tools to both protect speech on our platform and safeguard the integrity of the elections. We continuously improve these measures to ensure they remain appropriate and responsive to emerging risks, and we have reinforced these efforts in light of the regulatory framework set out under the Digital Services Act, the Election Guidelines, and our commitments under this Code.


Community Standards and Guidelines Relevant to Elections: 


Our Community Standards set out strict rules about what content can and cannot be posted on our platforms. These policies cover voter interference, voter fraud, electoral violence, and misinformation, among other categories such as hateful conduct, coordinating harm and promoting crime, and bullying and harassment. Our policies have been refined over many years in partnership with academics, civil society, and third-party fact-checkers to strike the appropriate balance between protecting people and protecting freedom of expression and information. These policies are regularly reviewed, and they are made available to the public through our Transparency Centre.


Our comprehensive approach to elections continued for European elections held between 1 January - 30 June 2025. The election responses covered in this report include:

  1. Germany (Parliamentary), 23 February 2025
  2. Romania (Presidential), 4 May 2025
  3. Romania (Presidential Runoff), 18 May 2025
  4. Portugal (Parliamentary), 18 May 2025
  5. Poland (Presidential), 18 May 2025

Mitigations in place
Our Election Risk Management Processes

We have a dedicated team responsible for driving Meta’s cross-company election integrity efforts, leveraging experts from a full range of business functions to foster a holistic approach to tackling election-related risks. Those functions include colleagues in Meta’s intelligence, data science, product and engineering, research, operations, content and public policy, and legal teams. 


Building on our experience of the 2024 European Parliament (EP) elections, we continued to conduct in-depth preparations and risk assessments for the elections covered in this reporting period, deploy mitigation measures, and operate the Election Operations Centres established to address risks in real time ahead of and on election day.

We continued to work closely with a full range of external stakeholders to inform our processes and procedures ahead of elections. This included collaboration with Member State Digital Services Coordinators (DSCs), national authorities, and electoral bodies, as well as taking part in the EU Code of Practice (“CoP”) Rapid Response System. As part of the Rapid Response System framework, we onboarded designated civil society organisations and fact-checkers to our direct escalation channels to report time-sensitive content, accounts, or trends that could threaten the integrity of the electoral process.

Overview of Cooperation with External Stakeholders and Election Integrity Efforts

Meta engages with a full range of external stakeholders to inform our processes and procedures as part of our day-to-day business, and this practice continued during our election preparation and integrity efforts for Germany, Romania, Portugal and Poland. Meta values the networks and channels we maintain with external stakeholders to jointly identify risks on our platforms, and as such we have welcomed many of the Election Guidelines' recommendations on cooperation and points of contact with national authorities, civil society organisations, and others.


Germany

External engagement and election preparations started in the second half of 2023 as part of Meta’s overall 2024 EU Parliamentary election integrity efforts. For the 2025 German federal elections, these efforts included participating in over 15 engagements with German and EU-level authorities, including the Ministry of Interior (MoI), the German Digital Services Coordinator (DSC) “BNetzA”, the German Government Election Taskforce, the Federal Office for Information Security (BSI), the Federal Returning Officer, and German intelligence services. Many of these engagements took the form of bilateral meetings or round tables chaired by the German MoI or the BNetzA, in close partnership with the European Commission. We onboarded the German DSC to our direct regulatory reporting channel, along with other relevant authorities such as the Federal Returning Officer, the BSI, and the German Government Election Taskforce.

Meta also participated in the 'DSA Stress Test: Tabletop Exercise on German Elections' organised by the German DSC, BNetzA, and the European Commission. The exercise brought together representatives from social media platforms, national authorities, and civil society organisations.

In close partnership with the BSI, Meta organised a capability-building session reaching more than 90 MPs, members of parliamentary groups, and candidates. The training focused on Meta’s election preparedness work, our advertising and organic content best practices, business messaging, and general safety updates, to raise awareness of security threats and possible misinformation and disinformation campaigns around the elections.

Ahead of the German elections, we also partnered with the Federal Returning Officer to support their “Get out the vote” campaign through ad credits, reaching almost 18 million users and generating more than 26 million video plays.

Overview of partners and notifications received during the Rapid Response Implementation period (6 February to 5 March):

  • Number of onboarded non-platform signatories to our direct reporting channels: 6.
  • Number of reports received during the election period through the rapid response system: 18.

Voter Information Units and Election Day Information Features

We remain focused on providing users with reliable election information while combating misinformation across languages. That is why we continue to connect people with details about the election in their Member State through in-app notifications, where legally permitted. We proactively point users to reliable information on the electoral process through in-app Voter Information Units (VIUs) and Election Day Information reminders (EDRs).

  • Facebook: VIU Reach: Over 17.4 million; EDR Reach: Over 11.3 million
  • Instagram: VIU Reach: Over 29.5 million; EDR Reach: Over 23.4 million



Romania (First Round and Run Off)

The election preparation efforts for Romania started in the second half of 2023 as part of the overall 2024 EU Parliamentary elections preparation efforts, and continued through the 2025 Presidential election. Meta engaged with a full range of external stakeholders to inform our processes and procedures. This included regular engagement with Romania’s Permanent Electoral Authority (PEA), the Romanian Digital Services Coordinator “ANCOM”, the Ministry of Digitalisation, Research and Innovation, the Ministry of Interior, the Cyber Security Directorate (DNSC), and the Audiovisual Council, all of whom were onboarded to our direct regulatory escalation channels, through which they were able to report content.

As an active member of the EU Code of Practice on Disinformation Taskforce’s Working Group on Elections, we took part in its Rapid Response System. Through this, we were regularly in touch with civil society organisations from Romania via various meetings and roundtables organised by the working group. We also participated in a stress test organised by ANCOM in Bucharest, alongside national authorities, civil society organisations, and other very large online platforms.

Meta also engaged Romanian political parties ahead of the elections by organising online training sessions on our policies and products, including how to contact Meta in case of an escalation. Meta also created a direct escalation channel for 5 Romanian partners to report Community Standards violations and unlawful content, and collaborated with the Electoral Body to support civic engagement for Romanian users and connect people with reliable information about voting.


Overview of partners and notifications received during the Rapid Response Implementation period (7 April to 25 May):


  • Number of onboarded non-platform signatories to our direct reporting channels: 7.
  • Number of reports received during the election period: 60.

Voter Information Units and Election Day Information Features

  • Facebook: First Round EDR Reach: Over 6.9 million; Runoff EDR Reach: Over 7.2 million
  • Instagram: First Round EDR Reach: Over 3.0 million; Runoff EDR Reach: Over 2.9 million


Portugal

Election preparation efforts for the 2025 Portugal election started in the first half of 2024, following the announcement of the snap election. This included establishing formal communication channels with the Portuguese Electoral Commission (CNE) and the Portuguese Digital Services Coordinator, “ANACOM”, as well as comprehensive outreach to each political party to ensure that parties’ and candidates’ teams were aware of the critical resources, policies, and escalation channels.

Meta also supported civic engagement for Portuguese users by collaborating with the Portuguese Electoral Body to connect people with reliable information about voting. In the lead-up to the election and on election day, Meta showed top-of-feed notifications on both Facebook and Instagram to all users in Portugal, directing them to the Electoral Body’s website.

Overview of partners and notifications received during the Rapid Response Implementation period  (7 April to 25 May):

  • Number of onboarded non-platform signatories to our direct reporting channels: 2.
  • Number of reports received during the election period: 2.

Voter Information Units and Election Day Information Features

  • Facebook: VIU Reach: Over 4.6 million; EDR Reach: Over 3.3 million
  • Instagram: VIU Reach: Over 5.2 million; EDR Reach: Over 4.3 million


Poland

Meta conducted a series of targeted initiatives to enhance external collaboration with key stakeholders ahead of the Polish election, engaging close to 200 stakeholders from across government, politics, academia, and NGOs. This included engagements and workshops with the National Electoral Office and with representatives from the Cybersecurity Directorate, Internal Intelligence, Counter Espionage, and Police HQ to streamline cooperation in processing data requests and escalations. We also conducted engagements with the Polish Research and Academic Computer Network (NASK) and the Ministry of Foreign Affairs to aid understanding of our content escalation channel and to provide training on key content policies to improve moderation and reporting.

In addition, Meta arranged one-on-one workshops with teams from each registered presidential candidate. These sessions focused on clarifying our policies, establishing communication channels, and providing access to "Meta Support Pro" for priority technical issue resolution. Training was also provided to Polish ministries, including the Chancellery of the Prime Minister, to help their communication teams prevent cybersecurity incidents.

As part of the Rapid Response System, Meta maintained regular contact with civil society organisations and created a direct escalation channel for Polish partners, including Alliance4europe, CEE Digital Democracy Watch, GLOBSEC, and DEMAGOG, to report content that violates our Community Standards as well as unlawful content.


Overview of partners and notifications received during the Rapid Response Implementation period (22 April to 24 June 2025):

  • Number of onboarded non-platform signatories to our direct reporting channels: 4.
  • Number of reports received during the election period: 14.

Voter Information Units and Election Day Information Features

  • Facebook: First Round EDR Reach: Over 12.8 million; Runoff EDR Reach: Over 12.4 million
  • Instagram: First Round EDR Reach: Over 7.3 million; Runoff EDR Reach: Over 7.0 million


Responsible Approach to Gen AI


Meta’s approach to responsible AI is another way that we are safeguarding the integrity of elections globally, including for the EU national elections.

Community Standards, Fact-Checking, and AI Labelling:


Meta’s Community Standards and Advertising Standards apply to all content, including content generated by AI. AI-generated content is also eligible to be reviewed and rated by Meta’s third-party fact-checking partners, whose rating options allow them to address various ways in which media content may mislead people, including but not limited to media that is created or edited by AI. 


Meta labels photorealistic images created using Meta AI, as well as AI-generated images from certain content creation tools.


Meta has begun labelling a wider range of video, audio, and image content when we detect industry-standard AI image indicators or when users disclose that they are uploading AI-generated content. Meta requires people to use this disclosure and label tool when they post organic content with a photorealistic video or realistic-sounding audio that was digitally created or altered, and may apply penalties if they fail to do so. If Meta determines that digitally created or altered image, video, or audio content creates a particularly high risk of materially deceiving the public on a matter of importance, we may add a more prominent label, so that people have more information and context.
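
For illustration only, the short Python sketch below shows one simplified way that embedded provenance signals of this kind could be checked in an uploaded file, by scanning its metadata for the IPTC "trainedAlgorithmicMedia" digital source type or a C2PA manifest label. The file names, the markers chosen, and the naive byte scan are assumptions made for this example; it is not a description of Meta's actual detection systems.

  from pathlib import Path

  # Assumed markers for this sketch: the IPTC digital source type value that many
  # tools embed in XMP metadata for AI-generated media, and the label fragment
  # used by C2PA provenance manifests stored in JUMBF boxes.
  IPTC_AI_SOURCE = b"trainedAlgorithmicMedia"
  C2PA_MARKER = b"c2pa"

  def has_ai_indicator(path: str) -> bool:
      """Return True if the file's raw bytes contain a known AI-provenance marker.

      A naive byte scan stands in for proper metadata parsing, purely for illustration.
      """
      data = Path(path).read_bytes()
      return IPTC_AI_SOURCE in data or C2PA_MARKER in data

  if __name__ == "__main__":
      # Hypothetical file names used only to demonstrate the check.
      for candidate in ["upload1.jpg", "upload2.png"]:
          if Path(candidate).exists() and has_ai_indicator(candidate):
              print(f"{candidate}: AI indicator found; an 'AI Info' label could be applied")

In practice, a real pipeline would parse the XMP and JUMBF structures properly rather than searching raw bytes, and would combine such signals with user disclosures, as the paragraph above describes.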


Continuing to Foster AI Transparency through Industry Collaboration:

Meta has also been working with other companies in the tech industry on common standards and guidelines. Meta Platforms, Inc. is a member of the Partnership on AI, for example, and signed onto the tech accord designed to combat the spread of deceptive AI content in 2024 elections globally. Meta receives information from Meta Platforms, Inc. on the progress of these initiatives, and benefits from these partnerships when addressing the risks of manipulated media.



Scrutiny of Ads Placements
Outline approaches pertinent to this chapter, highlighting similarities/commonalities and differences with regular enforcement.
The measures outlined in Chapters 1 to 3 of this report were in place for the elections covered here. They were complemented by the prohibited ads policy outlined above. Most pertinently, under these policies, content that has been rated by our third-party fact-checkers cannot be used in an ad under our Advertising Standards.