Facebook

Report March 2026

Submitted

Your organisation description

Advertising

Commitment 1

Relevant Signatories participating in ad placements commit to defund the dissemination of disinformation, and improve the policies and systems which determine the eligibility of content to be monetised, the controls for monetisation and ad placement, and the data to report on the accuracy and effectiveness of controls and services around ad placements.

We signed up to the following measures of this commitment

Measure 1.3 Measure 1.5

In line with this commitment, did you deploy new implementation measures (e.g. changes to your terms of service, new tools, new policies, etc)?

No

If yes, list these implementation measures here

N/A

Do you plan to put further implementation measures in place in the next 6 months to substantially improve the maturity of the implementation of this commitment?

Yes

If yes, which further implementation measures do you plan to put in place in the next 6 months?

  • We are in the process of expanding advertiser delivery reports to more Facebook ad placements.
  • We plan to expand integrations with our third-party partners to introduce additional functionality.

Commitment 2

Relevant Signatories participating in advertising commit to prevent the misuse of advertising systems to disseminate Disinformation in the form of advertising messages.

We signed up to the following measures of this commitment

Measure 2.1 Measure 2.2 Measure 2.3

In line with this commitment, did you deploy new implementation measures (e.g. changes to your terms of service, new tools, new policies, etc)?

No

If yes, list these implementation measures here

There have been no significant updates since the last submitted report.

Do you plan to put further implementation measures in place in the next 6 months to substantially improve the maturity of the implementation of this commitment?

No

If yes, which further implementation measures do you plan to put in place in the next 6 months?

N/A

Measure 2.1

Relevant Signatories will develop, deploy, and enforce appropriate and tailored advertising policies that address the misuse of their advertising systems for propagating harmful Disinformation in advertising messages and in the promotion of content.

Facebook

QRE 2.1.1

Signatories will disclose and outline the policies they develop, deploy, and enforce to meet the goals of Measure 2.1 and will link to relevant public pages in their help centres.

As noted in our baseline report, advertisers running ads across Meta technologies must follow our Terms of Service, our Community Standards and our Advertising Standards. Misinformation is considered unacceptable content under our Advertising Standards; see our Advertising Standards for more information.

SLI 2.1.1

Signatories will report, quantitatively, on actions they took to enforce each of the policies mentioned in the qualitative part of this service level indicator, at the Member State or language level. This could include, for instance, actions to remove, to block, or to otherwise restrict harmful Disinformation in advertising messages and in the promotion of content.

  1. Number of Ads removed on Facebook and Instagram combined for violating our harmful health misinformation or inauthentic behavior or voter or census interference policies in the EEA from 01/07/2025 to 31/12/2025.*
  2. Overall number of Ads removed on Facebook and Instagram combined (in the EEA) from 01/07/2025 to 31/12/2025.

Country | Ads removed for violating our harmful health misinformation, inauthentic behavior, or voter or census interference policies (01/07/2025–31/12/2025)* | Overall ads removed (01/07/2025–31/12/2025)
--- | --- | ---
Austria | Over 7,000 | Over 350,000
Belgium | Over 15,000 | Over 200,000
Bulgaria | Over 4,600 | Over 1,500,000
Croatia | Over 2,300 | Over 580,000
Cyprus | Over 4,800 | Over 250,000
Czech Republic | Over 10,000 | Over 96,000
Denmark | Over 4,800 | Over 180,000
Estonia | Over 2,700 | Over 700
Finland | Over 3,200 | Over 660,000
France | Over 17,000 | Over 460,000
Germany | Over 23,000 | Over 21,000
Greece | Over 5,000 | Over 100,000
Hungary | Over 30,000 | Over 220,000
Ireland | Over 3,900 | Over 1,100,000
Italy | Over 25,000 | Over 210,000
Latvia | Over 2,200 | Over 480,000
Lithuania | Over 3,200 | Over 340,000
Luxembourg | Over 680 | Over 89,000
Malta | Over 870 | Over 100,000
Netherlands | Over 7,500 | Over 640,000
Poland | Over 15,000 | Over 350,000
Portugal | Over 2,500 | Over 46,000
Romania | Over 28,000 | Over 150,000
Slovakia | Over 7,600 | Over 230,000
Slovenia | Over 980 | Over 440,000
Spain | Over 9,900 | Over 730,000
Sweden | Over 8,700 | Over 1,000,000
Iceland | Over 4,800 | Over 350,000
Liechtenstein | Less than 100 | Over 44,000
Norway | Over 3,200 | Over 96,000

Measure 2.2

Relevant Signatories will develop tools, methods, or partnerships, which may include reference to independent information sources both public and proprietary (for instance partnerships with fact-checking or source rating organisations, or services providing indicators of trustworthiness, or proprietary methods developed internally) to identify content and sources as distributing harmful Disinformation, to identify and take action on ads and promoted content that violate advertising policies regarding Disinformation mentioned in Measure 2.1.

Facebook

QRE 2.2.1

Signatories will describe the tools, methods, or partnerships they use to identify content and sources that contravene policies mentioned in Measure 2.1 - while being mindful of not disclosing information that'd make it easier for malicious actors to circumvent these tools, methods, or partnerships. Signatories will specify the independent information sources involved in these tools, methods, or partnerships.

As noted in our baseline report, misinformation is considered unacceptable content under our Advertising Standards, and such content is therefore ineligible for monetisation. See our Advertising Standards for more information.

In the EU, Meta’s third-party fact-checkers may review ads posted on Facebook and apply labels where their assessment concludes that the content is false.


Measure 2.3

Relevant Signatories will adapt their current ad verification and review systems as appropriate and commercially feasible, with the aim of preventing ads placed through or on their services that do not comply with their advertising policies in respect of Disinformation to be inclusive of advertising message, promoted content, and site landing page.

Facebook

QRE 2.3.1

Signatories will describe the systems and procedures they use to ensure that ads placed through their services comply with their advertising policies as described in Measure 2.1.

As mentioned in our baseline report, the ad review system checks ads for violations of our policies. This review process may include the specific components of an ad, such as images, video, text and targeting information, as well as an ad's associated landing page or other destinations, among other information.

More specifically, once fact-checking partners have determined that a piece of content contains misinformation, we use technology to identify identical and near-identical versions across Facebook. If we find ads that are identical or near-identical to content fact-checkers have rated, we reject them.

SLI 2.3.1

Signatories will report quantitatively, at the Member State level, on the ads removed or prohibited from their services using procedures outlined in Measure 2.3. In the event of ads successfully removed, parties should report on the reach of violatory content and advertising.

Number of Ads removed on Facebook and Instagram combined for violating our harmful health misinformation or inauthentic behavior or voter or census interference policies in the EEA from 01/07/2025 to 31/12/2025.*

Overall number of Ads removed on Facebook and Instagram combined (in the EEA) from 01/07/2025 to 31/12/2025.

Country | Ads removed for violating our harmful health misinformation, inauthentic behavior, or voter or census interference policies (01/07/2025–31/12/2025)* | Overall ads removed (01/07/2025–31/12/2025)
--- | --- | ---
Austria | Over 7,000 | Over 350,000
Belgium | Over 15,000 | Over 200,000
Bulgaria | Over 4,600 | Over 1,500,000
Croatia | Over 2,300 | Over 580,000
Cyprus | Over 4,800 | Over 250,000
Czech Republic | Over 10,000 | Over 96,000
Denmark | Over 4,800 | Over 180,000
Estonia | Over 2,700 | Over 700
Finland | Over 3,200 | Over 660,000
France | Over 17,000 | Over 460,000
Germany | Over 23,000 | Over 21,000
Greece | Over 5,000 | Over 100,000
Hungary | Over 30,000 | Over 220,000
Ireland | Over 3,900 | Over 1,100,000
Italy | Over 25,000 | Over 210,000
Latvia | Over 2,200 | Over 480,000
Lithuania | Over 3,200 | Over 340,000
Luxembourg | Over 680 | Over 89,000
Malta | Over 870 | Over 100,000
Netherlands | Over 7,500 | Over 640,000
Poland | Over 15,000 | Over 350,000
Portugal | Over 2,500 | Over 46,000
Romania | Over 28,000 | Over 150,000
Slovakia | Over 7,600 | Over 230,000
Slovenia | Over 980 | Over 440,000
Spain | Over 9,900 | Over 730,000
Sweden | Over 8,700 | Over 1,000,000
Iceland | Over 4,800 | Over 350,000
Liechtenstein | Less than 100 | Over 44,000
Norway | Over 3,200 | Over 96,000

Commitment 3

Relevant Signatories involved in buying, selling and placing digital advertising commit to exchange best practices and strengthen cooperation with relevant players, expanding to organisations active in the online monetisation value chain, such as online e-payment services, e-commerce platforms and relevant crowd-funding/donation systems, with the aim to increase the effectiveness of scrutiny of ad placements on their own services.

We signed up to the following measures of this commitment

Measure 3.1 Measure 3.2 Measure 3.3

In line with this commitment, did you deploy new implementation measures (e.g. changes to your terms of service, new tools, new policies, etc)?

No

If yes, list these implementation measures here

There have been no significant updates since the last submitted report.


Do you plan to put further implementation measures in place in the next 6 months to substantially improve the maturity of the implementation of this commitment?

No

If yes, which further implementation measures do you plan to put in place in the next 6 months?

N/A

Measure 3.1

Relevant Signatories will cooperate with platforms, advertising supply chain players, source-rating services, services that provide indicators of trustworthiness, fact-checking organisations, advertisers and any other actors active in the online monetisation value chain, to facilitate the integration and flow of information, in particular information relevant for tackling purveyors of harmful Disinformation, in full respect of all relevant data protection rules and confidentiality agreements.

Facebook

QRE 3.1.1

Signatories will outline how they work with others across industry and civil society to facilitate the flow of information that may be relevant for tackling purveyors of harmful Disinformation.

We are engaging closely with the Taskforce on the topic of demonetisation and are working with IAB Europe.

Measure 3.2

Relevant Signatories will exchange among themselves information on Disinformation trends and TTPs (Tactics, Techniques, and Procedures), via the Code Task-force, GARM, IAB Europe, or other relevant fora. This will include sharing insights on new techniques or threats observed by Relevant Signatories, discussing case studies, and other means of improving capabilities and steps to help remove Disinformation across the advertising supply chain - potentially including real-time technical capabilities.

Facebook

QRE 3.2.1

Signatories will report on their discussions within fora mentioned in Measure 3.2, being mindful of not disclosing information that is confidential and/or that may be used by malicious actors to circumvent the defences set by Signatories and others across the advertising supply chain. This could include, for instance, information about the fora Signatories engaged in; about the kinds of information they shared; and about the learnings they derived from these exchanges.

We are evaluating potential partnership opportunities and will provide further updates as they become available.

Measure 3.3

Relevant Signatories will integrate the work of or collaborate with relevant third-party organisations, such as independent source-rating services, services that provide indicators of trustworthiness, fact-checkers, researchers, or open-source investigators, in order to reduce monetisation of Disinformation and avoid the dissemination of advertising containing Disinformation.

Facebook

QRE 3.3.1

Signatories will report on the collaborations and integrations relevant to their work with organisations mentioned.

There have been no significant updates since the last submitted report.

Political Advertising

Commitment 6

Relevant Signatories commit to make political or issue ads clearly labelled and distinguishable as paid-for content in a way that allows users to understand that the content displayed contains political or issue advertising.

We signed up to the following measures of this commitment

Measure 6.1 Measure 6.2 Measure 6.3 Measure 6.4 Measure 6.5

In line with this commitment, did you deploy new implementation measures (e.g. changes to your terms of service, new tools, new policies, etc)?

Yes

If yes, list these implementation measures here

As announced in July 2025, since October 6, 2025, Meta no longer allows political, electoral and social issue ads on our platforms in the EU, given the unworkable requirements and legal uncertainties introduced by the EU’s Transparency and Targeting of Political Advertising regulation.

Do you plan to put further implementation measures in place in the next 6 months to substantially improve the maturity of the implementation of this commitment?

N/A

If yes, which further implementation measures do you plan to put in place in the next 6 months?

N/A

Measure 6.1

Relevant Signatories will develop a set of common best practices and examples for marks and labels on political or issue ads and integrate those learnings as relevant to their services.

Facebook

QRE 6.1.1

Relevant Signatories will publicise the best practices and examples developed as part of Measure 6.1 and describe how they relate to their relevant services.

As announced in July 2025, since October 6, 2025, Meta no longer allows political, electoral and social issue ads on our platforms in the EU, given the unworkable requirements and legal uncertainties introduced by the EU’s Transparency and Targeting of Political Advertising regulation.

Measure 6.2

Relevant Signatories will ensure that relevant information, such as the identity of the sponsor, is included in the label attached to the ad or is otherwise easily accessible to the user from the label.

Facebook

QRE 6.2.1

Relevant Signatories will publish examples of how sponsor identities and other relevant information are attached to ads or otherwise made easily accessible to users from the label.

As announced in July 2025, since October 6, 2025, Meta no longer allows political, electoral and social issue ads on our platforms in the EU, given the unworkable requirements and legal uncertainties introduced by the EU’s Transparency and Targeting of Political Advertising regulation.

QRE 6.2.2

Relevant Signatories will publish their labelling designs.

As noted in our baseline report, examples of political ad labelling may be found in the Ad Library. 

SLI 6.2.1

Relevant Signatories will publish meaningful metrics, at Member State level, on the volume of ads labelled according to Measure 6.2, such as the number of ads accepted and labelled, amounts spent by labelled advertisers, or other metrics to be determined in discussion within the Task-force with the aim to assess the efficiency of this labelling.

Number of unique SIEP (social issue, electoral and political) ads on Facebook and Instagram combined displaying “paid for by” disclaimers from 01/07/2025 to 31/12/2025 in EEA Member States.

Country determined by inferred advertiser location at time of enforcement.

Country | Number of ads accepted & labelled on Facebook and Instagram combined
--- | ---
Austria | Over 15,000
Belgium | Over 66,000
Bulgaria | Over 2,200
Croatia | Over 10,000
Cyprus | Over 2,000
Czech Republic | Over 24,000
Denmark | Over 22,000
Estonia | Over 5,600
Finland | Over 5,800
France | Over 17,000
Germany | Over 47,000
Greece | Over 13,000
Hungary | Over 47,000
Ireland | Over 6,700
Italy | Over 42,000
Latvia | Over 3,000
Lithuania | Over 3,200
Luxembourg | Over 460
Malta | Over 1,300
Netherlands | Over 350,000
Poland | Over 21,000
Portugal | Over 14,000
Romania | Over 10,000
Slovakia | Over 19,000
Slovenia | Over 1,600
Spain | Over 13,000
Sweden | Over 17,000
Iceland | Over 870
Liechtenstein | Less than 100
Norway | Over 20,000

Measure 6.3

Relevant Signatories will invest and participate in research to improve users' identification and comprehension of labels, discuss the findings of said research with the Task-force, and will endeavour to integrate the results of such research into their services where relevant.

Facebook

QRE 6.3.1

Relevant Signatories will publish relevant research into understanding how users identify and comprehend labels on political or issue ads and report on the steps they have taken to ensure that users are consistently able to do so and to improve the labels' potential to attract users' awareness.

As announced in July 2025, since October 6, 2025, Meta no longer allows political, electoral and social issue ads on our platforms in the EU, given the unworkable requirements and legal uncertainties introduced by the EU’s Transparency and Targeting of Political Advertising regulation.

Measure 6.4

Relevant Signatories will ensure that once a political or issue ad is labelled as such on their platform, the label remains in place when users share that same ad on the same platform, so that they continue to be clearly identified as paid-for political or issue content.

Facebook

QRE 6.4.1

Relevant Signatories will describe the steps they put in place to ensure that labels remain in place when users share ads.

As announced in July 2025, since October 6, 2025, Meta no longer allows political, electoral and social issue ads on our platforms in the EU, given the unworkable requirements and legal uncertainties introduced by the EU’s Transparency and Targeting of Political Advertising regulation.

Measure 6.5

Relevant Signatories that provide messaging services will, where possible and when in compliance with local law, use reasonable efforts to work towards improving the visibility of labels applied to political advertising shared over messaging services. To this end they will use reasonable efforts to develop solutions that facilitate users recognising, to the extent possible, paid-for content labelled as such on their online platform when shared over their messaging services, without any weakening of encryption and with due regard to the protection of privacy.

N/A

QRE 6.5.1

Relevant Signatories will report on any solutions in place to empower users to recognise paid-for content as outlined in Measure 6.5.

N/A

Commitment 7

Relevant Signatories commit to put proportionate and appropriate identity verification systems in place for sponsors and providers of advertising services acting on behalf of sponsors placing political or issue ads. Relevant signatories will make sure that labelling and user-facing transparency requirements are met before allowing placement of such ads.

We signed up to the following measures of this commitment

Measure 7.1 Measure 7.2 Measure 7.3 Measure 7.4

In line with this commitment, did you deploy new implementation measures (e.g. changes to your terms of service, new tools, new policies, etc)?

Yes

If yes, list these implementation measures here

As announced in July 2025, since October 6, 2025, Meta no longer allows political, electoral and social issue ads on our platforms in the EU, given the unworkable requirements and legal uncertainties introduced by the EU’s Transparency and Targeting of Political Advertising regulation.

Do you plan to put further implementation measures in place in the next 6 months to substantially improve the maturity of the implementation of this commitment?

N/A

If yes, which further implementation measures do you plan to put in place in the next 6 months?

N/A

Measure 7.1

Relevant Signatories will make sure the sponsors and providers of advertising services acting on behalf of sponsors purchasing political or issue ads have provided the relevant information regarding their identity to verify (and re-verify where appropriate) said identity or the sponsors they are acting on behalf of before allowing placement of such ads.

Facebook

QRE 7.1.1

Relevant Signatories will report on the tools and processes in place to collect and verify the information outlined in Measure 7.1.1, including information on the timeliness and proportionality of said tools and processes.

As announced in July 2025, since October 6, 2025, Meta no longer allows political, electoral and social issue ads on our platforms in the EU, given the unworkable requirements and legal uncertainties introduced by the EU’s Transparency and Targeting of Political Advertising regulation.

SLI 7.1.1

Relevant Signatories will publish meaningful metrics on the volume of ads rejected for failure to fulfil the relevant verification processes, comparable to metrics for SLI 6.2.1, where relevant per service and at Member State level.

Number of unique Ads removed for not complying with our policy on SIEP ads on both Facebook and Instagram from 01/07/2025 to 31/12/2025 in EEA Member States.

Country | Number of unique ads removed for not complying with our policy on SIEP ads
--- | ---
Austria | Over 290,000
Belgium | Over 260,000
Bulgaria | Over 63,000
Croatia | Over 79,000
Cyprus | Over 44,000
Czech Republic | Over 220,000
Denmark | Over 230,000
Estonia | Over 42,000
Finland | Over 160,000
France | Over 260,000
Germany | Over 770,000
Greece | Over 190,000
Hungary | Over 480,000
Ireland | Over 68,000
Italy | Over 800,000
Latvia | Over 32,000
Lithuania | Over 47,000
Luxembourg | Over 15,000
Malta | Over 25,000
Netherlands | Over 270,000
Poland | Over 340,000
Portugal | Over 72,000
Romania | Over 370,000
Slovakia | Over 180,000
Slovenia | Over 33,000
Spain | Over 260,000
Sweden | Over 330,000
Iceland | Over 1,800
Liechtenstein | Over 330
Norway | Over 12,000

Measure 7.2

Relevant Signatories will complete verifications processes described in Commitment 7 in a timely and proportionate manner.

Facebook

QRE 7.2.1

Relevant Signatories will report on the actions taken against actors demonstrably evading the said tools and processes, including any relevant policy updates.

  • As mentioned in our Advertising Standards, we enforce our policies against all advertisers, and as a general rule, advertisers must not evade or attempt to evade our review process and enforcement actions.
  • As announced in July 2025, since October 6, 2025, Meta no longer allows political, electoral and social issue ads on our platforms in the EU, given the unworkable requirements and legal uncertainties introduced by the EU’s Transparency and Targeting of Political Advertising regulation.


QRE 7.2.2

Relevant Signatories will provide information on the timeliness and proportionality of the verification process.

As mentioned in our baseline report, details for country-specific ID verification processes may be found online on our Business Help Centre.

An advertiser must confirm their identity and link an ad account with a Page using a valid disclaimer to complete authorization. The review process is usually completed within 48 hours, and disclaimer reviews are typically completed within 24 hours.

Measure 7.3

Relevant Signatories will take appropriate action, such as suspensions or other account-level penalties, against political or issue ad sponsors who demonstrably evade verification and transparency requirements via on-platform tactics. Relevant Signatories will develop - or provide via existing tools - functionalities that allow users to flag ads that are not labelled as political.

Facebook

QRE 7.3.1

Relevant Signatories will report on the tools and processes in place to request a declaration on whether the advertising service requested constitutes political or issue advertising.

As mentioned in our baseline report: 
  • All ads are reviewed against our Advertising Standards by our ad review system before they're shown on Facebook.
  • In certain cases, a post or ad that's already running may be flagged by AI or reported by our community. If this happens, the content may be reviewed again, and if it is found to violate our policies and/or is missing a “Paid for by” disclaimer, we disapprove it.

The Community Standards prohibit ads that promote voter interference.

QRE 7.3.2

Relevant Signatories will report on policies in place against political or issue ad sponsors who demonstrably evade verification and transparency requirements on-platform.

As mentioned in our baseline report, our Advertising Standards make clear that we enforce our policies against all advertisers, and as a general rule, advertisers must not evade or attempt to evade our review process and enforcement actions. If we find that an ad account, Page, user account or business account is evading our review process and enforcement actions, an advertiser may face advertising restrictions. 

As announced in July 2025, since October 6, 2025, Meta no longer allows political, electoral and social issue ads on our platforms in the EU, given the unworkable requirements and legal uncertainties introduced by the EU’s Transparency and Targeting of Political Advertising regulation.

Measure 7.4

Relevant Signatories commit to request that sponsors, and providers of advertising services acting on behalf of sponsors, declare whether the advertising service they request constitutes political or issue advertising.

Facebook

QRE 7.4.1

Relevant Signatories will report on research and publish data on the effectiveness of measures they take to verify the identity of political or issue ad sponsors.

Please refer to QRE 7.1.1 and SLI 7.1.1.

Commitment 8

Relevant Signatories commit to provide transparency information to users about the political or issue ads they see on their service.

We signed up to the following measures of this commitment

Measure 8.1 Measure 8.2

In line with this commitment, did you deploy new implementation measures (e.g. changes to your terms of service, new tools, new policies, etc)?

Yes

If yes, list these implementation measures here

As announced in July 2025, since October 6, 2025, Meta no longer allows political, electoral and social issue ads on our platforms in the EU, given the unworkable requirements and legal uncertainties introduced by the EU’s Transparency and Targeting of Political Advertising regulation.

Do you plan to put further implementation measures in place in the next 6 months to substantially improve the maturity of the implementation of this commitment?

N/A

If yes, which further implementation measures do you plan to put in place in the next 6 months?

N/A

Measure 8.1

Relevant Signatories will agree on the common minimum transparency obligations, seeking alignment with the European Commission's proposal for a Regulation on the transparency and targeting of political advertising, such as identification of the sponsor, display period, ad spend, and aggregate information on recipients of the ad.

Facebook

Measure 8.2

Relevant Signatories will provide a direct link from the ad to the ad repository.

Facebook

QRE 8.2.1 (for measures 8.1 & 8.2)

There have been no significant updates since the last submitted report.

Commitment 9

Relevant Signatories commit to provide users with clear, comprehensible, comprehensive information about why they are seeing a political or issue ad.

We signed up to the following measures of this commitment

Measure 9.1 Measure 9.2

In line with this commitment, did you deploy new implementation measures (e.g. changes to your terms of service, new tools, new policies, etc)?

Yes

If yes, list these implementation measures here

As announced in July 2025, since October 6, 2025, Meta no longer allows political, electoral and social issue ads on our platforms in the EU, given the unworkable requirements and legal uncertainties introduced by the EU’s Transparency and Targeting of Political Advertising regulation.

Do you plan to put further implementation measures in place in the next 6 months to substantially improve the maturity of the implementation of this commitment?

N/A

If yes, which further implementation measures do you plan to put in place in the next 6 months?

N/A

Measure 9.1

Relevant Signatories will, seeking alignment with the European Commission's proposal for a Regulation on the transparency and targeting of political advertising, provide a simple means for users to access information about why they are seeing a particular political or issue ad.

Facebook

Measure 9.2

Relevant Signatories will explain in simple, plain language, the rationale and the tools used by the sponsors and providers of advertising services acting on behalf of sponsors (for instance: demographic, geographic, contextual, interest or behaviourally-based) to determine that a political or issue ad is displayed specifically to the user.

Facebook

QRE 9.2.1 (for measures 9.1 & 9.2)

Meta’s “Why am I seeing this ad?” feature allows people to see how factors like basic demographic details, interests, and website visits contribute to the ads that are shown in their Feeds.

In our baseline report, we also discussed how: 
  • We removed Detailed Targeting options that relate to topics people may perceive as sensitive, such as options referencing causes, organisations, or public figures that relate to health, race or ethnicity, political affiliation, religion, or sexual orientation.
  • Through the Ad Preferences tool, people can turn off all social issue, electoral or political ads from candidates or organisations that have the “Paid for by” political disclaimer on them. We also show Facebook users how we decide which ads to display and how they can adjust their preferences to change the ads they are shown.
  • Our FAQs section in the Ad Library also provides more information on how we decide to show ads.

Commitment 10

Relevant Signatories commit to maintain repositories of political or issue advertising and ensure their currentness, completeness, usability and quality, such that they contain all political and issue advertising served, along with the necessary information to comply with their legal obligations and with transparency commitments under this Code.

We signed up to the following measures of this commitment

Measure 10.1 Measure 10.2

In line with this commitment, did you deploy new implementation measures (e.g. changes to your terms of service, new tools, new policies, etc)?

Yes

If yes, list these implementation measures here

As announced in July 2025, since October 6, 2025, Meta no longer allows political, electoral and social issue ads on our platforms in the EU, given the unworkable requirements and legal uncertainties introduced by the EU’s Transparency and Targeting of Political Advertising regulation.

Do you plan to put further implementation measures in place in the next 6 months to substantially improve the maturity of the implementation of this commitment?

N/A

If yes, which further implementation measures do you plan to put in place in the next 6 months?

N/A

Measure 10.1

Relevant Signatories will set up and maintain dedicated searchable ad repositories containing accurate records (in as close to real time as possible, in particular during election periods) of all political and issue ads served, including the ads themselves. This should be accompanied by relevant information for each ad such as the identification of the sponsor; the dates the ad ran for; the total amount spent on the ad; the number of impressions delivered; the audience criteria used to determine recipients; the demographics and number of recipients who saw the ad; and the geographical areas the ad was seen in.

Facebook

Measure 10.2

The information in such ad repositories will be publicly available for at least 5 years.

Facebook

QRE 10.2.1 (for Measures 10.1 and 10.2)

Relevant Signatories will detail the availability, features, and updating cadence of their repositories to comply with Measures 10.1 and 10.2. Relevant Signatories will also provide quantitative information on the usage of the repositories, such as monthly usage.

As mentioned in our baseline report, the Ad Library provides advertising transparency by offering a comprehensive, searchable collection of all ads currently running from across Meta technologies. We currently store these ads in the library for 7 years. 

Commitment 11

Relevant Signatories commit to provide application programming interfaces (APIs) or other interfaces enabling users and researchers to perform customised searches within their ad repositories of political or issue advertising and to include a set of minimum functionalities as well as a set of minimum search criteria for the application of APIs or other interfaces.

We signed up to the following measures of this commitment

Measure 11.1 Measure 11.2 Measure 11.3 Measure 11.4

In line with this commitment, did you deploy new implementation measures (e.g. changes to your terms of service, new tools, new policies, etc)?

No

If yes, list these implementation measures here

As mentioned in our baseline report, our Ad Library application programming interface (“API”) allows users to perform custom keyword searches of ads stored in the Ad Library. Users can search data for all inactive ads about social issues, elections or politics. For people less familiar with the API solution, we provide a simpler research solution with our Ad Library report.

Do you plan to put further implementation measures in place in the next 6 months to substantially improve the maturity of the implementation of this commitment?

N/A

If yes, which further implementation measures do you plan to put in place in the next 6 months?

N/A

Measure 11.1

Relevant Signatories' APIs or other interfaces will provide a set of minimum functionalities and search criteria that enable users and researchers to perform customised searches for data in as close to real time as possible (in particular during elections) in standard formats, including for instance searches per advertiser or candidate, per geographic area or country, per language, per keyword, per election, or per other targeting criteria, to allow for research and monitoring.

Facebook

QRE 11.1.1 (for Measures 11.1-11.4)


As mentioned in our baseline report, the Ad Library API provides access to data about inactive ads about social issues, elections or politics from countries where the Ad Library is live, including European Union countries. 

The Ad Library API provides programmatic access to information about ads about politics or issues in the Library. Users can search data for all inactive ads about social issues, elections or politics. People are able to search for any term, name or Page in the Ad Library. In the EU, anyone with a Facebook account can complete these steps to access the API. 
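As an illustration of the kind of customised keyword search described above, the sketch below builds a query URL for the Graph API's `ads_archive` endpoint. The API version, field selection, and access token are assumptions for illustration only; the authoritative parameter list is in Meta's Ad Library API documentation.

```python
# Illustrative sketch (not an official Meta code sample): constructing a
# keyword search of social issue, electoral or political ads via the
# Graph API "ads_archive" endpoint. API version and fields are assumptions.
import json
from urllib.parse import urlencode


def build_ad_library_query(keyword, countries, access_token, limit=25):
    """Return a GET URL for a custom keyword search of ads about
    social issues, elections or politics in the Ad Library."""
    base = "https://graph.facebook.com/v19.0/ads_archive"
    params = {
        "search_terms": keyword,                # free-text keyword search
        "ad_type": "POLITICAL_AND_ISSUE_ADS",   # restrict to issue/political ads
        "ad_reached_countries": json.dumps(countries),  # e.g. ["PL", "DE"]
        "fields": "page_name,ad_delivery_start_time,spend,impressions",
        "limit": limit,
        "access_token": access_token,           # requires a Facebook account
    }
    return f"{base}?{urlencode(params)}"


# Example: search for ads containing "election" that reached Poland.
url = build_ad_library_query("election", ["PL"], "YOUR_ACCESS_TOKEN")
```

The returned URL can then be fetched with any HTTP client; paginated results come back as JSON, mirroring the data visible in the Ad Library web interface.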

Measure 11.2

The data Relevant Signatories make available via such APIs and other interfaces will be equivalent to or more detailed than that data made available through their ad repositories.

Facebook

Measure 11.3

Relevant Signatories will ensure wide access to and availability of APIs and other interfaces.

Facebook

Measure 11.4

Relevant Signatories will engage with researchers and update the functionalities of the APIs and other interfaces to meet researchers' reasonable needs where applicable.

Facebook

QRE 11.4.1

Relevant Signatories will report about their engagement with researchers, including to understand their experience with the functionalities of APIs, and the resulting improvements of the functionalities as the result of this engagement and of a discussion within the Task-force.

As of December 2025, we’ve made targeting information for over 49 million social issue, electoral, and political Facebook and Instagram ads globally available to academic researchers. More details on the original launch of this initiative are available in the baseline report. 

Commitment 13

Relevant Signatories agree to engage in ongoing monitoring and research to understand and respond to risks related to Disinformation in political or issue advertising.

We signed up to the following measures of this commitment

Measure 13.1 Measure 13.2 Measure 13.3

In line with this commitment, did you deploy new implementation measures (e.g. changes to your terms of service, new tools, new policies, etc)?

No

If yes, list these implementation measures here

There have been no significant updates since the last submitted report.

Do you plan to put further implementation measures in place in the next 6 months to substantially improve the maturity of the implementation of this commitment?

No

If yes, which further implementation measures do you plan to put in place in the next 6 months?

N/A

Measure 13.1

Relevant Signatories agree to work individually and together through the Task-force to identify novel and evolving disinformation risks in the uses of political or issue advertising and discuss options for addressing those risks.

Facebook 

QRE 13.1.1 (for Measures 13.1-13.3)

Through the Task-force, the Relevant Signatories will convene, at least annually, an appropriately resourced discussion around novel risks in political advertising to develop coordinated policy.

There have been no significant updates since the last submitted report.


Measure 13.2

Facebook

Measure 13.3

Facebook

Integrity of Services

Commitment 14

In order to limit impermissible manipulative behaviours and practices across their services, Relevant Signatories commit to put in place or further bolster policies to address both misinformation and disinformation across their services, and to agree on a cross-service understanding of manipulative behaviours, actors and practices not permitted on their services. Such behaviours and practices include: The creation and use of fake accounts, account takeovers and bot-driven amplification, Hack-and-leak operations, Impersonation, Malicious deep fakes, The purchase of fake engagements, Non-transparent paid messages or promotion by influencers, The creation and use of accounts that participate in coordinated inauthentic behaviour, User conduct aimed at artificially amplifying the reach or perceived public support for disinformation.

We signed up to the following measures of this commitment

Measure 14.1 Measure 14.2 Measure 14.3

In line with this commitment, did you deploy new implementation measures (e.g. changes to your terms of service, new tools, new policies, etc)?

Yes

If yes, list these implementation measures here

As mentioned in our baseline report, we continue to enforce and report publicly on our policies to tackle inauthentic behaviour. 

  • Fake accounts: In order to maintain a safe environment, we restrict or remove fake accounts that violate our Terms of Service. We expect the number of accounts we action to vary over time due to the unpredictable nature of adversarial account creation. We actioned 692M accounts against our fake accounts policy in Q3 2025 and 1.1B fake accounts in Q4 2025 on Facebook globally. 
  • Inauthentic behaviour: We continue to investigate and take down coordinated adversarial networks of accounts, Pages and Groups on Facebook that attempt to deceive Meta or our community or to evade enforcement under the Community Standards. In 2025, we updated our inauthentic behaviour policy to simplify and refine our policy language and to help uninvolved authentic communities, Pages and Groups that are targeted, managed, or co-opted by CIB operations to remain on our services. We also work to scale our enforcement by feeding the insights we learn from investigating these networks globally into automated detection systems to help us find bad actors engaged in these and similar violating behaviours, including networks that attempt to come back after we have taken them down.

In July 2024, we stopped removing content solely on the basis of our manipulated video policy. We will continue to remove content if it violates our Community Standards, regardless of whether it is created by AI or not.

Do you plan to put further implementation measures in place in the next 6 months to substantially improve the maturity of the implementation of this commitment?

No

If yes, which further implementation measures do you plan to put in place in the next 6 months?

N/A

Measure 14.1

Relevant Signatories will adopt, reinforce and implement clear policies regarding impermissible manipulative behaviours and practices on their services, based on the latest evidence on the conducts and tactics, techniques and procedures (TTPs) employed by malicious actors, such as the AMITT Disinformation Tactics, Techniques and Procedures Framework.

Facebook

QRE 14.1.1

Relevant Signatories will list relevant policies and clarify how they relate to the threats mentioned above as well as to other Disinformation threats.

To clarify what we’ve included in our baseline report, depending on the context, the actor, and the activity, several TTPs can be combined and are covered by several of our policies. We have highlighted some examples below:

Inauthentic Behaviour - Our Inauthentic Behaviour policy is targeted at addressing deceptive behaviours. In line with our commitment to authenticity, we do not allow people to misrepresent themselves on Facebook or use fake accounts.

CIB Policy - Our policy on Coordinated Inauthentic Behaviour (CIB) addresses covert influence operations (IO). Defined as “particularly sophisticated forms of Inauthentic Behavior where inauthentic accounts are central to the operation,” the policy informs how we find, identify and remove IO networks on our platforms.

CIB can include a variety of different TTPs depending on the actors, context, and operation. Having said that, we often see (1) the creation of inauthentic accounts; (2) the use of fake / inauthentic reactions (e.g., likes, upvotes, comments); (3) the use of fake followers or subscribers; (4) the creation of inauthentic pages, groups, and domains; (5) inauthentic coordination of content creation or amplification; and (6) account hijacking or impersonation.

We also remove millions of fake accounts every day under our policy on Account Integrity and Authentic Identity.

Cybersecurity - Attempts to gather sensitive personal information or engage in unauthorised access by deceptive or invasive methods are harmful to the authentic, open and safe atmosphere that we want to foster. Therefore, we do not allow attempts to gather sensitive user information or engage in unauthorised access through the abuse of our platform, products, or services.

Spam - We work hard to limit the spread of spam because we do not want to allow content that is designed to deceive, or that attempts to mislead users, to increase viewership. We also aim to prevent people from abusing our platform, products or features to artificially increase viewership or distribute content en masse for commercial gain. This can be pertinent to several TTPs depending on the context, including (1) the creation of inauthentic accounts; (2) the use of fake / inauthentic reactions (e.g., likes, upvotes, comments); (3) the use of fake followers or subscribers; (4) the creation of inauthentic Pages, groups, chat groups, fora, or domains; and (5) the use of deceptive practices.

Branded Content Policies - Branded content may only be posted with the use of the branded content tool, and creators must use the branded content tool to tag the featured third-party product, brand, or business partner with their prior permission. Branded content may only be posted by Facebook Pages, Groups, and profiles with access to the branded content tool. This is pertinent to non-transparent promotional messages.

Privacy - We remove content that shares, offers or solicits personally identifiable information or other private information that could lead to physical or financial harm, including financial, residential, and medical information, as well as private information obtained from illegal sources. 

QRE 14.1.2

Signatories will report on their proactive efforts to detect impermissible content, behaviours, TTPs and practices relevant to this commitment.

As mentioned in our baseline report, our approach to Coordinated Inauthentic Behaviour (CIB) more broadly is grounded in behaviour-based enforcement. This means that we look for specific violating behaviours, rather than violating content (which is predicated on other specific violations of our Community Standards, such as misinformation and hate speech). Therefore, when CIB networks are taken down, it is based on their behaviour, not the content they posted.

In addition to expert investigations against CIB, we also work to tackle inauthentic behaviour by fake accounts at scale. 

Pages and Groups that violate our CIB policy are removed. When these accounts are taken down, the posts they published are automatically removed as well. 

We monitor for efforts to re-establish a presence on Facebook by networks we previously removed. 

For a comprehensive overview of our approach, see here.

Measure 14.2

Relevant Signatories will keep a detailed, up-to-date list of their publicly available policies that clarifies behaviours and practices that are prohibited on their services and will outline in their reports how their respective policies and their implementation address the above set of TTPs, threats and harms as well as other relevant threats.

Facebook

QRE 14.2.1

Relevant Signatories will report on actions taken to implement the policies they list in their reports and covering the range of TTPs identified/employed, at the Member State level.

We report twice a year on enforcement actions taken under the two policies most relevant to this Commitment:

Our fake accounts policies:

  • In Q3 2025, we took action against 692M fake accounts. We estimate that fake accounts represented approximately 4% of our worldwide daily active people (DAP) on Facebook during Q3 2025. 
  • In Q4 2025, we took action against 1.1B fake accounts. We estimate that fake accounts represented approximately 5% of our worldwide daily active people (DAP) during Q4 2025. 

Our coordinated inauthentic behaviour policies:

  • In the second half of 2025, we disrupted a coordinated inauthentic behavior network originating in and targeting Poland. We removed 55 Facebook accounts, 36 Pages, 23 Groups, and 1 Instagram account for violating our policy against Coordinated Inauthentic Behavior.

  • We disrupted a coordinated inauthentic behavior network originating in Belarus and targeting Polish audiences. Our internal investigation revealed links to Belarus and Russia, indicating a coordinated foreign influence campaign. We removed 4 Facebook accounts, 12 Pages, and 21 Instagram accounts for violating our policy against Coordinated Inauthentic Behavior.

SLI 14.2.1

Number of instances of identified TTPs and actions taken at the Member State level under policies addressing each of the TTPs as well as information on the type of content.

TTPs covered by this action, selected from the list at the top of this chapter: This action covers the following TTPs in the context of coordinated inauthentic behaviour:
     Use of fake / inauthentic reactions (e.g., likes, upvotes, comments)
     Use of fake followers or subscribers
     Creation of inauthentic pages, groups, chat groups, fora, or domains
     Inauthentic coordination of content creation or amplification
     Account hijacking or impersonation

Methodology of data measurement: Coordinated inauthentic behaviour (CIB) covers particularly sophisticated forms of Inauthentic Behaviour where false identities are central to the operation and operators use adversarial methods to evade detection or appear authentic. When we investigate and remove these operations, we focus on behaviour rather than content — no matter who’s behind them, what they post or whether they’re foreign or domestic. We included below any network (1) originating in Europe or (2) targeting one or more European countries (effectively or potentially), removed from 01/07/2025 to 31/12/2025. We categorised them based on their originating country in the table below.

Poland:

- Number of instances of identified TTPs: 55 Facebook accounts, 36 Pages, 23 Groups

- Number of actions taken by type: Removal of 55 Facebook accounts, 36 Pages, 23 Groups

Belarus:

- Number of instances of identified TTPs: 4 Facebook accounts, 12 Pages

- Number of actions taken by type: Removal of 4 Facebook accounts, 12 Pages

TTPs covered by this action, selected from the list at the top of this chapter: This action covers the following TTPs: 
  • Creation of inauthentic accounts or botnets (which may include automated, partially automated, or non-automated accounts)
  • Use of fake followers or subscribers
  • Creation of inauthentic pages, groups, chat groups, fora, or domains

Methodology of data measurement: Total number of accounts Facebook took action on for being fake accounts from 01/07/2025 to 31/12/2025 globally. It includes both accounts reported by users and accounts found proactively. More information is available here.


Global Q3:

- Number of instances of identified TTPs: 692M accounts

- Number of actions taken by type: Removal of 692M accounts

Global Q4: 

- Number of instances of identified TTPs:  1.1B accounts

- Number of actions taken by type: Removal of 1.1B accounts

Country TTP OR ACTION1 - Nr of instances TTP OR ACTION1 - Nr of actions TTP OR ACTION2 - Nr of instances TTP OR ACTION2 - Nr of actions TTP OR ACTION3 - Nr of instances TTP OR ACTION3 - Nr of actions TTP OR ACTION4 - Nr of instances TTP OR ACTION4 - Nr of actions TTP OR ACTION5 - Nr of instances TTP OR ACTION5 - Nr of actions TTP OR ACTION6 - Nr of instances TTP OR ACTION6 - Nr of actions TTP OR ACTION7 - Nr of instances TTP OR ACTION7 - Nr of actions TTP OR ACTION8 - Nr of instances TTP OR ACTION8 - Nr of actions TTP OR ACTION9 - Nr of instances TTP OR ACTION9 - Nr of actions TTP OR ACTION10 - Nr of instances TTP OR ACTION10 - Nr of actions TTP OR ACTION11 - Nr of instances TTP OR ACTION11 - Nr of actions TTP OR ACTION12 - Nr of instances TTP OR ACTION12 - Nr of actions
Austria 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
Belgium 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
Bulgaria 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
Croatia 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
Cyprus 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
Czech Republic 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
Denmark 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
Estonia 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
Finland 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
France 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
Germany 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
Greece 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
Hungary 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
Iceland 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
Ireland 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
Italy 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
Latvia 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
Lithuania 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
Luxembourg 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
Malta 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
Netherlands 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
Poland 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
Portugal 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
Romania 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
Slovakia 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
Slovenia 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
Spain 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
Sweden 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
Liechtenstein 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
Norway 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
Belarus
Global Q3
Global Q4

SLI 14.2.2

Views/impressions of and interaction/engagement at the Member State level (e.g. likes, shares, comments), related to each identified TTP, before and after action was taken.

TTPs covered by this action, selected from the list at the top of this chapter: This action covers the following TTPs in the context of coordinated inauthentic behaviour:
     Use of fake / inauthentic reactions (e.g., likes, upvotes, comments)
     Use of fake followers or subscribers
     Creation of inauthentic pages, groups, chat groups, fora, or domains
     Inauthentic coordination of content creation or amplification
     Account hijacking or impersonation

Methodology of data measurement: Coordinated inauthentic behaviour (CIB) covers particularly sophisticated forms of Inauthentic Behaviour where false identities are central to the operation and operators use adversarial methods to evade detection or appear authentic. When we investigate and remove these operations, we focus on behaviour rather than content — no matter who’s behind them, what they post or whether they’re foreign or domestic. We included below any network (1) originating in Europe or (2) targeting one or more European countries (effectively or potentially), removed from 01/07/2025 to 31/12/2025. We categorised them based on their originating country in the table below.


Poland: 

- Views/Impressions before action: 

- Interaction/Engagement before action: About 49,000 accounts followed one or more of these Pages, about 1,100 accounts followed one or more of these Groups.

- Views/impressions after action: 0 (deleted)

- Interaction/Engagement after action: 0 (deleted)


Belarus: 

- Views/Impressions before action: 

- Interaction/Engagement before action: About 200 accounts followed one or more of these Pages

- Views/impressions after action: 0 (deleted)

- Interaction/Engagement after action: 0 (deleted)

TTPs covered by this action, selected from the list at the top of this chapter: This action covers the following TTPs: 
  • Creation of inauthentic accounts or botnets (which may include automated, partially automated, or non-automated accounts)
  • Use of fake followers or subscribers
  • Creation of inauthentic pages, groups, chat groups, fora, or domains

Methodology of data measurement: Total number of accounts Facebook took action on for being fake accounts from 01/07/2025 to 31/12/2025 globally. It includes both accounts reported by users and accounts found proactively. More information is available here.


Global Q3: 

- Views/Impressions before action: 

- Interaction/Engagement before action: 

- Views/impressions after action: 0 (deleted)

- Interaction/Engagement after action: 0 (deleted)

Global Q4: 

- Views/Impressions before action: 

- Interaction/Engagement before action: 

- Views/impressions after action: 0 (deleted)

- Interaction/Engagement after action: 0 (deleted)

Country Views/ impressions before action  Interaction/ engagement before action  Views/ impressions after action  Interaction/ engagement after action  TTP OR ACTION2 - Views before action TTP OR ACTION2 - Engagement before action TTP OR ACTION2 - Views after action TTP OR ACTION2 - Engagement after action TTP OR ACTION3 - Views before action TTP OR ACTION3 - Engagement before action TTP OR ACTION3 - Views after action TTP OR ACTION3 - Engagement after action TTP OR ACTION4 - Views before action TTP OR ACTION4 - Engagement before action TTP OR ACTION4 - Views after action TTP OR ACTION4 - Engagement after action TTP OR ACTION5 - Views before action TTP OR ACTION5 - Engagement before action TTP OR ACTION5 - Views after action TTP OR ACTION5 - Engagement after action TTP OR ACTION6 - Views before action TTP OR ACTION6 - Engagement before action TTP OR ACTION6 - Views after action TTP OR ACTION6 - Engagement after action TTP OR ACTION7 - Views before action TTP OR ACTION7 - Engagement before action TTP OR ACTION7 - Views after action TTP OR ACTION7 - Engagement after action TTP OR ACTION8 - Views before action TTP OR ACTION8 - Engagement before action TTP OR ACTION8 - Views after action TTP OR ACTION8 - Engagement after action TTP OR ACTION9 - Views before action TTP OR ACTION9 - Engagement before action TTP OR ACTION9 - Views after action TTP OR ACTION9 - Engagement after action TTP OR ACTION10 - Views before action TTP OR ACTION10 - Engagement before action TTP OR ACTION10 - Views after action TTP OR ACTION10 - Engagement after action TTP OR ACTION11 - Views before action TTP OR ACTION11 - Engagement before action TTP OR ACTION11 - Views after action TTP OR ACTION11 - Engagement after action TTP OR ACTION12 - Views before action TTP OR ACTION12 - Engagement before action TTP OR ACTION12 - Views after action TTP OR ACTION12 - Engagement after action
Austria 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
Belgium 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
Bulgaria 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
Croatia 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
Cyprus 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
Czech Republic 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
Denmark 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
Estonia 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
Finland 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
France 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
Germany 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
Greece 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
Hungary 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
Ireland 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
Italy 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
Latvia 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
Lithuania 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
Luxembourg 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
Malta 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
Netherlands 15 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
Poland 0 About 49,000 accounts followed one or more of these Pages, about 1,100 accounts followed one or more of these Groups. 0 (deleted) 0 (deleted) 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
Portugal 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
Romania 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
Slovakia 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
Slovenia 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
Spain 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
Sweden 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
Iceland 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
Liechtenstein 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
Norway 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
Belarus 0 About 200 accounts followed one or more of these Pages 0 (deleted) 0 (deleted) 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
Global Q3 0 0 0 (deleted) 0 (deleted) 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
Global Q4 0 0 0 (deleted) 0 (deleted) 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0

SLI 14.2.3

Metrics to estimate the penetration and impact that e.g. Fake/Inauthentic accounts have on genuine users and report at the Member State level (including trends on audiences targeted; narratives used etc.).

TTPs covered by this action, selected from the list at the top of this chapter: This action covers the following TTPs in the context of coordinated inauthentic behaviour:
     Use of fake / inauthentic reactions (e.g., likes, upvotes, comments)
     Use of fake followers or subscribers
     Creation of inauthentic pages, groups, chat groups, fora, or domains
     Inauthentic coordination of content creation or amplification
     Account hijacking or impersonation

Methodology of data measurement: Coordinated inauthentic behaviour (CIB) covers particularly sophisticated forms of Inauthentic Behaviour where false identities are central to the operation and operators use adversarial methods to evade detection or appear authentic. When we investigate and remove these operations, we focus on behaviour rather than content — no matter who’s behind them, what they post or whether they’re foreign or domestic. We included below any network (1) originating in Europe or (2) targeting one or more European countries (effectively or potentially), removed from 01/07/2025 to 31/12/2025. We categorised them based on their originating country in the table below.

Poland:

- Penetration and impact on genuine users:

- Trends on targeted audiences: We observed that network operators consistently amplified narratives critical of Warsaw Mayor Rafal Trzaskowski and the current Polish government while promoting content favorable to the Polish Law and Justice (PiS) Party. The network employed sophisticated persona development tactics, creating fake accounts with carefully crafted political identities spanning the ideological spectrum, including both left-wing and right-wing personas as well as accounts focused on historical interests.

- Trends on narratives used:

Belarus:

- Penetration and impact on genuine users:

- Trends on targeted audiences: We observed that network operators strategically disseminated messaging focused on Poland's immigration policies and the country's relationships with the European Union and Ukraine.

- Trends on narratives used:

TTPs covered by this action, selected from the list at the top of this chapter: This action covers the following TTPs: 
  • Creation of inauthentic accounts or botnets (which may include automated, partially automated, or non-automated accounts)
  • Use of fake followers or subscribers
  • Creation of inauthentic pages, groups, chat groups, fora, or domains

Methodology of data measurement: the total number of accounts Facebook took action on for being fake accounts from 01/07/2025 to 31/12/2025, globally. This includes both accounts reported by users and accounts found proactively. More information here

Global Q3:

- Penetration and impact on genuine users:

- Trends on targeted audiences:

- Trends on narratives used:

Global Q4:

- Penetration and impact on genuine users:

- Trends on targeted audiences:

- Trends on narratives used:

Country TTP OR ACTION1 - Penetration and impact on genuine users TTP OR ACTION1 - Trends on targeted audiences TTP OR ACTION1 - Trends on narratives used TTP OR ACTION2 - Penetration and impact on genuine users TTP OR ACTION2 - Trends on targeted audiences TTP OR ACTION2 - Trends on narratives used TTP OR ACTION3 - Penetration and impact on genuine users TTP OR ACTION3 - Trends on targeted audiences TTP OR ACTION3 - Trends on narratives used TTP OR ACTION4 - Penetration and impact on genuine users TTP OR ACTION4 - Trends on targeted audiences TTP OR ACTION4 - Trends on narratives used TTP OR ACTION5 - Penetration and impact on genuine users TTP OR ACTION5 - Trends on targeted audiences TTP OR ACTION5 - Trends on narratives used TTP OR ACTION6 - Penetration and impact on genuine users TTP OR ACTION6 - Trends on targeted audiences TTP OR ACTION6 - Trends on narratives used TTP OR ACTION7 - Penetration and impact on genuine users TTP OR ACTION7 - Trends on targeted audiences TTP OR ACTION7 - Trends on narratives used TTP OR ACTION8 - Penetration and impact on genuine users TTP OR ACTION8 - Trends on targeted audiences TTP OR ACTION8 - Trends on narratives used TTP OR ACTION9 - Penetration and impact on genuine users TTP OR ACTION9 - Trends on targeted audiences TTP OR ACTION9 - Trends on narratives used TTP OR ACTION10 - Penetration and impact on genuine users TTP OR ACTION10 - Trends on targeted audiences TTP OR ACTION10 - Trends on narratives used TTP OR ACTION11 - Penetration and impact on genuine users TTP OR ACTION11 - Trends on targeted audiences TTP OR ACTION11 - Trends on narratives used TTP OR ACTION12 - Penetration and impact on genuine users TTP OR ACTION12 - Trends on targeted audiences TTP OR ACTION12 - Trends on narratives used
Austria 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
Belgium 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
Bulgaria 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
Croatia 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
Cyprus 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
Czech Republic 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
Denmark 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
Estonia 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
Finland 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
France 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
Germany 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
Greece 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
Hungary 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
Ireland 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
Italy 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
Latvia 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
Lithuania 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
Luxembourg 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
Malta 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
Netherlands 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
Poland 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
Portugal 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
Romania 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
Slovakia 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
Slovenia 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
Spain 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
Sweden 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
Iceland 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
Liechtenstein 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
Norway 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
Belarus
Global Q3
Global Q4

SLI 14.2.4

Estimation, at the Member State level, of TTPs related content, views/impressions and interaction/engagement with such content as a percentage of the total content, views/impressions and interaction/engagement on relevant signatories' service.

TTPs covered by this action, selected from the list at the top of this chapter: This action covers the following TTPs in the context of coordinated inauthentic behaviour:
  • Use of fake / inauthentic reactions (e.g., likes, upvotes, comments)
  • Use of fake followers or subscribers
  • Creation of inauthentic pages, groups, chat groups, fora, or domains
  • Inauthentic coordination of content creation or amplification
  • Account hijacking or impersonation

Methodology of data measurement: Coordinated inauthentic behaviour (CIB) covers particularly sophisticated forms of inauthentic behaviour where false identities are central to the operation and operators use adversarial methods to evade detection or appear authentic. When we investigate and remove these operations, we focus on behaviour rather than content, no matter who is behind them, what they post, or whether they are foreign or domestic. We included below any network (1) originating in Europe or (2) targeting (effectively or potentially) one or more European countries, removed between 01/07/2025 and 31/12/2025. We categorised these networks based on their originating country in the table below.

Poland:

- TTPs related content in relation to overall content on the service:

- Views/impressions of TTP related content (in relation to overall views/impressions on the service):

- Interaction/engagement with TTP related content (in relation to overall interaction/engagement on the service):

Belarus:

- TTPs related content in relation to overall content on the service:

- Views/impressions of TTP related content (in relation to overall views/impressions on the service):

- Interaction/engagement with TTP related content (in relation to overall interaction/engagement on the service):

TTPs covered by this action, selected from the list at the top of this chapter: This action covers the following TTPs: 
  • Creation of inauthentic accounts or botnets (which may include automated, partially automated, or non-automated accounts)
  • Use of fake followers or subscribers
  • Creation of inauthentic pages, groups, chat groups, fora, or domains

Methodology of data measurement: the total number of accounts Facebook took action on for being fake accounts from 01/07/2025 to 31/12/2025, globally. This includes both accounts reported by users and accounts found proactively. More information here

Global Q3:

- TTPs related content in relation to overall content on the service:

- Views/impressions of TTP related content (in relation to overall views/impressions on the service):

- Interaction/engagement with TTP related content (in relation to overall interaction/engagement on the service):

Global Q4:

- TTPs related content in relation to overall content on the service:

- Views/impressions of TTP related content (in relation to overall views/impressions on the service):

- Interaction/engagement with TTP related content (in relation to overall interaction/engagement on the service):

Country TTP OR ACTION1 - TTPs related content in relation to overall content TTP OR ACTION1 - Views of TTP content TTP OR ACTION1 - Engagement with TTP content TTP OR ACTION2 - TTPs related content in relation to overall content TTP OR ACTION2 - Views of TTP content TTP OR ACTION2 - Engagement with TTP content TTP OR ACTION3 - TTPs related content in relation to overall content TTP OR ACTION3 - Views of TTP content TTP OR ACTION3 - Engagement with TTP content TTP OR ACTION4 - TTPs related content in relation to overall content TTP OR ACTION4 - Views of TTP content TTP OR ACTION4 - Engagement with TTP content TTP OR ACTION5 - TTPs related content in relation to overall content TTP OR ACTION5 - Views of TTP content TTP OR ACTION5 - Engagement with TTP content TTP OR ACTION6 - TTPs related content in relation to overall content TTP OR ACTION6 - Views of TTP content TTP OR ACTION6 - Engagement with TTP content TTP OR ACTION7 - TTPs related content in relation to overall content TTP OR ACTION7 - Views of TTP content TTP OR ACTION7 - Engagement with TTP content TTP OR ACTION8 - TTPs related content in relation to overall content TTP OR ACTION8 - Views of TTP content TTP OR ACTION8 - Engagement with TTP content TTP OR ACTION9 - TTPs related content in relation to overall content TTP OR ACTION9 - Views of TTP content TTP OR ACTION9 - Engagement with TTP content TTP OR ACTION10 - TTPs related content in relation to overall content TTP OR ACTION10 - Views of TTP content TTP OR ACTION10 - Engagement with TTP content TTP OR ACTION11 - TTPs related content in relation to overall content TTP OR ACTION11 - Views of TTP content TTP OR ACTION11 - Engagement with TTP content TTP OR ACTION12 - TTPs related content in relation to overall content TTP OR ACTION12 - Views of TTP content TTP OR ACTION12 - Engagement with TTP content
Austria 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
Belgium 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
Bulgaria 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
Croatia 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
Cyprus 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
Czech Republic 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
Denmark 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
Estonia 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
Finland 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
France 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
Germany 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
Greece 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
Hungary 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
Ireland 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
Italy 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
Latvia 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
Lithuania 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
Luxembourg 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
Malta 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
Netherlands 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
Poland 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
Portugal 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
Romania 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
Slovakia 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
Slovenia 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
Spain 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
Sweden 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
Iceland 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
Liechtenstein 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
Norway 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0
Belarus
Global Q3
Global Q4

Measure 14.3

Relevant Signatories will convene via the Permanent Task-force to agree upon and publish a list and terminology of TTPs employed by malicious actors, which should be updated on an annual basis.

Facebook

QRE 14.3.1

Signatories will report on the list of TTPs agreed in the Permanent Task-force within 6 months of the signing of the Code and will update this list at least every year. They will also report about the common baseline elements, objectives and benchmarks for the policies and measures.

We continue to engage with this working group now that agreement on the list of TTPs has been reached (as reported in our baseline report), notably to discuss how we report on those TTPs under SLIs 14.2.1-14.2.4 above.

Commitment 15

Relevant Signatories that develop or operate AI systems and that disseminate AI-generated and manipulated content through their services (e.g. deepfakes) commit to take into consideration the transparency obligations and the list of manipulative practices prohibited under the proposal for Artificial Intelligence Act.

We signed up to the following measures of this commitment

Measure 15.1 Measure 15.2

In line with this commitment, did you deploy new implementation measures (e.g. changes to your terms of service, new tools, new policies, etc)?

No

If yes, list these implementation measures here

We recognize that the widespread availability and adoption of generative AI tools may have implications for how we identify and address disinformation on our platforms. We also acknowledge that, under the Artificial Intelligence Act (AIA), certain AI techniques are considered purposefully deceptive or manipulative if they impact people's behavior and decision-making abilities and are reasonably likely to cause significant harm.

We want people to know when they see posts that have been made with AI. In 2024, we announced a new approach for labeling AI-generated organic content. An important part of this approach relies on industry standard indicators that other companies include in content created using their tools, which help us assess whether something is created using AI.

We also rolled out a change to the “AI info” labels on our platforms so they better reflect the extent of AI used in content. Our intent has always been to help people know when they see content that was made with AI, and we’ve continued to work with companies across the industry to improve our labeling process so that labels on our platforms are more in line with people’s expectations.

For organic content that we detect was only modified or edited by AI tools, we moved the “AI info” label to the post’s menu. We still display the “AI info” label for content we detect was generated by an AI tool and share whether the content is labeled because of industry-shared signals or because someone self-disclosed. 

We place “AI Info” labels on ad creative images and videos using a risk-based framework. When an image or video is created or significantly edited with our generative AI creative features in our advertiser marketing tools, a label will appear in the three-dot menu or next to the “Sponsored” label. When these tools result in the inclusion of an AI-generated photorealistic human, the label will appear next to the Sponsored label (not behind the three-dot menu).  
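The placement rule in the preceding paragraph can be read as a simple conditional. The following is an illustrative sketch only, not Meta's actual implementation: the function name and boolean signals are hypothetical, and the default placement for the lower-risk case is an assumption drawn from the text.

```python
def ad_ai_label_placement(created_or_edited_with_gen_ai: bool,
                          contains_photorealistic_human: bool) -> str:
    """Hypothetical decision function for where the "AI Info" label
    appears on an ad creative, per the risk-based framework described
    in the text."""
    if not created_or_edited_with_gen_ai:
        # Ads untouched by the generative AI creative tools get no label
        # under this framework.
        return "no label"
    if contains_photorealistic_human:
        # Higher-risk case: the label is shown next to the "Sponsored"
        # label, not behind the three-dot menu.
        return "next to Sponsored label"
    # Lower-risk case: the label is available via the three-dot menu.
    return "three-dot menu"
```

The key design point the text describes is risk-based prominence: the same disclosure exists in both cases, but the higher-risk case (a photorealistic AI-generated human) surfaces it more visibly.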

We will continue to evolve our approach to labeling AI-generated content in partnership with experts, advertisers, policy stakeholders and industry partners as people’s expectations and the technologies change.

Do you plan to put further implementation measures in place in the next 6 months to substantially improve the maturity of the implementation of this commitment?

Yes

If yes, which further implementation measures do you plan to put in place in the next 6 months?

We remain committed to improving our ads transparency tools and searching for new ways to give people a better understanding of how we use data and technology to show them ads. Providing transparency around our home-grown generative AI tools is a first step on our ads generative AI transparency journey.

Measure 15.1

Relevant signatories will establish or confirm their policies in place for countering prohibited manipulative practices for AI systems that generate or manipulate content, such as warning users and proactively detect such content.

Facebook

QRE 15.1.1

In line with EU and national legislation, Relevant Signatories will report on their policies in place for countering prohibited manipulative practices for AI systems that generate or manipulate content.

We address potential abuses from AI-generated content in two primary ways: (1) we remove content that violates our Community Standards regardless of how it was generated; and (2) our third-party fact-checkers can rate content that is false and misleading regardless of how it was generated. 

In February 2024, Meta’s Oversight Board provided feedback regarding our approach to manipulated media, arguing that we unnecessarily risk restricting freedom of expression when we remove manipulated media that does not otherwise violate our Community Standards. It recommended a “less restrictive” approach to manipulated media, such as labels with context. 

We agree that providing transparency and additional context is now the better way to address this content. In May 2024 we began labelling AI-generated or AI-edited content (based on industry-aligned standards for identifying AI, as well as users self-declaring AI-influenced content) with the label “Made with AI”. While we work with companies across the industry to improve the process so our labelling approach better matches our intent, we’ve updated the “Made with AI” label to “AI info” across our apps, which people can click for more information. These labels cover a broader range of content in addition to the manipulated content that the Oversight Board recommended labelling in their feedback.

If we determine that digitally-created or altered images, video or audio create a particularly high risk of materially deceiving the public on a matter of importance, we may add a more prominent label so people have more information and context.

In H2 2024, we rolled out a change to the “AI info” labels on our platforms so they better reflect the extent of AI used in content. Our intent has always been to help people know when they see content that was made with AI, and we’ve continued to work with companies across the industry to improve our labeling process so that labels on our platforms are more in line with people’s expectations.

For content that we detect was only modified or edited by AI tools, we are moving the “AI info” label to the post’s menu. We will still display the “AI info” label for content we detect was generated by an AI tool and share whether the content is labeled because of industry-shared signals or because someone self-disclosed.

Measure 15.2

Relevant Signatories will establish or confirm their policies in place to ensure that the algorithms used for detection, moderation and sanctioning of impermissible conduct and content on their services are trustworthy, respect the rights of end-users and do not constitute prohibited manipulative practices impermissibly distorting their behaviour in line with Union and Member States legislation.

Facebook

QRE 15.2.1

Relevant Signatories will report on their policies and actions to ensure that the algorithms used for detection, moderation and sanctioning of impermissible conduct and content on their services are trustworthy, respect the rights of end-users and do not constitute prohibited manipulative practices in line with Union and Member States legislation.

Meta commits to continue investing in Responsible AI to address the hard questions around issues such as privacy, fairness, accountability, and transparency.
  • We display the “AI info” label for content we detect was generated by an AI tool and share whether the content is labeled because of industry-shared signals or because someone self-disclosed.

Commitment 16

Relevant Signatories commit to operate channels of exchange between their relevant teams in order to proactively share information about cross-platform influence operations, foreign interference in information space and relevant incidents that emerge on their respective services, with the aim of preventing dissemination and resurgence on other services, in full compliance with privacy legislation and with due consideration for security and human rights risks.

We signed up to the following measures of this commitment

Measure 16.1 Measure 16.2

In line with this commitment, did you deploy new implementation measures (e.g. changes to your terms of service, new tools, new policies, etc)?

No

If yes, list these implementation measures here

There have been no significant updates since the last submitted report.

Do you plan to put further implementation measures in place in the next 6 months to substantially improve the maturity of the implementation of this commitment?

No

If yes, which further implementation measures do you plan to put in place in the next 6 months?

As mentioned in our baseline report, our policies are based on years of experience and expertise in safety combined with external input from experts around the world. We are continuously working to protect the integrity of our platforms and adjusting our policies, tools, and processes to combat disinformation. 

Measure 16.1

Relevant Signatories will share relevant information about cross-platform information manipulation, foreign interference in information space and incidents that emerge on their respective services for instance via a dedicated sub-group of the permanent Task-force or via existing fora for exchanging such information.

Facebook

QRE 16.1.1

Relevant Signatories will disclose the fora they use for information sharing as well as information about learnings derived from this sharing.

As mentioned in our baseline report, a key part of our strategy to prevent interference is working with government authorities, law enforcement, security experts, civil society and other tech companies to stop emerging threats by establishing a direct line of communication, sharing knowledge and identifying opportunities for collaboration. 

In December 2025, we shared our Adversarial Threat Report with information on threat research into new covert influence operations that we took down. We detected and removed these campaigns before they were able to build authentic audiences on our apps. 

Poland
We disrupted a coordinated inauthentic behavior network originating in and targeting Poland. We actioned 55 Facebook accounts, 36 Pages, and 23 Groups for violating our policy against Coordinated Inauthentic Behavior. About 49,000 accounts followed one or more of these Pages, and about 1,100 accounts followed one or more of these Groups. The network did not engage in paid advertising, instead relying on organic content amplification strategies to reach target audiences.
Our investigation found direct links to an individual based in Poland, indicating a domestic operation seeking to influence local political conversations. We found this network following an internal investigation that identified sophisticated deceptive tactics designed to manipulate domestic political discourse.

Belarus
We disrupted a coordinated inauthentic behavior network originating in Belarus and targeting Polish audiences. Our internal investigation revealed links to Belarus and Russia, indicating a coordinated foreign influence campaign. We removed 4 Facebook accounts and 12 Pages for violating our policy against Coordinated Inauthentic Behavior. About 200 accounts followed one or more of these Pages. Network operators spent around $1,800 on ads on Facebook and Instagram, paid for mostly in Polish zlotys and US dollars, to amplify their content and expand their reach with targeted audiences.


SLI 16.1.1

Number of actions taken as a result of the collaboration and information sharing between signatories. Where they have such information, they will specify which Member States that were affected (including information about the content being detected and acted upon due to this collaboration).

N/A

Measure 16.2

Relevant Signatories will pay specific attention to and share information on the tactical migration of known actors of misinformation, disinformation and information manipulation across different platforms as a way to circumvent moderation policies, engage different audiences or coordinate action on platforms with less scrutiny and policy bandwidth.

Facebook

QRE 16.2.1

As a result of the collaboration and information sharing between them, Relevant Signatories will share qualitative examples and case studies of migration tactics employed and advertised by such actors on their platforms as observed by their moderation team and/or external partners from Academia or fact-checking organisations engaged in such monitoring.

We publish our Adversarial Threat Reports to share notable trends and investigations to help inform our community’s understanding of the evolving security threats we see. 

Empowering Users

Commitment 17

In light of the European Commission's initiatives in the area of media literacy, including the new Digital Education Action Plan, Relevant Signatories commit to continue and strengthen their efforts in the area of media literacy and critical thinking, also with the aim to include vulnerable groups.

We signed up to the following measures of this commitment

Measure 17.1 Measure 17.2 Measure 17.3

In line with this commitment, did you deploy new implementation measures (e.g. changes to your terms of service, new tools, new policies, etc)?

Yes

If yes, list these implementation measures here

As mentioned in our baseline report, a key part of our approach to combating misinformation is providing tools and products that contribute to a more resilient digital society, where people are able to critically evaluate information, make informed decisions about the content they see, and self-correct. Below are some examples of that work relevant to the European Union.

  • Meta published its first Media Literacy Annual Plan on 21 July 2025, which set out its current approach to media literacy and the products and features we make available to users of Facebook and Instagram. 
  • In 2025, Meta launched a campaign that ran in Ireland, France, Spain, Italy and the Netherlands, which aimed to increase awareness of new tools available on Instagram to protect youth well-being. These tools included private accounts, additional messaging and sensitive content restrictions, time limit reminders and sleep mode.
  • As part of our global anti-scam awareness campaign to protect people online, we share relevant product tools across Facebook. Additionally, we released new research on romance scams occurring across the internet, along with updates on our enforcement actions targeting scammers who impersonate military personnel and other individuals.
  • In 2025, Meta rolled out a youth-focused campaign across eight EU countries (France, Italy, Belgium, Denmark, Germany, Spain, Ireland, and Greece), running from late September through late November, to highlight support for parental approval for teens accessing online services.

Do you plan to put further implementation measures in place in the next 6 months to substantially improve the maturity of the implementation of this commitment?

No

If yes, which further implementation measures do you plan to put in place in the next 6 months?

N/A

Measure 17.1

Relevant Signatories will design and implement or continue to maintain tools to improve media literacy and critical thinking, for instance by empowering users with context on the content visible on services or with guidance on how to evaluate online content.

Facebook

QRE 17.1.1

Relevant Signatories will outline the tools they develop or maintain that are relevant to this commitment and report on their deployment in each Member State.

As mentioned in our baseline report, we have developed over the years a series of tools and resources - such as online tutorials, lesson plans for educators, tips for spotting false news, and awareness-raising ad campaigns - to educate and equip people with the necessary skills for navigating the digital world. 

A key pillar of our strategy is to inform our users: by providing people with specific and relevant context when they come across a flagged post, we can help them be more informed about what they see and read. Here are some ways we provide context on relevant pieces of content that may be sensitive or misleading:
  • Warning screens on sensitive content on Facebook: 
    • People value the ability to discuss important and often difficult issues online, but they also have different sensitivities to certain kinds of content. Therefore, we include a warning screen over potentially sensitive content on Facebook, such as:
      • Violent or graphic imagery.
      • Posts that contain descriptions of bullying or harassment, if shared to raise awareness.
      • Some forms of nudity.
      • Posts related to suicide or suicide attempts.
  • Verified badges on Facebook: 
    • Our goal is to help people feel confident about the content and accounts that they interact with. 
    • To combat impersonations and help people avoid scammers that pretend to be high-profile people, Meta provides verified badges on Pages and profiles that indicate a verified account. This means that we've confirmed the authentic presence of the public figure, celebrity or global brand that the account represents.
  • Notification screens on outdated articles on the Facebook app: 
    • Our goal is to make it easier for people to identify content that's timely, reliable and most valuable to them.
    • To give people more context about a news article before they share it on Facebook, Meta includes a notification screen if the article is more than 90 days old, after which people can still choose to share it. This notification helps people understand how old a given news article is and its source.
    • To ensure that we don't slow the spread of credible information, especially in the health space, content posted by government health authorities and recognised global health organisations does not have this notification screen.
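The 90-day rule and the health-authority exemption described above amount to a small conditional check. The sketch below is illustrative only: the 90-day threshold and the exemption for health authorities come from the text, while the function name, source-type strings, and signature are hypothetical.

```python
from datetime import date, timedelta

# Source types exempt from the outdated-article notice, per the text.
# The exact strings are assumptions for illustration.
EXEMPT_SOURCES = {
    "government health authority",
    "recognised global health organisation",
}

def needs_outdated_notice(published: date, source_type: str, today: date) -> bool:
    """Hypothetical check for whether sharing an article triggers the
    outdated-article notification screen."""
    if source_type in EXEMPT_SOURCES:
        # Credible health information is deliberately not slowed down.
        return False
    # Articles older than 90 days get the notification screen.
    return (today - published) > timedelta(days=90)
```

For example, under this sketch an ordinary news article published on 1 January and shared on 1 June would trigger the notice, while the same-aged post from a government health authority would not.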

SLI 17.1.1

Relevant Signatories will report, at the Member State level, on metrics pertinent to assessing the effects of the tools described in the qualitative reporting element for Measure 17.1, which will include: the total count of impressions of the tool; and information on the interactions/engagement with the tool.

We were not able to deliver this SLI for this report.

Country Total count of the tool’s impressions Interactions/ engagement with the tool Other relevant metrics
Austria 0 0 0
Belgium 0 0 0
Bulgaria 0 0 0
Croatia 0 0 0
Cyprus 0 0 0
Czech Republic 0 0 0
Denmark 0 0 0
Estonia 0 0 0
Finland 0 0 0
France 0 0 0
Germany 0 0 0
Greece 0 0 0
Hungary 0 0 0
Ireland 0 0 0
Italy 0 0 0
Latvia 0 0 0
Lithuania 0 0 0
Luxembourg 0 0 0
Malta 0 0 0
Netherlands 0 0 0
Poland 0 0 0
Portugal 0 0 0
Romania 0 0 0
Slovakia 0 0 0
Slovenia 0 0 0
Spain 0 0 0
Sweden 0 0 0
Iceland 0 0 0
Liechtenstein 0 0 0
Norway 0 0 0

Measure 17.2

Relevant Signatories will develop, promote and/or support or continue to run activities to improve media literacy and critical thinking such as campaigns to raise awareness about Disinformation, as well as the TTPs that are being used by malicious actors, among the general public across the European Union, also considering the involvement of vulnerable communities.

Facebook

QRE 17.2.1

Relevant Signatories will describe the activities they launch or support and the Member States they target and reach. Relevant signatories will further report on actions taken to promote the campaigns to their user base per Member States targeted.

National Elections:
We proactively point users to reliable information on the electoral process through in-app ‘Election Day Information’. These are notices at the top of feed on Facebook, reminding people of the day they can vote and redirecting them to national authoritative sources on how and where to vote.
For more information, please refer to the Elections chapter.

SLI 17.2.1

Relevant Signatories report on the number of media literacy and awareness raising activities organised and/or participated in and will share quantitative information pertinent to show the effects of the campaigns they build or support at the Member State level.

Please refer to the National Elections chapter for election-related statistics.


Country Nr of media literacy/ awareness raising activities organised/ participated in Reach of campaigns Nr of participants Nr of interactions with online assets Nr of participants (etc)
Austria 0 0 0 0 0
Belgium 0 0 0 0 0
Bulgaria 0 0 0 0 0
Croatia 0 0 0 0 0
Cyprus 0 0 0 0 0
Czech Republic 0 0 0 0 0
Denmark 0 0 0 0 0
Estonia 0 0 0 0 0
Finland 0 0 0 0 0
France 0 0 0 0 0
Germany 0 0 0 0 0
Greece 0 0 0 0 0
Hungary 0 0 0 0 0
Ireland 0 0 0 0 0
Italy 0 0 0 0 0
Latvia 0 0 0 0 0
Lithuania 0 0 0 0 0
Luxembourg 0 0 0 0 0
Malta 0 0 0 0 0
Netherlands 0 0 0 0 0
Poland 0 0 0 0 0
Portugal 0 0 0 0 0
Romania 0 0 0 0 0
Slovakia 0 0 0 0 0
Slovenia 0 0 0 0 0
Spain 0 0 0 0 0
Sweden 0 0 0 0 0
Iceland 0 0 0 0 0
Liechtenstein 0 0 0 0 0
Norway 0 0 0 0 0

Measure 17.3

For both of the above Measures, and in order to build on the expertise of media literacy experts in the design, implementation, and impact measurement of tools, relevant Signatories will partner or consult with media literacy experts in the EU, including for instance the Commission's Media Literacy Expert Group, ERGA's Media Literacy Action Group, EDMO, its country-specific branches, or relevant Member State universities or organisations that have relevant expertise.

Facebook

QRE 17.3.1

Relevant Signatories will describe how they involved and partnered with media literacy experts for the purposes of all Measures in this Commitment.

As mentioned in our baseline report, working in partnership with experts, educators, civil society and governments around the world is central to Meta's digital citizenship efforts. Our partners bring valuable subject-matter expertise and are also important channels for distributing these tools and resources to a broader audience. Partners we work with include various government bodies (such as ministries of education and media regulators), our global network of third-party fact-checkers, parent-teacher associations, the European Association for Viewers Interests (EAVI), the UNESCO Institute for Information Technologies in Education (UNESCO IITE), Yale University, Harvard University, the Micro:bit Educational Foundation, and many more.

Meta also belongs to the Steering Committee of the EU Digital Citizenship working group, launched in December 2020 to contribute multidisciplinary expertise from civil society and industry to the current EU debate on digital citizenship.

Commitment 18

Relevant Signatories commit to minimise the risks of viral propagation of Disinformation by adopting safe design practices as they develop their systems, policies, and features.

We signed up to the following measures of this commitment

Measure 18.2 Measure 18.3

In line with this commitment, did you deploy new implementation measures (e.g. changes to your terms of service, new tools, new policies, etc)?

No

If yes, list these implementation measures here

There have been no significant updates since the last submitted report.

Do you plan to put further implementation measures in place in the next 6 months to substantially improve the maturity of the implementation of this commitment?

No

If yes, which further implementation measures do you plan to put in place in the next 6 months?

N/A

Measure 18.2

Relevant Signatories will develop and enforce publicly documented, proportionate policies to limit the spread of harmful false or misleading information (as depends on the service, such as prohibiting, downranking, or not recommending harmful false or misleading information, adapted to the severity of the impacts and with due regard to freedom of expression and information); and take action on webpages or actors that persistently violate these policies.

Facebook

QRE 18.2.1

Relevant Signatories will report on the policies or terms of service that are relevant to Measure 18.2 and on their approach towards persistent violations of these policies.

As mentioned in our baseline report, our policies and approach to tackling misinformation are published in our Transparency Centre.

These include specific actions taken against actors that repeatedly share misinformation. We take action against Pages, groups, accounts and domains that repeatedly share or publish content that is rated False or Altered, content near-identical to what fact-checkers have debunked as False or Altered, or content we enforce against under our policy on vaccine misinformation. If Pages, groups, accounts or websites repeatedly share such content, they will see their distribution reduced.

Our penalty system to restrict accounts that violate our Community Standards on the platform can be found here. For most violations, the user’s first strike will result in a warning with no further restrictions. If Meta removes additional posts that go against the Community Standards in the future, we'll apply additional strikes to the account, and the user may lose access to some features for longer periods of time.

These restrictions generally only apply to Facebook accounts, but they may also be extended to Pages that represent an individual, such as a celebrity or political figure. (Note that while we count strikes on both Facebook and Instagram, these restrictions only apply to Facebook accounts).

If content that users have posted goes against our more severe policies, such as our policy on dangerous individuals and organisations or adult sexual exploitation, the user may receive additional, longer restrictions from certain features.

For most violations, if the user continues to post content that goes against the Community Standards after repeated warnings and restrictions, we will disable the account.

These policies apply across all EU Member States.

SLI 18.2.1

Relevant Signatories will report on actions taken in response to violations of policies relevant to Measure 18.2, at the Member State level. The metrics shall include: Total number of violations and Meaningful metrics to measure the impact of these actions (such as their impact on the visibility of or the engagement with content that was actioned upon).

Number of unique contents that were removed from Facebook for violating our harmful health misinformation, inauthentic behaviour, or voter or census interference policies in EEA Member States from 01/07/2025 to 31/12/2025.

Country is determined by the inferred location of the user responsible for the content.

Country Removal actions taken in response to policy violations Demotion actions taken in response to likely misinformation Metric 2: indicating the impact of the action taken Metric 3: indicating the impact of the action taken
Austria Over 41,000 Over 82,000 0 0
Belgium Over 60,000 Over 110,000 0 0
Bulgaria Over 170,000 Over 440,000 0 0
Croatia Over 36,000 Over 120,000 0 0
Cyprus Over 11,000 Over 28,000 0 0
Czech Republic Over 100,000 Over 230,000 0 0
Denmark Over 20,000 Over 40,000 0 0
Estonia Over 7,000 Over 24,000 0 0
Finland Over 8,900 Over 17,000 0 0
France Over 420,000 Over 1,000,000 0 0
Germany Over 320,000 Over 700,000 0 0
Greece Over 160,000 Over 350,000 0 0
Hungary Over 26,000 Over 61,000 0 0
Ireland Over 26,000 Over 61,000 0 0
Italy Over 680,000 Over 1,000,000 0 0
Latvia Over 24,000 Over 58,000 0 0
Lithuania Over 22,000 Over 61,000 0 0
Luxembourg Over 4,600 Over 8,700 0 0
Malta Over 3,300 Over 5,500 0 0
Netherlands Over 59,000 Over 97,000 0 0
Poland Over 260,000 Over 600,000 0 0
Portugal Over 75,000 Over 210,000 0 0
Romania Over 380,000 Over 610,000 0 0
Slovakia Over 83,000 Over 160,000 0 0
Slovenia Over 13,000 Over 32,000 0 0
Spain Over 380,000 Over 840,000 0 0
Sweden Over 37,000 Over 78,000 0 0
Iceland Over 1,500 Over 3,100 0 0
Liechtenstein Less than 100 Over 170 0 0
Norway Over 18,000 Over 26,000 0 0

Measure 18.3

Relevant Signatories will invest and/or participate in research efforts on the spread of harmful Disinformation online and related safe design practices, will make findings available to the public or report on those to the Code's taskforce. They will disclose and discuss findings within the permanent Task-force, and explain how they intend to use these findings to improve existing safe design practices and features or develop new ones.

Facebook

QRE 18.3.1

Relevant Signatories will describe research efforts, both in-house and in partnership with third-party organisations, on the spread of harmful Disinformation online and relevant safe design practices, as well as actions or changes as a result of this research. Relevant Signatories will include where possible information on financial investments in said research. Wherever possible, they will make their findings available to the general public.

As noted in our baseline report, we have supported several key initiatives to empower the independent research community and to better understand what our users want, need and expect, such as Social Science Research, AI for Good, and the Influence Operations Research Archive for coordinated inauthentic behaviour (CIB) network disruptions.

Research Grants & Awards: As mentioned in our baseline report, every year we invest in numerous research projects as part of our overall efforts to make the internet, and the people on our platforms, safer and more secure. Details of our most recent awards can be found here.

Commitment 19

Relevant Signatories using recommender systems commit to make them transparent to the recipients regarding the main criteria and parameters used for prioritising or deprioritising information, and provide options to users about recommender systems, and make available information on those options.

We signed up to the following measures of this commitment

Measure 19.1 Measure 19.2

In line with this commitment, did you deploy new implementation measures (e.g. changes to your terms of service, new tools, new policies, etc)?

No

If yes, list these implementation measures here

There have been no significant updates since the last submitted report.

Do you plan to put further implementation measures in place in the next 6 months to substantially improve the maturity of the implementation of this commitment?

No

If yes, which further implementation measures do you plan to put in place in the next 6 months?

N/A

Measure 19.1

Relevant Signatories will make available to their users, including through the Transparency Centre and in their terms and conditions, in a clear, accessible and easily comprehensible manner, information outlining the main parameters their recommender systems employ.

Facebook

QRE 19.1.1

Relevant Signatories will provide details of the policies and measures put in place to implement the above-mentioned measures accessible to EU users, especially by publishing information outlining the main parameters their recommender systems employ in this regard. This information should also be included in the Transparency Centre.

The range of measures and policies put in place in relation to this measure has been described in previous reports and is explained in greater detail on Meta’s Transparency Centre. For example, the Transparency Centre hosts detailed explanations of Facebook System Cards, which help people understand how AI shapes their product experiences.

The policies outlined apply across all EU Member States.

Measure 19.2

Relevant Signatories will provide options for the recipients of the service to select and to modify at any time their preferred options for relevant recommender systems, including giving users transparency about those options.

Facebook

SLI 19.2.1

Relevant Signatories will provide aggregated information on effective user settings, such as the number of times users have actively engaged with these settings within the reporting period or over a sample representative timeframe, and clearly denote shifts in configuration patterns.

We were not able to deliver this SLI for this report.

Country No of times users actively engaged with these settings
Austria 0
Belgium 0
Bulgaria 0
Croatia 0
Cyprus 0
Czech Republic 0
Denmark 0
Estonia 0
Finland 0
France 0
Germany 0
Greece 0
Hungary 0
Ireland 0
Italy 0
Latvia 0
Lithuania 0
Luxembourg 0
Malta 0
Netherlands 0
Poland 0
Portugal 0
Romania 0
Slovakia 0
Slovenia 0
Spain 0
Sweden 0
Iceland 0
Liechtenstein 0
Norway 0

Commitment 21

Relevant Signatories commit to strengthen their efforts to better equip users to identify Disinformation. In particular, in order to enable users to navigate services in an informed way, Relevant Signatories commit to facilitate, across all Member States languages in which their services are provided, user access to tools for assessing the factual accuracy of sources through fact-checks from fact-checking organisations that have flagged potential Disinformation, as well as warning labels from other authoritative sources.

We signed up to the following measures of this commitment

Measure 21.3

In line with this commitment, did you deploy new implementation measures (e.g. changes to your terms of service, new tools, new policies, etc)?

No

If yes, list these implementation measures here

There have been no significant updates since the last submitted report.

Do you plan to put further implementation measures in place in the next 6 months to substantially improve the maturity of the implementation of this commitment?

No

If yes, which further implementation measures do you plan to put in place in the next 6 months?

N/A

Commitment 23

Relevant Signatories commit to provide users with the functionality to flag harmful false and/or misleading information that violates Signatories policies or terms of service.

We signed up to the following measures of this commitment

Measure 23.1 Measure 23.2

In line with this commitment, did you deploy new implementation measures (e.g. changes to your terms of service, new tools, new policies, etc)?

No

If yes, list these implementation measures here

There have been no significant updates since the last submitted report.

Do you plan to put further implementation measures in place in the next 6 months to substantially improve the maturity of the implementation of this commitment?

No

If yes, which further implementation measures do you plan to put in place in the next 6 months?

N/A

Measure 23.1

Relevant Signatories will develop or continue to make available on all their services and in all Member States languages in which their services are provided a user-friendly functionality for users to flag harmful false and/or misleading information that violates Signatories' policies or terms of service. The functionality should lead to appropriate, proportionate and consistent follow-up actions, in full respect of the freedom of expression.

Facebook

QRE 23.1.1

Relevant Signatories will report on the availability of flagging systems for their policies related to harmful false and/or misleading information across EU Member States and specify the different steps that are required to trigger the systems.

As mentioned in our baseline report, users can report content that they have specifically identified as false information through the process outlined on our website.

We also provide an appeal system. More details about these systems can be found in our baseline report. 

Measure 23.2

Relevant Signatories will take the necessary measures to ensure that this functionality is duly protected from human or machine-based abuse (e.g., the tactic of 'mass-flagging' to silence other voices).

Facebook

QRE 23.2.1

Relevant Signatories will report on the general measures they take to ensure the integrity of their reporting and appeals systems, while steering clear of disclosing information that would help would-be abusers find and exploit vulnerabilities in their defences.

Meta’s processes include measures to uphold the integrity of our reporting and appeals systems. 

Mass reporting: We do not remove pieces of content based on the number of reports we receive. If a piece of content violates our Community Standards, one report is enough for us to remove it. If it does not violate our Community Standards, the number of reports will not lead to the content being removed, no matter how high.

Because of the volume of content we review across our platforms, we always need to prioritise cases for our content moderators, and we do that based on severity and virality. The number of reports does not impact response times or enforcement decisions.

Protection against misuse: We may suspend the processing of notices and complaints submitted through our notice and complaints mechanisms, for a limited period of time, where individuals and entities have, after being warned, frequently submitted notices and complaints that are manifestly unfounded.

Anonymous reporting: When something gets reported to Facebook, we'll review it and take action on anything we determine doesn't follow our Community Standards. Unless a user is reporting an incident of intellectual property infringement, their report will be kept confidential and the account that was reported won’t see who reported them.

Commitment 24

Relevant Signatories commit to inform users whose content or accounts has been subject to enforcement actions (content/accounts labelled, demoted or otherwise enforced on) taken on the basis of violation of policies relevant to this section (as outlined in Measure 18.2), and provide them with the possibility to appeal against the enforcement action at issue and to handle complaints in a timely, diligent, transparent, and objective manner and to reverse the action without undue delay where the complaint is deemed to be founded.

We signed up to the following measures of this commitment

Measure 24.1

In line with this commitment, did you deploy new implementation measures (e.g. changes to your terms of service, new tools, new policies, etc)?

No

If yes, list these implementation measures here

There have been no significant updates since the last submitted report.

Do you plan to put further implementation measures in place in the next 6 months to substantially improve the maturity of the implementation of this commitment?

No

If yes, which further implementation measures do you plan to put in place in the next 6 months?

N/A

Measure 24.1

Relevant Signatories commit to provide users with information on why particular content or accounts have been labelled, demoted, or otherwise enforced on, on the basis of violation of policies relevant to this section, as well as the basis for such enforcement action, and the possibility for them to appeal through a transparent mechanism.

Facebook

QRE 24.1.1

Relevant Signatories will report on the availability of their notification and appeals systems across Member States and languages and provide details on the steps of the appeals procedure.

When we remove a piece of content, we let the user know that something they posted goes against our Community Standards. Moreover, we are transparent with users when their content is fact-checked, and have an appeals process in place for users who wish to issue a correction or dispute a rating with a fact-checker.

Appeal procedures are outlined under QRE 23.1.1. 

SLI 24.1.1

Relevant Signatories provide information on the number and nature of enforcement actions for policies described in response to Measure 18.2, the numbers of such actions that were subsequently appealed, the results of these appeals, information, and to the extent possible metrics, providing insight into the duration or effectiveness of processing of appeals process, and publish this information on the Transparency Centre.

Number of unique contents that were removed from Facebook for violating our harmful health misinformation, inauthentic behaviour, or voter or census interference policies in EEA Member States from 01/07/2025 to 31/12/2025.

Number of such removed contents that were subsequently appealed.


Country Number of unique contents removed Number of removed contents later appealed Metrics on results of appeals Metrics on the duration and effectiveness of the appeal process
Austria Over 41,000 Less than 100 0 0
Belgium Over 60,000 Less than 100 0 0
Bulgaria Over 170,000 Less than 100 0 0
Croatia Over 36,000 Less than 100 0 0
Cyprus Over 11,000 Less than 100 0 0
Czech Republic Over 100,000 Less than 100 0 0
Denmark Over 20,000 Less than 100 0 0
Estonia Over 7,000 Less than 100 0 0
Finland Over 8,900 Less than 100 0 0
France Over 420,000 Over 330 0 0
Germany Over 320,000 Over 710 0 0
Greece Over 160,000 Less than 100 0 0
Hungary Over 26,000 Less than 100 0 0
Ireland Over 26,000 Less than 100 0 0
Italy Over 680,000 Over 370 0 0
Latvia Over 24,000 Less than 100 0 0
Lithuania Over 22,000 Less than 100 0 0
Luxembourg Over 4,600 Less than 100 0 0
Malta Over 3,300 Less than 100 0 0
Netherlands Over 59,000 Over 220 0 0
Poland Over 260,000 Over 180 0 0
Portugal Over 75,000 Less than 100 0 0
Romania Over 380,000 Less than 100 0 0
Slovakia Over 83,000 Less than 100 0 0
Slovenia Over 13,000 Less than 100 0 0
Spain Over 380,000 Over 180 0 0
Sweden Over 37,000 Over 170 0 0
Iceland Over 1,500 Less than 100 0 0
Liechtenstein Less than 100 Less than 100 0 0
Norway Over 18,000 Less than 100 0 0

Empowering Researchers

Commitment 26

Relevant Signatories commit to provide access, wherever safe and practicable, to continuous, real-time or near real-time, searchable stable access to non-personal data and anonymised, aggregated, or manifestly-made public data for research purposes on Disinformation through automated means such as APIs or other open and accessible technical solutions allowing the analysis of said data.

We signed up to the following measures of this commitment

Measure 26.1 Measure 26.2 Measure 26.3

In line with this commitment, did you deploy new implementation measures (e.g. changes to your terms of service, new tools, new policies, etc)?

Yes

If yes, list these implementation measures here

As mentioned in our previous reports, Meta rolled out the Content Library and API tools to provide access to near real-time public content on Facebook. Details about the content, such as the number of reactions, shares, comments and, for the first time, post view counts, are also available. Researchers can search, explore and filter that content on a graphical user interface (UI) or through a programmatic API.

Together, these tools provide comprehensive access to publicly accessible content across Facebook and Instagram.

Individuals, including journalists, affiliated with qualified institutions pursuing scientific or public interest research topics can apply for access to these tools through partners with deep expertise in secure data sharing for research, starting with the University of Michigan’s Inter-university Consortium for Political and Social Research. This was a first-of-its-kind partnership that enabled researchers to analyse data from the API in ICPSR’s Social Media Archives (SOMAR) Virtual Data Enclave.

Furthermore, in December 2025, Meta launched a partnership with the Secure Data Access Center (CASD, Le Centre d’Accès Sécurisé aux Données), an organisation renowned for facilitating responsible data access for researchers worldwide. As part of our collaboration, CASD independently reviews research proposals for access to Meta Content Library. We also launched a new Meta-hosted application portal, Research Tools Manager, to enhance the onboarding and support experience for both new applicants and existing researchers.

In addition, researchers are now able to choose between accessing the Meta Content Library API on the SOMAR Virtual Data Enclave or on the Meta Secure Research Environment (formerly known as Researcher Platform).

Note that ICPSR no longer reviews Meta Content Library applications as of December 2025, but they continue to host the Meta Content Library API in the SOMAR Virtual Data Enclave.

Additionally, we made updates to the Meta Research Tools Terms and Conditions.

Meta continues to publish reports with relevant data regarding content on Facebook via its Transparency Centre, where we have shared our 2025 reports.

Do you plan to put further implementation measures in place in the next 6 months to substantially improve the maturity of the implementation of this commitment?

Yes

If yes, which further implementation measures do you plan to put in place in the next 6 months?

We are in the process of adding new features and functionality to Meta Content Library, including improvements to the application process for access to the research tools. We also regularly seek feedback from the research community on critical updates.

Measure 26.1

Relevant Signatories will provide public access to non-personal data and anonymised, aggregated or manifestly-made public data pertinent to undertaking research on Disinformation on their services, such as engagement and impressions (views) of content hosted by their services, with reasonable safeguards to address risks of abuse (e.g. API policies prohibiting malicious or commercial uses).

Facebook

QRE 26.1.1

Relevant Signatories will describe the tools and processes in place to provide public access to non-personal data and anonymised, aggregated and manifestly-made public data pertinent to undertaking research on Disinformation, as well as the safeguards in place to address risks of abuse.

As mentioned in our baseline report, we publish a wide range of regular reports on our Transparency Centre, which give our community visibility into how we enforce our policies and respond to requests: https://transparency.fb.com/data/. We also publish extensive reports on our findings about coordinated behaviour in our newsroom, and we have a dedicated public website hosting our Ad Library tools.

QRE 26.1.2

Relevant Signatories will publish information related to data points available via Measure 26.1, as well as details regarding the technical protocols to be used to access these data points, in the relevant help centre. This information should also be reachable from the Transparency Centre. At minimum, this information will include definitions of the data points available, technical and methodological information about how they were created, and information about the representativeness of the data.

Ad Library Tools: The dedicated Ad Library website allows users to search all of the ads currently running across Meta technologies. For every ad currently running, it shows the ad content and basic information, such as when the ad started running and which advertiser is running it. For ads that have run anywhere in the European Union in the past year, it includes additional transparency specific to the EU. For ads about social issues, elections or politics that have run in the past seven years, it shows the ad content, the basic information, and additional transparency about spend, reach and funding entities.

As mentioned in our baseline report, we publish numerous reports on our Transparency Centre:
  • Community Standards Enforcement Report: We publish this report publicly in our Transparency Centre on a quarterly basis to more effectively track our progress and demonstrate our continued commitment to making our services safe and inclusive. The report shares metrics on how we are doing at preventing and taking action on content that goes against our Community Standards (against 14 policies on Facebook). 
  • Adversarial Threat Report: We share publicly our findings about coordinated inauthentic behaviour (CIB) and other networks we detect and remove from our platforms. As part of our Adversarial Threat Reports, we publish information about networks we take down to make it easier for people to see the progress we’re making in one place.

SLI 26.1.1

Relevant Signatories will provide quantitative information on the uptake of the tools and processes described in Measure 26.1, such as number of users.

As of 31 December 2025, over 1,200 researchers globally had access to the Meta Content Library user interface and/or programmatic API.

Country Nr of users of public access Other quantitative information on public access
Austria 0 0
Belgium 0 0
Bulgaria 0 0
Croatia 0 0
Cyprus 0 0
Czech Republic 0 0
Denmark 0 0
Estonia 0 0
Finland 0 0
France 0 0
Germany 0 0
Greece 0 0
Hungary 0 0
Ireland 0 0
Italy 0 0
Latvia 0 0
Lithuania 0 0
Luxembourg 0 0
Malta 0 0
Netherlands 0 0
Poland 0 0
Portugal 0 0
Romania 0 0
Slovakia 0 0
Slovenia 0 0
Spain 0 0
Sweden 0 0
Iceland 0 0
Liechtenstein 0 0
Norway 0 0

Measure 26.2

Relevant Signatories will provide real-time or near real-time, machine-readable access to non-personal data and anonymised, aggregated or manifestly-made public data on their service for research purposes, such as accounts belonging to public figures such as elected official, news outlets and government accounts subject to an application process which is not overly cumbersome.

Facebook

QRE 26.2.1

Relevant Signatories will describe the tools and processes in place to provide real-time or near real-time access to non-personal data and anonymised, aggregated and manifestly-made public data for research purposes as described in Measure 26.2.

Meta Content Library includes public posts and data on Facebook. Data from the Library can be searched, explored, and filtered on a graphical UI or through a programmatic API. 

Meta Content Library is a web-based, controlled-access environment where researchers can perform deeper analysis of the public content by using the Content Library API in a secure clean-room environment:

  • Searching and filtering: Searching public posts across Facebook and Instagram is easy with comprehensive sorting and filtering options. Post results can be filtered by language, view count, media type, content producer and more.
  • Multimedia: Photos, videos and reels are available for dynamic search, exploration and analysis.
  • Producer lists: customizable collections of content producers can be used to refine search results. Researchers can apply custom producer lists to a search query to surface public content from specific content owners on Facebook or Instagram.

Content Library API allows programmatic queries of the data and is designed for computational researchers. Data pulled from the API can be analysed in a secure platform: 

  • Endpoints and data fields: With 8 dedicated endpoints, the Content Library API can search across over 100 data fields from Facebook Pages, posts, groups, events, and a subset of personal accounts.
  • Search indexing and results: Powerful search capabilities can return up to 100,000 results per query.
  • Asynchronous search: Queries can run in the background while a researcher works on other tasks. Query progress is monitored and tracked by the API.
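The asynchronous-search workflow described above (submit a query, let it run in the background, then retrieve up to 100,000 results) follows a common submit/poll/fetch pattern. The sketch below is purely illustrative: the stub client and all of its method names are invented for this example and do not reflect the real Content Library API surface.

```python
# Illustrative stub of a submit/poll/fetch asynchronous-search lifecycle.
# StubSearchClient and its methods are invented for this sketch; they do
# NOT reflect the real Content Library API.

class StubSearchClient:
    MAX_RESULTS_PER_QUERY = 100_000  # per-query result cap noted above

    def __init__(self):
        self._jobs = {}

    def submit_query(self, text):
        # A real client would POST the query and receive a job identifier.
        job_id = f"job-{len(self._jobs) + 1}"
        self._jobs[job_id] = {"status": "running", "query": text, "results": None}
        return job_id

    def poll(self, job_id):
        # Simulate the background job completing on the first status check.
        job = self._jobs[job_id]
        if job["status"] == "running":
            job["status"] = "complete"
            job["results"] = [{"post_id": i, "matched": job["query"]} for i in range(3)]
        return job["status"]

    def fetch_results(self, job_id):
        job = self._jobs[job_id]
        return job["results"][: self.MAX_RESULTS_PER_QUERY]

client = StubSearchClient()
job_id = client.submit_query("example topic")
while client.poll(job_id) != "complete":
    pass  # a researcher could work on other tasks while the query runs
results = client.fetch_results(job_id)
print(len(results))  # → 3
```

The key design point mirrored here is that the client never blocks on the query itself; progress is tracked on the server side, and results are fetched only once the job reports completion.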

For more details - see here

QRE 26.2.2

Relevant Signatories will describe the scope of manifestly-made public data as applicable to their services.

Meta Content Library and API provide near real-time public content from Facebook and Instagram. Details about the content, such as the post owner and the number of reactions and shares, are also available: 

  • Posts shared to and information about Pages, groups, events, and a subset of personal accounts.
  • Available for most countries and territories, but excluded from countries where Meta is still evaluating legal and compliance requirements.
  • The number of times a post or reel was displayed on screen.

For more details - see here

QRE 26.2.3

Relevant Signatories will describe the application process in place in order to gain access to non-personal data and anonymised, aggregated and manifestly-made public data described in Measure 26.2.

Individuals, including journalists, affiliated with qualified institutions pursuing scientific or public interest research topics are able to apply for access to these tools through a partner with deep expertise in secure data sharing for research: the University of Michigan’s Inter-university Consortium for Political and Social Research (ICPSR). 

Starting in December 2025, Meta launched a partnership with the Secure Data Access Center (CASD, Le Centre d’Accès Sécurisé aux Données) to review Meta Content Library applications. Note that ICPSR no longer reviews Meta Content Library applications, but it continues to host the Meta Content Library API in the SOMAR Virtual Data Enclave.

In addition, researchers are now able to choose between accessing the Meta Content Library API on the SOMAR Virtual Data Enclave or on the Meta Secure Research Environment (formerly known as Researcher Platform). 

For more details on the application process - see here

SLI 26.2.1

Relevant Signatories will provide meaningful metrics on the uptake, swiftness, and acceptance level of the tools and processes in Measure 26.2, such as: Number of monthly users (or users over a sample representative timeframe), Number of applications received, rejected, and accepted (over a reporting period or a sample representative timeframe), Average response time (over a reporting period or a sample representative timeframe).

As of 31 December 2025, over 1,200 researchers globally had access to the Meta Content Library user interface and/or programmatic API. 

Country No of monthly users No of applications received No of applications rejected No of applications accepted Average response time Other metrics
Austria 0 0 0 0 0 0
Belgium 0 0 0 0 0 0
Bulgaria 0 0 0 0 0 0
Croatia 0 0 0 0 0 0
Cyprus 0 0 0 0 0 0
Czech Republic 0 0 0 0 0 0
Denmark 0 0 0 0 0 0
Estonia 0 0 0 0 0 0
Finland 0 0 0 0 0 0
France 0 0 0 0 0 0
Germany 0 0 0 0 0 0
Greece 0 0 0 0 0 0
Hungary 0 0 0 0 0 0
Ireland 0 0 0 0 0 0
Italy 0 0 0 0 0 0
Latvia 0 0 0 0 0 0
Lithuania 0 0 0 0 0 0
Luxembourg 0 0 0 0 0 0
Malta 0 0 0 0 0 0
Netherlands 0 0 0 0 0 0
Poland 0 0 0 0 0 0
Portugal 0 0 0 0 0 0
Romania 0 0 0 0 0 0
Slovakia 0 0 0 0 0 0
Slovenia 0 0 0 0 0 0
Spain 0 0 0 0 0 0
Sweden 0 0 0 0 0 0
Iceland 0 0 0 0 0 0
Liechtenstein 0 0 0 0 0 0
Norway 0 0 0 0 0 0

Measure 26.3

Relevant Signatories will implement procedures for reporting the malfunctioning of access systems and for restoring access and repairing faulty functionalities in a reasonable time.

Facebook

QRE 26.3.1

Relevant Signatories will describe the reporting procedures in place to comply with Measure 26.3 and provide information about their malfunction response procedure, as well as about malfunctions that would have prevented the use of the systems described above during the reporting period and how long it took to remediate them.

We provide comprehensive developer documentation and in-depth technical guides that walk through how to use the different tools directly on our website, which also includes a dedicated help centre.  

Commitment 28

COOPERATION WITH RESEARCHERS Relevant Signatories commit to support good faith research into Disinformation that involves their services.

We signed up to the following measures of this commitment

Measure 28.1 Measure 28.2 Measure 28.3 Measure 28.4

In line with this commitment, did you deploy new implementation measures (e.g. changes to your terms of service, new tools, new policies, etc)?

Yes

If yes, list these implementation measures here

Meta continues to explore options for sharing insights with research groups on these issues, in addition to our sharing through the IO Research Archive and in our public Adversarial Threat Reports. 

As part of our ongoing efforts to enhance the Meta Content Library tool and incorporate feedback from researchers, we have introduced several improvements. We have made searching more efficient by adding exact-phrase matching and text-in-image search, and researchers can now share content producer lists with their peers, enabling quick filtering of public data from specific content producers on Facebook.

Throughout the second half of 2025, Meta has continued to release new features and improvements to the MCL, including collaborative dashboard editing, comments filtering, and new tools in the API such as snapshots and collections. In addition, data coverage has expanded to include public profiles with 100 followers or more. These enhancements have been designed to support our users and promote best practices in independent research. 

We made changes to the Meta Research Tools Terms and Conditions which include granting researchers ownership of their research outputs (Section 2(q)), subject to compliance with the terms and applicable law.

Do you plan to put further implementation measures in place in the next 6 months to substantially improve the maturity of the implementation of this commitment?

Yes

If yes, which further implementation measures do you plan to put in place in the next 6 months?

We continue to add new features and functionality to Meta Content Library, including streamlining the application process for access to the research tools. In addition, we regularly seek feedback from the research community on critical updates. By developing these tools and supporting the research community, we continue to support good-faith research. 

Measure 28.1

Relevant Signatories will ensure they have the appropriate human resources in place in order to facilitate research, and should set-up and maintain an open dialogue with researchers to keep track of the types of data that are likely to be in demand for research and to help researchers find relevant contact points in their organisations.

Facebook

QRE 28.1.1

Relevant Signatories will describe the resources and processes they deploy to facilitate research and engage with the research community, including e.g. dedicated teams, tools, help centres, programs, or events.

As mentioned in our baseline report, Meta has a team dedicated to providing academics and independent researchers with the tools and data they need to study Meta’s impact on the world.

Relevant details about research tools are available on our Transparency Centre.

Measure 28.2

Relevant Signatories will be transparent on the data types they currently make available to researchers across Europe.

Facebook

QRE 28.2.1

Relevant Signatories will describe what data types European researchers can currently access via their APIs or via dedicated teams, tools, help centres, programs, or events.

As mentioned in our baseline report, Meta provides a variety of data sets and tools for researchers, who can consult a chart to verify whether the data is available for request. All data access opportunities for independent researchers are logged in one place.

The main data sets available to researchers are: 
  • Meta Content Library and API. The Library includes data from certain public profiles, public posts, Pages, groups, and events on Facebook. Data from the Library can be searched, explored, and filtered on a graphical user interface or through a programmatic API. Over 1,200 researchers globally now have access to the Meta Content Library user interface and/or programmatic API. 
  • Ad Targeting dataset, which includes detailed targeting information for social issue, electoral, and political ads that have run globally since August 2020. Over 200 researchers globally have accessed the Ad Targeting dataset since it launched publicly in September 2022.
  • URL Shares Data Set, which includes differentially private individual-level counts of the number of people who viewed, clicked, liked, commented on, shared, or reacted to any URL on Facebook between January 2017 and September 2022. Counts are aggregated at the level of country, year-month, age bracket, and gender. Access to the URL Shares Data Set is granted by Social Science One, and new researchers are onboarded once per quarter. Over 200 researchers globally have accessed the URL Shares dataset since its release in February 2020.
  • Influence Operations Research Archive for coordinated inauthentic behaviour (CIB) network disruptions, as outlined in QRE 27.4.1.
  • AI for Good, which provides a range of maps that make our data easier to understand.

Measure 28.3

Relevant Signatories will not prohibit or discourage genuinely and demonstratively public interest good faith research into Disinformation on their platforms, and will not take adversarial action against researcher users or accounts that undertake or participate in good-faith research into Disinformation.

Facebook

QRE 28.3.1

Relevant Signatories will collaborate with EDMO to run an annual consultation of European researchers to assess whether they have experienced adversarial actions or are otherwise prohibited or discouraged to run such research.

Our engagement with researchers and EDMO stakeholders on the MCL + API included two main events in Berlin. First, we hosted a data dialogue specifically to solicit feedback on the MCL + API, where we invited several EDMO stakeholders. Second, we engaged with them further during the DSA Access Days conference in September 2025.

Measure 28.4

As part of the cooperation framework between the Signatories and the European research community, relevant Signatories will, with the assistance of the EDMO, make funds available for research on Disinformation, for researchers to independently manage and to define scientific priorities and transparent allocation procedures based on scientific merit.

Facebook

QRE 28.4.1

Relevant Signatories will disclose the resources made available for the purposes of Measure 28.4 and procedures put in place to ensure the resources are independently managed.

No reporting possible at this stage 

Empowering fact-checkers

Commitment 30

Relevant Signatories commit to establish a framework for transparent, structured, open, financially sustainable, and non-discriminatory cooperation between them and the EU fact-checking community regarding resources and support made available to fact-checkers.

We signed up to the following measures of this commitment

Measure 30.1 Measure 30.2 Measure 30.3 Measure 30.4

In line with this commitment, did you deploy new implementation measures (e.g. changes to your terms of service, new tools, new policies, etc)?

Yes

If yes, list these implementation measures here

Meta continued providing all third-party fact-checkers (3PFCs) participating in our fact-checking programs with access to the Meta Content Library (MCL). This initiative aimed to enhance the fact-checking workflow and provide users with a more comprehensive toolset.

Throughout the second half of 2025, Meta has continued to release new features and improvements to the MCL, including collaborative dashboard editing, comment filtering, filtering by account verified status, and new tools in the API such as snapshots and collections. In addition, data coverage has expanded to include public profiles with 100 followers or more. These enhancements have been designed to support our users and promote best practices in fact-checking.



Do you plan to put further implementation measures in place in the next 6 months to substantially improve the maturity of the implementation of this commitment?

No

If yes, which further implementation measures do you plan to put in place in the next 6 months?

As currently drafted, this chapter covers the current practices for Facebook and Instagram in the EU. In keeping with Meta’s public announcements on 7 January 2025, we will continue to assess the applicability of this chapter to Facebook and Instagram and we will keep under review whether it is appropriate to make alterations in light of changes in our practices, such as the deployment of Community Notes.

Measure 30.1

Relevant Signatories will set up agreements between them and independent fact-checking organisations (as defined in whereas (e)) to achieve fact-checking coverage in all Member States. These agreements should meet high ethical and professional standards and be based on transparent, open, consistent and non-discriminatory conditions and will ensure the independence of fact-checkers.

Facebook

QRE 30.1.1

Relevant Signatories will report on and explain the nature of their agreements with fact-checking organisations; their expected results; relevant quantitative information (for instance: contents fact-checked, increased coverage, changes in integration of fact-checking as depends on the agreements and to be further discussed within the Task-force); and such as relevant common standards and conditions for these agreements.

As mentioned in our baseline report, Meta’s fact-checking partners all go through a rigorous certification process with the IFCN. As a subsidiary of the journalism research organisation Poynter Institute, the IFCN is dedicated to bringing fact-checkers together worldwide.
All fact-checking partners follow IFCN’s Code of Principles, a series of commitments they must adhere to in order to promote excellence in fact-checking.

The detail of our partnership with fact-checkers (i.e., how they rate content and what actions we take as a result) is outlined in QRE 21.1.1 and here.

QRE 30.1.3

Relevant Signatories will report on resources allocated where relevant in each of their services to achieve fact-checking coverage in each Member State and to support fact-checking organisations' work to combat Disinformation online at the Member State level.

As mentioned in our baseline report, the list of fact-checkers with whom we partner across the EU is in QRE 30.1.2. 


SLI 30.1.1

Relevant Signatories will report on Member States and languages covered by agreements with the fact-checking organisations, including the total number of agreements with fact-checking organisations, per language and, where relevant, per service.

The fact-checking organisations with which we have individual agreements are listed below. Each agreement covers both Facebook and Instagram. 


Country Fact-checking organisations with which we have individual agreements
Austria AFP dpa-Faktencheck
Belgium AFP dpa-Faktencheck Knack
Bulgaria AFP FactCheck.bg
Croatia Faktograf.hr AFP
Cyprus AFP
Czech Republic AFP Demagog.cz
Denmark TjekDet
Estonia Delfi Estonia/Ekspress M
Finland AFP
France 20 Minutes AFP Les Observateurs de France 24 Les Surligneurs
Germany AFP Correctiv dpa-Faktencheck
Greece AFP Ellinika Hoaxes
Hungary AFP
Ireland TheJournal.ie
Italy Open Pagella Politica
Latvia Delfi Re:Baltica
Lithuania Delfi Patikrinta 15min
Luxembourg dpa-Faktencheck
Malta 0
Netherlands AFP dpa-Faktencheck
Poland AFP Demagog
Portugal Poligrafo Observador
Romania AFP Funky Citizens/ Factual.ro
Slovakia AFP Demagog.cz Demagog.sk
Slovenia Oštro
Spain AFP EFE Verifica Maldito Bulo Newtral
Sweden Kallkritikbyran AFP
Iceland 0
Liechtenstein 0
Norway 0

Measure 30.2

Relevant Signatories will provide fair financial contributions to the independent European fact-checking organisations for their work to combat Disinformation on their services. Those financial contributions could be in the form of individual agreements, of agreements with multiple fact-checkers or with an elected body representative of the independent European fact-checking organisations that has the mandate to conclude said agreements.

Facebook

QRE 30.2.1

Relevant Signatories will report on actions taken and general criteria used to ensure the fair financial contributions to the fact-checkers for the work done, on criteria used in those agreements to guarantee high ethical and professional standards, independence of the fact-checking organisations, as well as conditions of transparency, openness, consistency and non-discrimination.

As mentioned in our baseline report, Meta’s fact-checking partners all go through a rigorous certification process with the IFCN. All our fact-checking partners follow IFCN’s Code of Principles, a series of commitments they must adhere to in order to promote excellence in fact-checking.

From 2024, third-party fact-checkers may also be onboarded to Meta if they are certified with the European Fact-Checking Standards Network (EFCSN).


QRE 30.2.2

Relevant Signatories will engage in, and report on, regular reviews with their fact-checking partner organisations to review the nature and effectiveness of the Signatory's fact-checking programme.

As mentioned in our baseline report, Meta has a team in charge of maintaining our relationships with our fact-checking partners, understanding their feedback and improving our fact-checking program together. As part of this work, our team runs regular initiatives to collect views and feedback via conversations, surveys or other tools. 

Meta has also dedicated the necessary resources to engage with the Taskforce including on work-streams related to fact-checking. 


QRE 30.2.3

European fact-checking organisations will, directly (as Signatories to the Code) or indirectly (e.g. via polling by EDMO or an elected body representative of the independent European fact-checking organisations) report on the fairness of the individual compensations provided to them via these agreements.

QRE 30.2.3 applies to fact-checking organisations


Measure 30.3

Relevant Signatories will contribute to cross-border cooperation between fact-checkers.

Facebook

QRE 30.3.1

Relevant Signatories will report on actions taken to facilitate their cross-border collaboration with and between fact-checkers, including examples of fact-checks, languages, or Member States where such cooperation was facilitated.

As outlined in QRE 30.2.2, Meta has a team in charge of our relationships with fact-checking partners, through which we take on feedback, including on ways to support cooperation between fact-checkers.


Measure 30.4

To develop the Measures above, relevant Signatories will consult EDMO and an elected body representative of the independent European fact-checking organisations.

Facebook

QRE 30.4.1

Relevant Signatories will report, ex ante on plans to involve, and ex post on actions taken to involve, EDMO and the elected body representative of the independent European fact-checking organisations, including on the development of the framework of cooperation described in Measures 30.3 and 30.4.

As mentioned in our baseline report, Facebook is in touch with several EDMO regional hubs and looks forward to engaging with EDMO on our fact-checking efforts.

Commitment 31

Relevant Signatories commit to integrate, showcase, or otherwise consistently use fact-checkers' work in their platforms' services, processes, and contents; with full coverage of all Member States and languages.

We signed up to the following measures of this commitment

Measure 31.1 and 31.2

In line with this commitment, did you deploy new implementation measures (e.g. changes to your terms of service, new tools, new policies, etc)?

No

If yes, list these implementation measures here

There have been no updates since the last submitted report.

Do you plan to put further implementation measures in place in the next 6 months to substantially improve the maturity of the implementation of this commitment?

No

If yes, which further implementation measures do you plan to put in place in the next 6 months?

As currently drafted, this chapter covers the current practices for Facebook and Instagram in the EU. In keeping with Meta’s public announcements on 7 January 2025, we will continue to assess the applicability of this chapter to Facebook and Instagram and we will keep under review whether it is appropriate to make alterations in light of changes in our practices, such as the deployment of Community Notes.

Measure 31.1 and 31.2

31.1: Relevant Signatories that showcase User Generated Content (UGC) will integrate, showcase, or otherwise consistently use independent fact-checkers’ work in their platforms’ services, processes, and contents across all Member States and across formats relevant to the service. Relevant Signatories will collaborate with fact-checkers to that end, starting by conducting and documenting research and testing. 31.2: Relevant Signatories that integrate fact-checks in their products or processes will ensure they employ swift and efficient mechanisms such as labelling, information panels or policy enforcement to help increase the impact of fact-checks on audiences.

Facebook

QRE 31.1.1 (for Measures 31.1 and 31.2)

Relevant Signatories will report on their specific activities and initiatives related to Measures 31.1 and 31.2, including the full results and methodology applied in testing solutions to that end.

As mentioned in our baseline report, when content has been rated by fact-checkers (as outlined in detail under QRE 21.1.1), we take action to (1) label it and (2) ensure fewer people see it. We also take action against accounts that repeatedly share misinformation. The current warning in place states that accounts that repeatedly share false information may experience temporary restrictions, including having the distribution of their posts reduced.

Regarding the rating of AI-generated content: fact-checkers may rate AI-generated media under our fact-checking program policies. They often rely on AI experts, visual techniques, and metadata analysis to aid in the detection of this content.

SLI 31.1.1

Member State level reporting on use of fact-checks by service and the swift and efficient mechanisms in place to increase their impact, which may include (as depends on the service): number of fact-check articles published; reach of fact-check articles; number of content pieces reviewed by fact-checkers.

Filtered to content created on Facebook in EEA Member State countries from 01/07/2025 to 31/12/2025: 

1. Number of distinct pieces of content viewed on Facebook that were treated with a fact-checking label due to a falsity assessment by third-party fact-checkers from 01/07/2025 to 31/12/2025.
2. Number of distinct articles written by 3PFCs that were used on Facebook to apply an inform treatment to content from 01/07/2025 to 31/12/2025.*

*This metric shows the number of distinct fact-checking articles written by Meta’s 3PFC partners and utilised to label content in each EEA Member State. As articles may be used in multiple countries, and several articles may be used to label a single piece of content, the total sum of articles utilised across all Member States exceeds the number of distinct articles created in the EEA (120,000). This is expected. 
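The relationship between per-country totals and the distinct article count can be made concrete with a small worked example (the country names are real, but the article IDs and counts below are invented purely for illustration):

```python
# Why per-country sums exceed the distinct count: an article reused in
# several Member States is counted once per country in the per-country
# totals, but only once in the deduplicated EEA-wide total.
# The article IDs are invented for illustration.

usage = {
    "Austria": {"a1", "a2", "a3"},  # fact-check articles used to label content
    "Germany": {"a2", "a3", "a4"},
    "France":  {"a3", "a4", "a5"},
}

per_country_sum = sum(len(articles) for articles in usage.values())
distinct_total = len(set().union(*usage.values()))

print(per_country_sum)  # → 9 (each reuse counted once per country)
print(distinct_total)   # → 5 (each article counted once overall)
```

The same deduplication logic explains why summing the per-country article figures in the table below does not yield the EEA-wide distinct total.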

Country Content viewed on Facebook and treated with fact checks, due to a falsity assessment by third party fact checkers between 01/07/2025 to 31/12/2025: Number of Articles written by third party fact checkers to justify rating on Facebook between 01/07/2025 to 31/12/2025: Nr of content pieces reviewed by fact-checkers Other
Austria Over 490,000 Over 33,000 0 0
Belgium Over 730,000 Over 40,000 0 0
Bulgaria Over 570,000 Over 23,000 0 0
Croatia Over 370,000 Over 23,000 0 0
Cyprus Over 150,000 Over 18,000 0 0
Czech Republic Over 460,000 Over 24,000 0 0
Denmark Over 370,000 Over 25,000 0 0
Estonia Over 76,000 Over 10,000 0 0
Finland Over 170,000 Over 20,000 0 0
France Over 3,200,000 Over 60,000 0 0
Germany Over 2,700,000 Over 68,000 0 0
Greece Over 760,000 Over 32,000 0 0
Hungary Over 320,000 Over 22,000 0 0
Ireland Over 450,000 Over 32,000 0 0
Italy Over 2,900,000 Over 62,000 0 0
Latvia Over 130,000 Over 12,000 0 0
Lithuania Over 190,000 Over 16,000 0 0
Luxembourg Over 75,000 Over 15,000 0 0
Malta Over 68,000 Over 13,000 0 0
Netherlands Over 780,000 Over 43,000 0 0
Poland Over 1,400,000 Over 38,000 0 0
Portugal Over 920,000 Over 38,000 0 0
Romania Over 820,000 Over 30,000 0 0
Slovakia Over 280,000 Over 19,000 0 0
Slovenia Over 180,000 Over 16,000 0 0
Spain Over 2,500,000 Over 58,000 0 0
Sweden Over 530,000 Over 35,000 0 0
Iceland Over 37,000 Over 8,400 0 0
Liechtenstein Over 2,900 Over 1,600 0 0
Norway Over 295,000 Over 27,000 0 0

SLI 31.1.2

An estimation, through meaningful metrics, of the impact of actions taken such as, for instance, the number of pieces of content labelled on the basis of fact-check articles, or the impact of said measures on user interactions with information fact-checked as false or misleading.

1. Number of distinct pieces of content viewed on Facebook that were treated with a fact-checking label due to a falsity assessment by third-party fact-checkers from 01/07/2025 to 31/12/2025.
2. Rate of reshare non-completion among unique attempts by users to reshare content on Facebook that was treated with a fact-checking label, in EU Member State countries, from 01/07/2025 to 31/12/2025.


Country Content viewed on Facebook and treated with fact checks, due to a falsity assessment by third party fact checkers between 01/07/2025 to 31/12/2025. % of reshares attempted that were not completed on treated content - Facebook between 01/07/2025 to 31/12/2025. Other
Austria Over 490,000 51.00% 0
Belgium Over 730,000 50.60% 0
Bulgaria Over 570,000 56.70% 0
Croatia Over 370,000 56.10% 0
Cyprus Over 150,000 61.10% 0
Czech Republic Over 460,000 38.20% 0
Denmark Over 370,000 52.70% 0
Estonia Over 76,000 44.40% 0
Finland Over 170,000 43.90% 0
France Over 3,200,000 57.80% 0
Germany Over 2,700,000 49.50% 0
Greece Over 760,000 58.00% 0
Hungary Over 320,000 53.30% 0
Ireland Over 450,000 51.20% 0
Italy Over 2,900,000 55.20% 0
Latvia Over 130,000 43.30% 0
Lithuania Over 190,000 49.10% 0
Luxembourg Over 75,000 50.00% 0
Malta Over 68,000 61.10% 0
Netherlands Over 780,000 44.20% 0
Poland Over 1,400,000 49.00% 0
Portugal Over 920,000 62.40% 0
Romania Over 820,000 28.70% 0
Slovakia Over 280,000 38.80% 0
Slovenia Over 180,000 47.90% 0
Spain Over 2,500,000 60.30% 0
Sweden Over 530,000 53.20% 0
Iceland Over 37,000 55.90% 0
Liechtenstein Over 2,900 100.00% 0
Norway Over 295,000 45.00% 0

SLI 31.1.3

Signatories recognise the importance of providing context to SLIs 31.1.1 and 31.1.2 in ways that empower researchers, fact-checkers, the Commission, ERGA, and the public to understand and assess the impact of the actions taken to comply with Commitment 31. To that end, relevant Signatories commit to include baseline quantitative information that will help contextualise these SLIs. Relevant Signatories will present and discuss within the Permanent Task-force the type of baseline quantitative information they consider using for contextualisation ahead of their baseline reports.

Average of monthly active users on Facebook in the European Union from 01/07/2025 to 31/12/2025.

There have been no significant updates since the last submitted report.

Over a 6-month period ending 31 December 2025 (i.e., 1 July 2025 - 31 December 2025), there were a total of approximately 263 million average monthly active users on Facebook in the EU. For monthly active user numbers at a Member State level, please refer to our most recent Facebook DSA transparency report.

Country
Austria 0
Belgium 0
Bulgaria 0
Croatia 0
Cyprus 0
Czech Republic 0
Denmark 0
Estonia 0
Finland 0
France 0
Germany 0
Greece 0
Hungary 0
Ireland 0
Italy 0
Latvia 0
Lithuania 0
Luxembourg 0
Malta 0
Netherlands 0
Poland 0
Portugal 0
Romania 0
Slovakia 0
Slovenia 0
Spain 0
Sweden 0
Iceland 0
Liechtenstein 0
Norway 0

Commitment 32

Relevant Signatories commit to provide fact-checkers with prompt, and whenever possible automated, access to information that is pertinent to help them to maximise the quality and impact of fact-checking, as defined in a framework to be designed in coordination with EDMO and an elected body representative of the independent European fact-checking organisations.

We signed up to the following measures of this commitment

Measure 32.1 and 32.2 Measure 32.3

In line with this commitment, did you deploy new implementation measures (e.g. changes to your terms of service, new tools, new policies, etc)?

No

If yes, list these implementation measures here

As mentioned in our baseline report, fact-checkers can identify hoaxes based on their own reporting, and Meta also surfaces potential misinformation to fact-checkers using signals, such as feedback from our community or similarity detection. Our technology can detect posts that are likely to be misinformation based on various signals, including how people are responding and how fast the content is spreading. We may also send content to fact-checkers when we become aware that it may contain misinformation.

Do you plan to put further implementation measures in place in the next 6 months to substantially improve the maturity of the implementation of this commitment?

No

If yes, which further implementation measures do you plan to put in place in the next 6 months?

As currently drafted, this chapter covers the current practices for Facebook and Instagram in the EU. In keeping with Meta’s public announcements on 7 January 2025, we will continue to assess the applicability of this chapter to Facebook and Instagram and we will keep under review whether it is appropriate to make alterations in light of changes in our practices, such as the deployment of Community Notes.

Measure 32.1 and 32.2

32.1: Relevant Signatories will provide fact-checkers with information to help them quantify the impact of fact-checked content over time, such as (depending on the service) actions taken on the basis of that content, impressions, clicks or interactions. 32.2: Relevant Signatories will provide fact-checkers with information to help them quantify the impact of fact-checked content over time, such as (depending on the service) actions taken on the basis of that content, impressions, clicks, or interactions.

Facebook

QRE 32.1.1 (for Measures 32.1 and 32.2)

Relevant Signatories will provide details on the interfaces and other tools put in place to provide fact-checkers with the information referred to in Measure 31.1 and 31.2.

As mentioned in our baseline report, all of our fact-checking partners have access to a dashboard that we built in 2016 specifically for our fact-checking program. The dashboard includes a variety of content formats across Facebook, including links, videos, images and text-only posts. It also provides data points to help fact-checkers prioritise what content to review. Fact-checkers then review the content, check the facts, and rate the accuracy. This process occurs independently from Meta and may include calling sources, consulting public data, authenticating images and videos, and more. 


SLI 32.1.1 (for Measures 32.1 and 32.2)

Relevant Signatories will provide quantitative information on the use of the interfaces and other tools put in place to provide fact-checkers with the information referred to in Measures 32.1 and 32.2 (such as monthly users for instance).

See the list in QRE 30.1.2; all our third-party fact-checking partners have access to the same resources.

Country Monthly users Other
Austria 0 0
Belgium 0 0
Bulgaria 0 0
Croatia 0 0
Cyprus 0 0
Czech Republic 0 0
Denmark 0 0
Estonia 0 0
Finland 0 0
France 0 0
Germany 0 0
Greece 0 0
Hungary 0 0
Ireland 0 0
Italy 0 0
Latvia 0 0
Lithuania 0 0
Luxembourg 0 0
Malta 0 0
Netherlands 0 0
Poland 0 0
Portugal 0 0
Romania 0 0
Slovakia 0 0
Slovenia 0 0
Spain 0 0
Sweden 0 0
Iceland 0 0
Liechtenstein 0 0
Norway 0 0

Measure 32.3

Relevant Signatories will regularly exchange information between themselves and the fact-checking community, to strengthen their cooperation.

Facebook

QRE 32.3.1

Relevant Signatories will report on the channels of communications and the exchanges conducted to strengthen their cooperation - including success of and satisfaction with the information, interface, and other tools referred to in Measures 32.1 and 32.2 - and any conclusions drawn from such exchanges.

There have been no significant updates since the last submitted report.


Transparency Centre

Commitment 34

To ensure transparency and accountability around the implementation of this Code, Relevant Signatories commit to set up and maintain a publicly available common Transparency Centre website.

We signed up to the following measures of this commitment

Measure 34.1 Measure 34.2 Measure 34.3 Measure 34.4 Measure 34.5

In line with this commitment, did you deploy new implementation measures (e.g. changes to your terms of service, new tools, new policies, etc)?

Yes

If yes, list these implementation measures here

As mentioned in our baseline report, Meta (representing Facebook, Instagram, WhatsApp and Messenger) co-funded the Transparency Centre website’s development, to ensure transparency and accountability around the implementation of this Code.

Do you plan to put further implementation measures in place in the next 6 months to substantially improve the maturity of the implementation of this commitment?

No

If yes, which further implementation measures do you plan to put in place in the next 6 months?

N/A

Measure 34.1

Signatories establish and maintain the common Transparency Centre website, which will be operational and available to the public within 6 months from the signature of this Code.

Facebook

Measure 34.2

Signatories provide appropriate funding, for setting up and operating the Transparency Centre website, including its maintenance, daily operation, management, and regular updating. Funding contribution should be commensurate with the nature of the Signatories' activity and shall be sufficient for the website's operations and maintenance and proportional to each Signatories' risk profile and economic capacity.

Facebook

Measure 34.3

Relevant Signatories will contribute to the Transparency Centre's information to the extent that the Code is applicable to their services.

Facebook

Measure 34.4

Signatories will agree on the functioning and financing of the Transparency Centre within the Task-force, to be recorded and reviewed within the Task-Force on an annual basis.

Facebook

Measure 34.5

The Task-force will regularly discuss the Transparency Centre and assess whether adjustments or actions are necessary. Signatories commit to implement the actions and adjustments decided within the Task-force within a reasonable timeline.

Facebook

Commitment 35

Signatories commit to ensure that the Transparency Centre contains all the relevant information related to the implementation of the Code's Commitments and Measures and that this information is presented in an easy-to-understand manner, per service, and is easily searchable.

We signed up to the following measures of this commitment

Measure 35.1 Measure 35.2 Measure 35.3 Measure 35.4 Measure 35.5 Measure 35.6

In line with this commitment, did you deploy new implementation measures (e.g. changes to your terms of service, new tools, new policies, etc)?

Yes

If yes, list these implementation measures here

As mentioned in our baseline report, Meta (representing Facebook, Instagram, WhatsApp and Messenger) commits to upload its reports on the Transparency Centre in due course. 


Do you plan to put further implementation measures in place in the next 6 months to substantially improve the maturity of the implementation of this commitment?

No

If yes, which further implementation measures do you plan to put in place in the next 6 months?

As mentioned in our baseline report, Meta (representing Facebook, Instagram, WhatsApp and Messenger) commits to upload its reports on the Transparency Centre in due course. 

Measure 35.1

Signatories will list in the Transparency Centre, per each Commitment and Measure that they subscribe to, the terms of service and policies that their service applies to implement these Commitments and Measures.

Facebook

Measure 35.2

Signatories provide information on the implementation and enforcement of their policies per service, including geographical and language coverage.

Facebook

Measure 35.3

Signatories ensure that the Transparency Centre contains a repository of their reports assessing the implementation of the Code's commitments.

Facebook

Measure 35.4

In crisis situations, Signatories use the Transparency Centre to publish information regarding the specific mitigation actions taken related to the crisis.

Facebook

Measure 35.5

Signatories ensure that the Transparency Centre is built with state-of-the-art technology, is user-friendly, and that the relevant information is easily searchable (including per Commitment and Measure). Users of the Transparency Centre will be able to easily track changes in Signatories' policies and actions.

Facebook

Measure 35.6

The Transparency Centre will enable users to easily access and understand the Service Level Indicators and Qualitative Reporting Elements tied to each Commitment and Measure of the Code for each service, including Member State breakdowns, in a standardised and searchable way. The Transparency Centre should also enable users to easily access and understand Structural Indicators for each Signatory.

Facebook

Commitment 36

Signatories commit to updating the relevant information contained in the Transparency Centre in a timely and complete manner.

We signed up to the following measures of this commitment

Measure 36.1 Measure 36.2 Measure 36.3

In line with this commitment, did you deploy new implementation measures (e.g. changes to your terms of service, new tools, new policies, etc)?

Yes

If yes, list these implementation measures here

As mentioned in our baseline report, Meta (representing Facebook, Instagram, WhatsApp and Messenger) will both upload this report in due course and support other signatories in their efforts to upload their own reports.


Do you plan to put further implementation measures in place in the next 6 months to substantially improve the maturity of the implementation of this commitment?

Yes

If yes, which further implementation measures do you plan to put in place in the next 6 months?

As mentioned in our baseline report, Meta (representing Facebook, Instagram, WhatsApp and Messenger) will upload all future reports in due course.

Measure 36.1

Signatories provide updates about relevant changes in policies and implementation actions in a timely manner, and in any event no later than 30 days after changes are announced or implemented.

Facebook

Measure 36.2

Signatories will regularly update Service Level Indicators, reporting elements, and Structural Indicators, in parallel with the regular reporting foreseen by the monitoring framework. After the first reporting period, Relevant Signatories are encouraged to also update the Transparency Centre more regularly.

Facebook

Measure 36.3

Signatories will update the Transparency Centre to reflect the latest decisions of the Permanent Task-force, regarding the Code and the monitoring framework.

Facebook

QRE 36.1.1 (for the Commitments 34-36)

With their initial implementation report, Signatories will outline the state of development of the Transparency Centre, its functionalities, the information it contains, and any other relevant information about its functioning or operations. This information can be drafted jointly by Signatories involved in operating or adding content to the Transparency Centre.

We continue to upload our reports in accordance with the approved deadlines.

QRE 36.1.2 (for the Commitments 34-36)

Signatories will outline changes to the Transparency Centre's content, operations, or functioning in their reports over time. Such updates can be drafted jointly by Signatories involved in operating or adding content to the Transparency Centre.

The administration of the Transparency Centre website has been transferred fully to the community of the Code’s signatories, with VOST Europe taking the role of developer.

SLI 36.1.1 (for the Commitments 34-36)

Signatories will provide meaningful quantitative information on the usage of the Transparency Centre, such as the average monthly visits of the webpage.

In the period from 01/07/2025 to 31/12/2025, our signatory profile was visited 1,580 times, and our signatory reports were downloaded 9,941 times. The Transparency Centre webpage overall was visited 30,384 times.

Country
Austria 0
Belgium 0
Bulgaria 0
Croatia 0
Cyprus 0
Czech Republic 0
Denmark 0
Estonia 0
Finland 0
France 0
Germany 0
Greece 0
Hungary 0
Ireland 0
Italy 0
Latvia 0
Lithuania 0
Luxembourg 0
Malta 0
Netherlands 0
Poland 0
Portugal 0
Romania 0
Slovakia 0
Slovenia 0
Spain 0
Sweden 0
Iceland 0
Liechtenstein 0
Norway 0

Permanent Task-Force

Commitment 37

Signatories commit to participate in the permanent Task-force. The Task-force includes the Signatories of the Code and representatives from EDMO and ERGA. It is chaired by the European Commission, and includes representatives of the European External Action Service (EEAS). The Task-force can also invite relevant experts as observers to support its work. Decisions of the Task-force are made by consensus.

We signed up to the following measures of this commitment

Measure 37.1 Measure 37.2 Measure 37.3 Measure 37.4 Measure 37.5 Measure 37.6

In line with this commitment, did you deploy new implementation measures (e.g. changes to your terms of service, new tools, new policies, etc)?

No

If yes, list these implementation measures here

There have been no significant updates since the last submitted report.

Do you plan to put further implementation measures in place in the next 6 months to substantially improve the maturity of the implementation of this commitment?

No

If yes, which further implementation measures do you plan to put in place in the next 6 months?

N/A

Measure 37.1

Signatories will participate in the Task-force and contribute to its work. Signatories, in particular smaller or emerging services will contribute to the work of the Task-force proportionate to their resources, size and risk profile. Smaller or emerging services can also agree to pool their resources together and represent each other in the Task-force. The Task-force will meet in plenary sessions as necessary and at least every 6 months, and, where relevant, in subgroups dedicated to specific issues or workstreams.

Facebook

Measure 37.2

Signatories agree to work in the Task-force in particular – but not limited to – on the following tasks: Establishing a risk assessment methodology and a rapid response system to be used in special situations like elections or crises; Cooperate and coordinate their work in special situations like elections or crisis; Agree on the harmonised reporting templates for the implementation of the Code's Commitments and Measures, the refined methodology of the reporting, and the relevant data disclosure for monitoring purposes; Review the quality and effectiveness of the harmonised reporting templates, as well as the formats and methods of data disclosure for monitoring purposes, throughout future monitoring cycles and adapt them, as needed; Contribute to the assessment of the quality and effectiveness of Service Level and Structural Indicators and the data points provided to measure these indicators, as well as their relevant adaptation; Refine, test and adjust Structural Indicators and design mechanisms to measure them at Member State level; Agree, publish and update a list of TTPs employed by malicious actors, and set down baseline elements, objectives and benchmarks for Measures to counter them, in line with the Chapter IV of this Code.

Facebook

Measure 37.3

The Task-force will agree on and define its operating rules, including on the involvement of third-party experts, which will be laid down in a Vademecum drafted by the European Commission in collaboration with the Signatories and agreed on by consensus between the members of the Task-force.

Facebook

Measure 37.4

Signatories agree to set up subgroups dedicated to the specific issues related to the implementation and revision of the Code with the participation of the relevant Signatories.

Facebook

Measure 37.5

When needed, and in any event at least once per year the Task-force organises meetings with relevant stakeholder groups and experts to inform them about the operation of the Code and gather their views related to important developments in the field of Disinformation.

Facebook

Measure 37.6

Signatories agree to notify the rest of the Task-force when a Commitment or Measure would benefit from changes over time as their practices and approaches evolve, in view of technological, societal, market, and legislative developments. Having discussed the changes required, the Relevant Signatories will update their subscription document accordingly and report on the changes in their next report.

Facebook

QRE 37.6.1

Signatories will describe how they engage in the work of the Task-force in the reporting period, including the sub-groups they engaged with.

There have been no significant updates since the last submitted report.


Monitoring of the Code

Commitment 38

The Signatories commit to dedicate adequate financial and human resources and put in place appropriate internal processes to ensure the implementation of their commitments under the Code.

We signed up to the following measures of this commitment

Measure 38.1

In line with this commitment, did you deploy new implementation measures (e.g. changes to your terms of service, new tools, new policies, etc)?

No

If yes, list these implementation measures here

N/A

Do you plan to put further implementation measures in place in the next 6 months to substantially improve the maturity of the implementation of this commitment?

No

If yes, which further implementation measures do you plan to put in place in the next 6 months?

As mentioned in our baseline report, our policies benefit from our experience and expertise. 

Measure 38.1

Relevant Signatories will outline the teams and internal processes they have in place, per service, to comply with the Code in order to achieve full coverage across the Member States and the languages of the EU.

Facebook

QRE 38.1.1

Relevant Signatories will outline the teams and internal processes they have in place, per service, to comply with the Code in order to achieve full coverage across the Member States and the languages of the EU.

We invest in combating the spread of harmful content, including misinformation and disinformation, in support of our implementation of the Code.
Teams with expertise in content moderation, operations, policy design, safety, market-specific knowledge, data and forensic analysis, stakeholder and partner engagement, threat investigation, cybersecurity and product development all work on these challenges. These teams are distributed globally and draw on the local expertise of their members and local partners.


Commitment 39

Signatories commit to provide to the European Commission, within 1 month after the end of the implementation period (6 months after this Code’s signature) the baseline reports as set out in the Preamble.

We signed up to the following measures of this commitment

In line with this commitment, did you deploy new implementation measures (e.g. changes to your terms of service, new tools, new policies, etc)?

Yes

If yes, list these implementation measures here

This report was submitted within the required timeline.

Do you plan to put further implementation measures in place in the next 6 months to substantially improve the maturity of the implementation of this commitment?

No

If yes, which further implementation measures do you plan to put in place in the next 6 months?

This report was submitted within the required timeline.

Commitment 40

Signatories commit to provide regular reporting on Service Level Indicators (SLIs) and Qualitative Reporting Elements (QREs). The reports and data provided should allow for a thorough assessment of the extent of the implementation of the Code’s Commitments and Measures by each Signatory, service and at Member State level.

We signed up to the following measures of this commitment

Measure 40.1 Measure 40.2 Measure 40.3 Measure 40.4 Measure 40.5 Measure 40.6

In line with this commitment, did you deploy new implementation measures (e.g. changes to your terms of service, new tools, new policies, etc)?

Yes

If yes, list these implementation measures here

For this report, Facebook, Instagram, WhatsApp and Messenger provided QREs and SLIs across the different chapters.

Do you plan to put further implementation measures in place in the next 6 months to substantially improve the maturity of the implementation of this commitment?

Yes

If yes, which further implementation measures do you plan to put in place in the next 6 months?

As mentioned in our baseline report, Facebook, Instagram, WhatsApp and Messenger will continue to provide relevant QREs and SLIs across the chapters of this Code.

Measure 40.1

Relevant Signatories that are Very Large Online Platforms, as defined in the DSA, will report every six months on the implementation of the Commitments and Measures they signed up to under the Code, including on the relevant QREs and SLIs at service and Member State level.

Facebook

Measure 40.2

Other Signatories will report yearly on the implementation of the Commitments and Measures taken under the present Code, including on the relevant QREs and SLIs, at service and Member State level.

Facebook

Measure 40.3

Facebook

Measure 40.4

Facebook

Measure 40.5

Facebook

Measure 40.6

Facebook

Commitment 41

Signatories commit to work within the Task-force towards developing Structural Indicators, and publish a first set of them within 9 months from the signature of this Code; and to publish an initial measurement alongside their first full report.

We signed up to the following measures of this commitment

Measure 41.1 Measure 41.2 Measure 41.3

In line with this commitment, did you deploy new implementation measures (e.g. changes to your terms of service, new tools, new policies, etc)?

Yes

If yes, list these implementation measures here

We continue to engage with the Taskforce Monitoring Working Group. 

Do you plan to put further implementation measures in place in the next 6 months to substantially improve the maturity of the implementation of this commitment?

Yes

If yes, which further implementation measures do you plan to put in place in the next 6 months?

We continue to engage with the Taskforce Monitoring Working Group.

Measure 41.1

Within 1 month of signing the Code, Signatories will establish a Working Group to tackle this objective. This working group will be tasked with putting forward data points to be provided by Platform Signatories, and a methodology to measure Structural Indicators on the base of these data points, to be executed by non-Platform Signatories.

Facebook

Measure 41.2

The Working Group will report on its progress to the Task-force on a trimestral basis. It will consult with expert stakeholders including but not limited to EDMO, ERGA, and researchers to inform its work and outputs.

Facebook

Measure 41.3

Facebook

Commitment 42

Relevant Signatories commit to provide, in special situations like elections or crisis, upon request of the European Commission, proportionate and appropriate information and data, including ad-hoc specific reports and specific chapters within the regular monitoring, in accordance with the rapid response system established by the Task-force.

We signed up to the following measures of this commitment

In line with this commitment, did you deploy new implementation measures (e.g. changes to your terms of service, new tools, new policies, etc)?

Yes

If yes, list these implementation measures here

We continue to engage in the Taskforce’s election monitoring and crisis monitoring meetings.

Do you plan to put further implementation measures in place in the next 6 months to substantially improve the maturity of the implementation of this commitment?

Yes

If yes, which further implementation measures do you plan to put in place in the next 6 months?

We continue to engage in the Taskforce’s election monitoring and crisis monitoring meetings.

Commitment 43

Relevant Signatories commit to provide, in special situations like elections or crisis, upon request of the European Commission, proportionate and appropriate information and data, including ad-hoc specific reports and specific chapters within the regular monitoring, in accordance with the rapid response system established by the Taskforce.

We signed up to the following measures of this commitment

In line with this commitment, did you deploy new implementation measures (e.g. changes to your terms of service, new tools, new policies, etc)?

Yes

If yes, list these implementation measures here

Facebook, Instagram, WhatsApp and Messenger provided their qualitative and quantitative information in the harmonised template provided.

Do you plan to put further implementation measures in place in the next 6 months to substantially improve the maturity of the implementation of this commitment?

Yes

If yes, which further implementation measures do you plan to put in place in the next 6 months?

Facebook, Instagram, WhatsApp and Messenger continue to engage with the Taskforce working group on reporting/monitoring as the template evolves.

Commitment 44

Relevant Signatories commit to provide, in special situations like elections or crisis, upon request of the European Commission, proportionate and appropriate information and data, including ad-hoc specific reports and specific chapters within the regular monitoring, in accordance with the rapid response system established by the Taskforce.

We signed up to the following measures of this commitment

In line with this commitment, did you deploy new implementation measures (e.g. changes to your terms of service, new tools, new policies, etc)?

Yes

If yes, list these implementation measures here

As mentioned in our baseline report, we are taking steps to ensure that, following conversion of the Code into a Code of Conduct under the DSA, relevant Meta services will undergo appropriate independent audits under the DSA.

Do you plan to put further implementation measures in place in the next 6 months to substantially improve the maturity of the implementation of this commitment?

No

If yes, which further implementation measures do you plan to put in place in the next 6 months?

As mentioned in our baseline report, we are taking steps to ensure that, following conversion of the Code into a Code of Conduct under the DSA, relevant Meta services will undergo appropriate independent audits.