NewsGuard

Report March 2026

Submitted

Your organisation description

Advertising

Commitment 1

Relevant signatories participating in ad placements commit to defund the dissemination of disinformation, and improve the policies and systems which determine the eligibility of content to be monetised, the controls for monetisation and ad placement, and the data to report on the accuracy and effectiveness of controls and services around ad placements.

We signed up to the following measures of this commitment

Measure 1.6

In line with this commitment, did you deploy new implementation measures (e.g. changes to your terms of service, new tools, new policies, etc)?

No

If yes, list these implementation measures here

Do you plan to put further implementation measures in place in the next 6 months to substantially improve the maturity of the implementation of this commitment?

No

If yes, which further implementation measures do you plan to put in place in the next 6 months?

Empowering Users

Commitment 17

In light of the European Commission's initiatives in the area of media literacy, including the new Digital Education Action Plan, Relevant Signatories commit to continue and strengthen their efforts in the area of media literacy and critical thinking, also with the aim to include vulnerable groups.

We signed up to the following measures of this commitment

Measure 17.2 Measure 17.3

In line with this commitment, did you deploy new implementation measures (e.g. changes to your terms of service, new tools, new policies, etc)?

No

If yes, list these implementation measures here

Do you plan to put further implementation measures in place in the next 6 months to substantially improve the maturity of the implementation of this commitment?

No

If yes, which further implementation measures do you plan to put in place in the next 6 months?

Measure 17.2

Relevant Signatories will develop, promote and/or support or continue to run activities to improve media literacy and critical thinking such as campaigns to raise awareness about Disinformation, as well as the TTPs that are being used by malicious actors, among the general public across the European Union, also considering the involvement of vulnerable communities.

QRE 17.2.1

Relevant Signatories will describe the activities they launch or support and the Member States they target and reach. Relevant signatories will further report on actions taken to promote the campaigns to their user base per Member States targeted.

In 2025, NewsGuard participated in numerous media literacy events with journalists, librarians, teachers and citizens on topics ranging from how AI is being used in disinformation campaigns to spotting unreliable sources. The events took place in several Member States: France, Italy, and Germany.

Such events have included lessons on AI-powered false claims and disinformation for students at Luiss University, LUMSA University, Salerno University, and La Sapienza University in Italy, HAW Hamburg (Germany), the Ecole Normale Supérieure and the ISFJ journalism school (France) and webinars with school teachers and students in Italy.

Throughout the year, NewsGuard was regularly involved in initiatives led by IDMO, the Italian Digital Media Observatory, of which NewsGuard is a member.

Our editors have also spoken at conferences to raise awareness of specific issues related to mis- and disinformation in several Member States, including Belgium, France, Germany, Ireland, Italy, Spain and Norway. These events included a panel titled “Reinforcing the reliability of information and journalism” organized by the Swiss presidency of the Council of Europe; a keynote on the risks of generative AI at Italy’s AI Week; a keynote at the Euro-Mediterranean Youth Summit organized by the European Youth Parliament in Malaga; and a presentation, “Weaponisation of AI to Create and Spread Disinformation,” at the yearly JRC DISINFO “Defend European Democracy” workshop.

In 2025, NewsGuard also continued providing its browser extension for free to approximately 270 public libraries in Italy, France, Germany and Slovenia.

SLI 17.2.1

Relevant Signatories report on the number of media literacy and awareness raising activities organised and/or participated in and will share quantitative information pertinent to show the effects of the campaigns they build or support at the Member State level.

In 2025, NewsGuard participated in 21 media literacy seminars and awareness raising events in France, Italy, and Germany. These events reached a total number of approximately 892 participants, including educators and librarians who in turn could reach hundreds of students and library users. NewsGuard also participated in 41 speaking engagements in Italy, France, Belgium, Ireland, Portugal, and Norway, reaching more than 4,120 attendees.

Country | Nr of media literacy/awareness raising activities organised/participated in | Reach of campaigns | Nr of participants | Nr of interactions with online assets | Nr of participants (etc)
Austria | 0 | 0 | 0 | 0 | 0
Belgium | 4 | 0 | 260 | 0 | 0
Bulgaria | 0 | 0 | 0 | 0 | 0
Croatia | 0 | 0 | 0 | 0 | 0
Cyprus | 0 | 0 | 0 | 0 | 0
Czech Republic | 0 | 0 | 0 | 0 | 0
Denmark | 0 | 0 | 0 | 0 | 0
Estonia | 0 | 0 | 0 | 0 | 0
Finland | 0 | 0 | 0 | 0 | 0
France | 13 | 0 | 1145 | 0 | 0
Germany | 1 | 0 | 50 | 0 | 0
Greece | 0 | 0 | 0 | 0 | 0
Hungary | 0 | 0 | 0 | 0 | 0
Ireland | 2 | 0 | 130 | 0 | 0
Italy | 38 | 0 | 3219 | 0 | 0
Latvia | 0 | 0 | 0 | 0 | 0
Lithuania | 0 | 0 | 0 | 0 | 0
Luxembourg | 0 | 0 | 0 | 0 | 0
Malta | 0 | 0 | 0 | 0 | 0
Netherlands | 0 | 0 | 0 | 0 | 0
Poland | 0 | 0 | 0 | 0 | 0
Portugal | 0 | 0 | 0 | 0 | 0
Romania | 0 | 0 | 0 | 0 | 0
Slovakia | 0 | 0 | 0 | 0 | 0
Slovenia | 0 | 0 | 0 | 0 | 0
Spain | 2 | 0 | 120 | 0 | 0
Sweden | 0 | 0 | 0 | 0 | 0
Iceland | 0 | 0 | 0 | 0 | 0
Liechtenstein | 0 | 0 | 0 | 0 | 0
Norway | 1 | 0 | 60 | 0 | 0

Measure 17.3

For both of the above Measures, and in order to build on the expertise of media literacy experts in the design, implementation, and impact measurement of tools, relevant Signatories will partner or consult with media literacy experts in the EU, including for instance the Commission's Media Literacy Expert Group, ERGA's Media Literacy Action Group, EDMO, its country-specific branches, or relevant Member State universities or organisations that have relevant expertise.

QRE 17.3.1

Relevant Signatories will describe how they involved and partnered with media literacy experts for the purposes of all Measures in this Commitment.

Through the Italian Digital Media Observatory’s portal, NewsGuard regularly makes its content and analyses on disinformation in Italy and in Europe public, contributing to the consortium’s media literacy efforts. NewsGuard has various partnerships and collaborations with research institutions and universities that study disinformation, such as La Sapienza University in Rome, Ca’ Foscari University in Venice, Carlo Bo University in Urbino, University of Salerno, the European University Institute in Florence, the Italian National Research Council, IMT School in Lucca, the Universität der Bundeswehr München and the Max-Planck-Institute in Germany, as well as the French National Research Institute for Digital Science and Technology (Inria).

A 2024 study published in the “Journal of Quantitative Description: Digital Media” examined the use of NewsGuard’s Reliability Ratings in academic research. The paper—authored by researchers from TU Graz, the University of Vienna, the Medical University of Vienna, and RWTH Aachen University—concluded that NewsGuard has become the most widely used and comprehensive dataset in this space, using a rigorous and transparent methodology, without exhibiting any political bias.

Commitment 22

Relevant Signatories commit to provide users with tools to help them make more informed decisions when they encounter online information that may be false or misleading, and to facilitate user access to tools and information to assess the trustworthiness of information sources, such as indicators of trustworthiness for informed online navigation, particularly relating to societal issues or debates of general interest.

We signed up to the following measures of this commitment

Measure 22.4 Measure 22.5

In line with this commitment, did you deploy new implementation measures (e.g. changes to your terms of service, new tools, new policies, etc)?

No

If yes, list these implementation measures here

Do you plan to put further implementation measures in place in the next 6 months to substantially improve the maturity of the implementation of this commitment?

No

If yes, which further implementation measures do you plan to put in place in the next 6 months?

Empowering Researchers

Commitment 29

Relevant Signatories commit to conduct research based on transparent methodology and ethical standards, as well as to share datasets, research findings and methodologies with relevant audiences.

We signed up to the following measures of this commitment

Measure 29.1

In line with this commitment, did you deploy new implementation measures (e.g. changes to your terms of service, new tools, new policies, etc)?

No

If yes, list these implementation measures here

Do you plan to put further implementation measures in place in the next 6 months to substantially improve the maturity of the implementation of this commitment?

No

If yes, which further implementation measures do you plan to put in place in the next 6 months?

Permanent Task-Force

Commitment 37

Signatories commit to participate in the permanent Task-force. The Task-force includes the Signatories of the Code and representatives from EDMO and ERGA. It is chaired by the European Commission, and includes representatives of the European External Action Service (EEAS). The Task-force can also invite relevant experts as observers to support its work. Decisions of the Task-force are made by consensus.

We signed up to the following measures of this commitment

Measure 37.6

In line with this commitment, did you deploy new implementation measures (e.g. changes to your terms of service, new tools, new policies, etc)?

No

If yes, list these implementation measures here

Do you plan to put further implementation measures in place in the next 6 months to substantially improve the maturity of the implementation of this commitment?

No

If yes, which further implementation measures do you plan to put in place in the next 6 months?

Measure 37.6

Signatories agree to notify the rest of the Task-force when a Commitment or Measure would benefit from changes over time as their practices and approaches evolve, in view of technological, societal, market, and legislative developments. Having discussed the changes required, the Relevant Signatories will update their subscription document accordingly and report on the changes in their next report.

QRE 37.6.1

Signatories will describe how they engage in the work of the Task-force in the reporting period, including the sub-groups they engaged with.

NewsGuard has been regularly participating in the meetings of the Code of Practice signatories.

Monitoring of the Code

Commitment 38

The Signatories commit to dedicate adequate financial and human resources and put in place appropriate internal processes to ensure the implementation of their commitments under the Code.

We signed up to the following measures of this commitment

Measure 38.1

In line with this commitment, did you deploy new implementation measures (e.g. changes to your terms of service, new tools, new policies, etc)?

No

If yes, list these implementation measures here

Do you plan to put further implementation measures in place in the next 6 months to substantially improve the maturity of the implementation of this commitment?

No

If yes, which further implementation measures do you plan to put in place in the next 6 months?

Measure 38.1

Relevant Signatories will outline the teams and internal processes they have in place, per service, to comply with the Code in order to achieve full coverage across the Member States and the languages of the EU.

QRE 38.1.1

Relevant Signatories will outline the teams and internal processes they have in place, per service, to comply with the Code in order to achieve full coverage across the Member States and the languages of the EU.

Members of NewsGuard’s European team (including Roberta Schmid, Virginia Padovese and Chine Labbé, co-Managing Editors and Senior Vice-Presidents for Europe) are responsible for implementing and monitoring the company’s commitments under the Code in Germany, Austria, Italy, and France.

Commitment 39

Signatories commit to provide to the European Commission, within 1 month after the end of the implementation period (6 months after this Code’s signature) the baseline reports as set out in the Preamble.

We signed up to the following measures of this commitment

In line with this commitment, did you deploy new implementation measures (e.g. changes to your terms of service, new tools, new policies, etc)?

If yes, list these implementation measures here

Do you plan to put further implementation measures in place in the next 6 months to substantially improve the maturity of the implementation of this commitment?

If yes, which further implementation measures do you plan to put in place in the next 6 months?

Crisis and Elections Response

Elections 2025

[Note: Signatories are requested to provide information relevant to their particular response to the threats and challenges they observed on their service(s). They ensure that the information below provides an accurate and complete report of their relevant actions. As operational responses to crisis/election situations can vary from service to service, an absence of information should not be considered a priori a shortfall in the way a particular service has responded. Impact metrics are accurate to the best of signatories’ abilities to measure them].

Threats observed or anticipated

NewsGuard launched its 2025 German Elections Misinformation Tracking Center on February 24, 2025, to address the barrage of false claims targeting the German snap elections. NewsGuard identified 22 false claims related to the vote, including disinformation spread by Russian actors and targeting mainstream political parties that support NATO and Ukraine. These false claims were aimed at boosting the far-right party Alternative für Deutschland (AfD), but also risked undermining citizens’ overall trust in government institutions and in the electoral process. NewsGuard also found that bad actors were increasingly employing AI tools, including deepfake technology, to generate convincing false narratives with fabricated testimonies and manipulated videos. John Mark Dougan, a U.S. national turned Russian disinformation operative, was notably responsible for launching and operating 102 fake local news websites powered by AI and designed to mimic independent local news sites in Germany, with names such as “Berliner Tageblatt” and “Hamburger Anzeiger.”

Mitigations in place

N/A

Scrutiny of Ads Placements

Outline approaches pertinent to this chapter, highlighting similarities/commonalities and differences with regular enforcement.

During several months before and after the February 2025 German snap elections, NewsGuard monitored and added to its database new unreliable websites spreading false claims about the German elections and their candidates. In doing so, NewsGuard continued using its transparent and apolitical evaluation process, whose methodology is detailed on its website, with all criteria clearly explained to publishers. Tracking these websites offered advertisers a way to avoid inadvertently supporting, through programmatic advertising, websites spreading false claims about the vote.

Empowering Users

Outline approaches pertinent to this chapter, highlighting similarities/commonalities and differences with regular enforcement.

In 2025, NewsGuard ramped up efforts to identify, rate, and monitor sources of election meddling in Europe, particularly for the German snap elections, but also ahead of the Armenian and Moldovan legislative elections. NewsGuard constantly added new sources of election misinformation to its database, rating them according to its transparent rating system, so that users with access to its browser extension (a consumer product available to all for a monthly subscription fee) could make informed decisions about which sources to trust, and which to be wary of. NewsGuard’s global team of information integrity experts identified 22 myths about the elections spreading across social media, as well as 102 fake local websites pushing them, originating from a Russian disinformation operation. NewsGuard shared its most salient findings with the general public through its public newsletter and reports.

In a non-crisis situation, NewsGuard’s main editorial promise is to rate all news and information sites that account for 95% of online engagement with news. However, for this specific line of work, just as for every crisis situation, and as it did before for the COVID-19 pandemic and the Russia-Ukraine war, NewsGuard’s analysts went further, looking for any site spreading false claims and disinformation about the elections in the languages we cover (English, French, Italian, and German), even those responsible for very little online engagement, and making sure we rated them. We also made sure to track all sources that spread the myths we were uncovering, in order to cover more sources.

Empowering the Research Community

Outline approaches pertinent to this chapter, highlighting similarities/commonalities and differences with regular enforcement.

In 2025, NewsGuard’s analysts participated in 21 media literacy seminars and awareness raising events in France, Italy, and Germany, and also participated in 41 speaking engagements in Italy, France, Belgium, Ireland, and Norway. Most touched on all relevant crises, including the February 2025 German snap elections.

As stated above, throughout the year, NewsGuard’s analysts fed its browser extension with transparent analyses of websites spreading false claims about the vote, and debunked all related myths through its tracking center. The analysts based their ratings, as they always do, on NewsGuard’s transparent, apolitical and independent process, applying our nine criteria equally to all sources.

Crisis 2025

[Note: Signatories are requested to provide information relevant to their particular response to the threats and challenges they observed on their service(s). They ensure that the information below provides an accurate and complete report of their relevant actions. As operational responses to crisis/election situations can vary from service to service, an absence of information should not be considered a priori a shortfall in the way a particular service has responded. Impact metrics are accurate to the best of signatories’ abilities to measure them].

Threats observed or anticipated

1. RUSSIA - UKRAINE CRISIS
Threats observed or anticipated at time of reporting: In 2025, NewsGuard continued updating its “Russia-Ukraine Disinformation Tracking Center,” launched in March 2022, immediately after Russia started its full-scale invasion. Through our constant monitoring of Russian disinformation in languages including Russian, English, French, Italian, and German across different platforms and websites, we reported on how state-affiliated actors were pushing false narratives about Ukraine, but also sowing division, nurturing anti-war and war-fatigue sentiments across Member States, and playing up European fears and dissent. As of December 2025, NewsGuard’s Russia-Ukraine Disinformation Tracking Center had identified 400 false narratives about the war, spread by 561 websites around the world, including in Italy, France, Germany and Austria, up from 280 false claims at the end of 2024.

2. ISRAEL-IRAN WAR TRACKING CENTRE 2025
NewsGuard launched its 2025 Israel-Iran War Misinformation Tracking Center on June 13, 2025, hours after Israel launched attacks against Tehran’s nuclear sites and military leadership. Iranian state-controlled and affiliated media sources immediately began spreading false claims attempting to portray Israel’s attack as a failure and Iran’s retaliation as a success. NewsGuard’s global team of analysts identified 26 false claims about this war spreading across social media, and 78 websites advancing them, from AI-generated images purporting to show mass destruction in Tel Aviv to false claims about the supposed capturing of Israeli pilots and other personnel. The sources spreading these claims included Iranian military-affiliated Telegram channels as well as official Iranian state media sources operating under the Islamic Republic of Iran Broadcasting (IRIB), an Iranian state-owned corporation sanctioned by the U.S. Treasury Department.

3. RISE OF AI-GENERATED CONTENT AND FOREIGN INFLUENCE OF LARGE LANGUAGE MODELS
In 2025, NewsGuard continued to regularly publish AI False Claims Monitors, which measure the propensity of leading AI chatbots (such as ChatGPT, Gemini and Mistral) to produce false information when prompted with untrue claims and false narratives about the news, including state-sponsored narratives. Using a journalistic method grounded in rigorously verified data and human expertise, these Monitors measure the trustworthiness of commercial AI tools in relation to the news. NewsGuard analysts identify vulnerabilities in AI systems that result in the spread of false information, allowing developers to strengthen their models and improve their safeguards as usage of the technology increases around the world.

Mitigations in place

N/A

Scrutiny of Ads Placements

Outline approaches pertinent to this chapter, highlighting similarities/commonalities and differences with regular enforcement.

1. RUSSIA - UKRAINE CRISIS
Throughout the year, NewsGuard monitored and added to its database new detailed Reliability Ratings of websites spreading Russian disinformation. In February 2025, NewsGuard introduced flags in its False Claim Fingerprints database to specify whether a myth originated or spread on Russian state-controlled or influenced sources.

NewsGuard also continued to update the “Russia-Ukraine War” metadata field accompanying its Reliability Ratings, to allow brands and advertisers using its BrandGuard services to easily identify these sites and make sure their ad money does not support the Kremlin disinformation machine. In doing so, NewsGuard continued using its transparent and apolitical evaluation process, whose methodology is detailed on its website, with all criteria clearly explained to publishers. NewsGuard also made sure that news publishers being flagged for spreading Russia-Ukraine disinformation were aware of it, and given a right to comment on issues flagged by NewsGuard. NewsGuard also continued offering these websites the possibility to publish a full response to their ratings.

2. ISRAEL-IRAN WAR TRACKING CENTRE 2025
Over the course of several months after the start of the Israel-Iran 2025 War, NewsGuard monitored and added to its database new unreliable websites spreading false claims about the war. In doing so, NewsGuard continued using its transparent and apolitical evaluation process, whose methodology is detailed on its website, with all criteria clearly explained to publishers. Tracking these websites offered advertisers a way to avoid inadvertently funding them through programmatic advertisement.

Empowering Users

Outline approaches pertinent to this chapter, highlighting similarities/commonalities and differences with regular enforcement.

1. RUSSIA - UKRAINE CRISIS
In 2025, NewsGuard continued to closely monitor sources of Russian disinformation within the continent, constantly adding new sources to its Tracking Center and rating them according to its transparent rating system, so that users with access to its browser extension (a consumer product available to all for a monthly subscription fee) could make informed decisions about which sources to trust, and which to be wary of, when reading the news online as the war became a protracted one.

In a non-crisis situation, NewsGuard’s main editorial promise is to rate all news and information sites that account for 95% of online engagement with news. However, for this specific line of work, just as for every crisis situation, and as it did before for the COVID-19 pandemic, NewsGuard’s analysts went further, looking for any site spreading mis- and disinformation about the war in the languages we cover (English, French, Italian, and German), even those responsible for very little online engagement, and making sure we rated them. We also made sure to track all sources that spread the myths we were uncovering, in order to cover more sources.

In 2025, NewsGuard’s analysts participated in 21 media literacy seminars and awareness raising events in France, Italy, and Germany, and participated in 41 speaking engagements in Italy, France, Belgium, Ireland and Norway. Most touched on all relevant crises, including the Russia-Ukraine war. Throughout the year, NewsGuard’s analysts fed its browser extension with transparent analyses of sources spreading false claims about the Russia-Ukraine war. The analysts continued basing their ratings, as they always do, on NewsGuard’s transparent, apolitical and independent process, applying our nine criteria equally to all sources.

2. ISRAEL-IRAN WAR TRACKING CENTRE 2025
In 2025, NewsGuard ramped up efforts to identify, rate, and monitor sources of false claims and state-sponsored disinformation, constantly adding new sources to its Tracking Centers, and rating these sources according to its transparent rating system, so that users with access to its browser extension (a consumer product available to all for a monthly subscription fee) could make informed decisions about which sources to trust. NewsGuard’s global team of information integrity experts also regularly shared its findings on the false claims spreading about the war through its public newsletter, Reality Check.

In a non-crisis situation, NewsGuard’s main editorial promise is to rate all news and information sites that account for 95% of online engagement with news. However, for this specific line of work, just as for every crisis situation, and as it did before for the COVID-19 pandemic and the Russia-Ukraine war, as described above, NewsGuard’s analysts went further, looking for any site spreading false claims about the conflict in the languages we cover (English, French, Italian, and German), even those responsible for very little online engagement, and making sure we rated them. We also made sure to track all sources that spread the myths we were uncovering, in order to cover more sources.

In 2025, NewsGuard’s analysts participated in 21 media literacy seminars and awareness raising events in France, Italy, and Germany, and participated in 41 speaking engagements in Italy, France, Belgium, Ireland, and Norway. Most touched on all relevant crises, including the war. As stated above, throughout the year, NewsGuard’s analysts fed its browser extension with transparent analyses of sources spreading false claims about the war, and debunked the spreading claims through its tracking center. The analysts based their ratings, as they always do, on NewsGuard’s transparent, apolitical and independent process, applying our nine criteria equally to all sources.

3. RISE OF AI-GENERATED CONTENT AND FOREIGN INFLUENCE OF LARGE LANGUAGE MODELS
In 2025, NewsGuard’s analysts participated in 21 media literacy seminars and awareness raising events in France, Italy, and Germany, as well as 41 speaking engagements in Italy, France, Belgium, Ireland and Norway. Most touched on all relevant crises, including the rise of AI-generated content and the ways large language models can be weaponized by malign actors to spread state-sponsored propaganda and false narratives.

For example, in May 2025, one of our team members delivered a keynote at the AI Week in Italy, discussing how generative AI could contribute to Russian disinformation reaching Western news consumers more directly and effectively.

Empowering the Research Community

Outline approaches pertinent to this chapter, highlighting similarities/commonalities and differences with regular enforcement.

1. RUSSIA - UKRAINE CRISIS
In 2025, NewsGuard published three reports on the Russia-Ukraine crisis. The first, published in February 2025 and titled “Russia’s War on Ukraine: Three Years, Three Hundred and Two False Claims,” explored the then-302 provably false claims NewsGuard had identified and debunked since the start of the war. In March 2025, NewsGuard published “A Well-funded Moscow-based Global ‘News’ Network has Infected Western Artificial Intelligence Tools Worldwide with Russian Propaganda,” a widely cited report which found that the 10 leading commercial generative AI tools contributed to Moscow’s disinformation goals by repeating false claims from the pro-Kremlin Pravda network 33 percent of the time. In December 2025, a third report, titled “400 and Counting: A Russian Influence Operation Overtakes Official State Media in Spreading Russia-Ukraine False Claims,” expanded on NewsGuard’s earlier findings and found that secret, anonymous Russian influence operations had surpassed official state media as the biggest source of false narratives in 2025.

2. ISRAEL-IRAN WAR TRACKING CENTRE 2025
In 2025, on top of updating its Tracking Center, NewsGuard sent regular briefings on the war to its clients, including researchers, and to consumers.

3. RISE OF AI-GENERATED CONTENT AND FOREIGN INFLUENCE OF LARGE LANGUAGE MODELS
In 2025, NewsGuard published 8 AI False Claims Monitors, including one (in January 2025) that audited the chatbots in multiple Member State languages, namely French, German, Italian and Spanish. In its reports, NewsGuard described the research methodology of the analysis. All these reports were published on NewsGuard’s website, where they remain available.