Report March 2026
Integrity of Services
Commitment 14
In order to limit impermissible manipulative behaviours and practices across their services, Relevant Signatories commit to put in place or further bolster policies to address both misinformation and disinformation across their services, and to agree on a cross-service understanding of manipulative behaviours, actors and practices not permitted on their services. Such behaviours and practices include:
- The creation and use of fake accounts, account takeovers and bot-driven amplification;
- Hack-and-leak operations;
- Impersonation;
- Malicious deep fakes;
- The purchase of fake engagements;
- Non-transparent paid messages or promotion by influencers;
- The creation and use of accounts that participate in coordinated inauthentic behaviour;
- User conduct aimed at artificially amplifying the reach or perceived public support for disinformation.
We signed up to the following measures of this commitment
Measure 14.1 Measure 14.2 Measure 14.3
In line with this commitment, did you deploy new implementation measures (e.g. changes to your terms of service, new tools, new policies, etc)?
If yes, list these implementation measures here
Do you plan to put further implementation measures in place in the next 6 months to substantially improve the maturity of the implementation of this commitment?
If yes, which further implementation measures do you plan to put in place in the next 6 months?
Measure 14.1
Relevant Signatories will adopt, reinforce and implement clear policies regarding impermissible manipulative behaviours and practices on their services, based on the latest evidence on the conducts and tactics, techniques and procedures (TTPs) employed by malicious actors, such as the AMITT Disinformation Tactics, Techniques and Procedures Framework.
QRE 14.1.1
Relevant Signatories will list relevant policies and clarify how they relate to the threats mentioned above as well as to other Disinformation threats.
YouTube continues to assess, evaluate, and update its policies on a regular basis. The latest updated policies, including the Community Guidelines, can be found here.
QRE 14.1.2
Signatories will report on their proactive efforts to detect impermissible content, behaviours, TTPs and practices relevant to this commitment.
Measure 14.2
Relevant Signatories will keep a detailed, up-to-date list of their publicly available policies that clarifies behaviours and practices that are prohibited on their services and will outline in their reports how their respective policies and their implementation address the above set of TTPs, threats and harms as well as other relevant threats.
QRE 14.2.1
Relevant Signatories will report on actions taken to implement the policies they list in their reports and covering the range of TTPs identified/employed, at the Member State level.
SLI 14.2.1
Number of instances of identified TTPs and actions taken at the Member State level under policies addressing each of the TTPs as well as information on the type of content.
TTPs 1, 5, 7 and 9
(1) Number of channels identified and removed for TTP 1 during the reporting period, broken down by EEA Member State.
(2) Number of channels and videos identified and removed for TTP 5 during the reporting period, broken down by EEA Member State.
(3) Number of videos identified and removed for TTP 7 during the reporting period, broken down by EEA Member State.
(4) Number of channels and videos identified and removed for TTP 9 during the reporting period, broken down by EEA Member State.
| Country | TTP 1 - Channels identified and removed | TTP 5 - Channels identified and removed | TTP 5 - Videos identified and removed | TTP 7 - Videos identified and removed | TTP 9 - Channels identified and removed | TTP 9 - Videos identified and removed |
|---|---|---|---|---|---|---|
| Austria | 1,676 | 188 | 1 | 14 | 36 | 1 |
| Belgium | 1,507 | 395 | 822 | 12 | 47 | 9 |
| Bulgaria | 1,688 | 313 | 3 | 11 | 32 | 2 |
| Croatia | 367 | 121 | 4 | 4 | 14 | 0 |
| Cyprus | 76,212 | 98 | 111 | 6 | 25 | 24 |
| Czech Republic | 3,576 | 326 | 363 | 23 | 93 | 7 |
| Denmark | 1,954 | 151 | 1 | 3 | 26 | 6 |
| Estonia | 402 | 48 | 2 | 2 | 9 | 6 |
| Finland | 34,451 | 148 | 56 | 8 | 37 | 7 |
| France | 28,462 | 1,964 | 268 | 101 | 310 | 27 |
| Germany | 70,558 | 2,200 | 998 | 194 | 615 | 186 |
| Greece | 1,508 | 257 | 10 | 12 | 30 | 1 |
| Hungary | 602 | 209 | 33 | 5 | 26 | 2 |
| Ireland | 1,549 | 210 | 936 | 26 | 31 | 1 |
| Italy | 16,274 | 1,655 | 571 | 42 | 143 | 10 |
| Latvia | 7,581 | 65 | 0 | 4 | 30 | 3 |
| Lithuania | 1,271 | 86 | 268 | 2 | 22 | 3 |
| Luxembourg | 277 | 17 | 3 | 2 | 4 | 0 |
| Malta | 167 | 14 | 0 | 2 | 6 | 0 |
| Netherlands | 38,549 | 666 | 254 | 117 | 298 | 115 |
| Poland | 28,648 | 1,134 | 108 | 56 | 235 | 18 |
| Portugal | 925 | 320 | 26 | 13 | 32 | 2 |
| Romania | 4,525 | 887 | 98 | 18 | 86 | 6 |
| Slovakia | 579 | 140 | 2 | 3 | 22 | 0 |
| Slovenia | 153 | 48 | 0 | 0 | 13 | 1 |
| Spain | 8,053 | 1,269 | 2,309 | 72 | 197 | 12 |
| Sweden | 4,240 | 379 | 69 | 27 | 62 | 2 |
| Iceland | 136 | 12 | 1 | 0 | 2 | 0 |
| Liechtenstein | 8 | 1 | 0 | 0 | 1 | 0 |
| Norway | 1,036 | 235 | 19 | 5 | 40 | 1 |
| Total EU | 335,754 | 13,308 | 7,316 | 779 | 2,481 | 451 |
| Total EEA | 336,934 | 13,556 | 7,336 | 784 | 2,524 | 452 |
SLI 14.2.2
Views/impressions of and interaction/engagement at the Member State level (e.g. likes, shares, comments), related to each identified TTP, before and after action was taken.
TTPs 5, 7 and 9
(1) Video removals for TTP 5 during the reporting period by view-count threshold, broken down by EEA Member State.
(2) Video removals for TTP 7 during the reporting period by view-count threshold, broken down by EEA Member State.
(3) Video removals for TTP 9 during the reporting period by view-count threshold, broken down by EEA Member State.
Actions in this context constitute removals of the videos themselves; there are therefore no views, interactions, or engagement after YouTube removes the content.
| Country | TTP 5 - Number of videos removed with 0 views | TTP 5 - Number of videos removed with 1-10 views | TTP 5 - Number of videos removed with 11-100 views | TTP 5 - Number of videos removed with 101-1,000 views | TTP 5 - Number of videos removed with 1,001-10,000 views | TTP 5 - Number of videos removed with >10,000 views | TTP 7 - Number of videos removed with 0 views | TTP 7 - Number of videos removed with 1-10 views | TTP 7 - Number of videos removed with 11-100 views | TTP 7 - Number of videos removed with 101-1,000 views | TTP 7 - Number of videos removed with 1,001-10,000 views | TTP 7 - Number of videos removed with >10,000 views | TTP 9 - Number of videos removed with 0 views | TTP 9 - Number of videos removed with 1-10 views | TTP 9 - Number of videos removed with 11-100 views | TTP 9 - Number of videos removed with 101-1,000 views | TTP 9 - Number of videos removed with 1,001-10,000 views | TTP 9 - Number of videos removed with >10,000 views |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| Austria | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 5 | 3 | 4 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 |
| Belgium | 11 | 1 | 3 | 719 | 84 | 4 | 1 | 4 | 2 | 2 | 2 | 1 | 0 | 0 | 0 | 2 | 3 | 4 |
| Bulgaria | 1 | 0 | 2 | 0 | 0 | 0 | 0 | 4 | 2 | 4 | 1 | 0 | 0 | 0 | 0 | 1 | 1 | 0 |
| Croatia | 0 | 1 | 0 | 0 | 3 | 0 | 0 | 1 | 1 | 2 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| Cyprus | 20 | 10 | 56 | 24 | 1 | 0 | 1 | 3 | 0 | 2 | 0 | 0 | 0 | 0 | 3 | 1 | 15 | 5 |
| Czech Republic | 120 | 25 | 110 | 98 | 10 | 0 | 4 | 9 | 4 | 4 | 1 | 1 | 1 | 0 | 0 | 2 | 2 | 2 |
| Denmark | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 1 | 0 | 0 | 0 | 2 | 2 | 1 | 1 |
| Estonia | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 0 | 2 | 0 | 2 | 2 | 0 |
| Finland | 15 | 0 | 15 | 15 | 9 | 2 | 0 | 1 | 3 | 2 | 2 | 0 | 1 | 0 | 1 | 4 | 0 | 1 |
| France | 54 | 34 | 65 | 82 | 30 | 3 | 13 | 44 | 16 | 17 | 7 | 4 | 2 | 0 | 3 | 2 | 5 | 15 |
| Germany | 146 | 51 | 197 | 286 | 287 | 31 | 22 | 70 | 35 | 27 | 26 | 14 | 3 | 2 | 5 | 50 | 87 | 39 |
| Greece | 4 | 1 | 1 | 1 | 2 | 1 | 1 | 3 | 2 | 3 | 2 | 1 | 0 | 0 | 0 | 0 | 1 | 0 |
| Hungary | 16 | 0 | 4 | 12 | 1 | 0 | 4 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 | 0 |
| Ireland | 5 | 4 | 15 | 161 | 621 | 130 | 0 | 12 | 4 | 4 | 4 | 2 | 0 | 0 | 0 | 1 | 0 | 0 |
| Italy | 56 | 33 | 171 | 202 | 75 | 34 | 4 | 20 | 9 | 6 | 2 | 1 | 0 | 2 | 1 | 4 | 1 | 2 |
| Latvia | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 0 | 2 | 0 | 1 | 0 | 1 | 1 | 0 |
| Lithuania | 0 | 0 | 2 | 0 | 9 | 257 | 0 | 0 | 0 | 0 | 0 | 2 | 0 | 0 | 1 | 0 | 2 | 0 |
| Luxembourg | 1 | 0 | 0 | 0 | 0 | 2 | 0 | 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| Malta | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 2 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| Netherlands | 26 | 34 | 45 | 84 | 36 | 29 | 19 | 44 | 24 | 18 | 4 | 8 | 4 | 2 | 6 | 19 | 63 | 21 |
| Poland | 15 | 14 | 38 | 27 | 10 | 4 | 9 | 18 | 10 | 8 | 5 | 6 | 1 | 3 | 1 | 2 | 4 | 7 |
| Portugal | 9 | 1 | 2 | 14 | 0 | 0 | 2 | 5 | 2 | 3 | 1 | 0 | 0 | 0 | 0 | 1 | 0 | 1 |
| Romania | 14 | 24 | 13 | 5 | 20 | 22 | 5 | 6 | 4 | 2 | 0 | 1 | 0 | 0 | 2 | 0 | 2 | 2 |
| Slovakia | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 2 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| Slovenia | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 |
| Spain | 93 | 143 | 815 | 905 | 263 | 90 | 13 | 22 | 12 | 13 | 5 | 7 | 0 | 0 | 1 | 0 | 5 | 6 |
| Sweden | 22 | 3 | 13 | 21 | 8 | 2 | 7 | 13 | 4 | 1 | 2 | 0 | 0 | 1 | 0 | 1 | 0 | 0 |
| Iceland | 0 | 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| Liechtenstein | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| Norway | 8 | 0 | 1 | 9 | 1 | 0 | 0 | 1 | 3 | 1 | 0 | 0 | 0 | 0 | 0 | 1 | 0 | 0 |
| Total EU | 630 | 380 | 1,567 | 2,657 | 1,470 | 612 | 107 | 289 | 142 | 124 | 67 | 50 | 13 | 14 | 26 | 95 | 197 | 106 |
| Total EEA | 638 | 380 | 1,568 | 2,667 | 1,471 | 612 | 107 | 290 | 145 | 125 | 67 | 50 | 13 | 14 | 26 | 95 | 198 | 106 |
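As context for the view-count thresholds used in this SLI, the per-bin counts can be reproduced by bucketing each removed video's view count at removal time. The sketch below is illustrative only: the `view_bin` helper and the sample view counts are assumptions, not YouTube tooling; only the bin edges are taken from the column headers.

```python
import bisect

# View-count bins taken from the SLI 14.2.2 column headers:
# 0, 1-10, 11-100, 101-1,000, 1,001-10,000, >10,000.
BIN_LABELS = ["0", "1-10", "11-100", "101-1,000", "1,001-10,000", ">10,000"]
BIN_UPPER = [0, 10, 100, 1_000, 10_000]  # inclusive upper bounds; last bin is open-ended

def view_bin(views: int) -> str:
    """Return the SLI 14.2.2-style bin label for a removal-time view count."""
    return BIN_LABELS[bisect.bisect_left(BIN_UPPER, views)]

# Hypothetical removal-time view counts for a handful of removed videos.
counts: dict[str, int] = {}
for v in [0, 7, 55, 999, 10_001]:
    counts[view_bin(v)] = counts.get(view_bin(v), 0) + 1
print(counts)
```

Summing a Member State's bins for a given TTP reproduces that state's total video removals reported in SLI 14.2.1.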
SLI 14.2.3
Metrics to estimate the penetration and impact that e.g. Fake/Inauthentic accounts have on genuine users and report at the Member State level (including trends on audiences targeted; narratives used etc.).
TTPs 5, 7 and 9
Refer to SLI 14.2.2, which provides data on video removals by view-count threshold and views/impressions on the platform after action has been taken. Views are a measure of penetration/impact on the platform.
SLI 14.2.4
Estimation, at the Member State level, of TTPs related content, views/impressions and interaction/engagement with such content as a percentage of the total content, views/impressions and interaction/engagement on relevant signatories' service.
TTP 1
(1) Percentage of TTP 1 channel removals out of all related channel removals during the reporting period, broken down by EEA Member State.
Refer to the Community Guidelines enforcement report for more information regarding removed violative content.
TTP 5
(2) Percentage of TTP 5 channel removals out of all related channel removals during the reporting period, broken down by EEA Member State.
Refer to the Community Guidelines enforcement report for more information regarding removed violative content.
TTP 7
(3) Percentage of TTP 7 video removals out of all related video removals during the reporting period, broken down by EEA Member State.
Refer to the Community Guidelines enforcement report for more information regarding removed violative videos.
TTP 9
(4) Percentage of TTP 9 channel removals out of all related channel removals during the reporting period, broken down by EEA Member State.
Refer to the Community Guidelines enforcement report for more information regarding removed violative videos.
| Country | TTP OR ACTION 1 - Percentage of TTP 1 channel removals out of all related channel removals | TTP OR ACTION 5 - Percentage of TTP 5 channel removals out of all related channel removals | TTP OR ACTION 5 - Percentage of TTP 5 video removals out of all related video removals | TTP OR ACTION 7 - Percentage of TTP 7 video removals out of all related video removals | TTP OR ACTION 9 - Percentage of TTP 9 channel removals out of all related channel removals | TTP OR ACTION 9 - Percentage of TTP 9 video removals out of all related video removals |
|---|---|---|---|---|---|---|
| Austria | 15.20% | 1.70% | 0.01% | 0.09% | 0.33% | 0.01% |
| Belgium | 30.69% | 8.04% | 3.13% | 0.05% | 0.96% | 0.03% |
| Bulgaria | 20.07% | 3.72% | 0.01% | 0.04% | 0.38% | 0.01% |
| Croatia | 17.95% | 5.92% | 0.06% | 0.06% | 0.68% | 0.00% |
| Cyprus | 59.74% | 0.08% | 1.33% | 0.07% | 0.02% | 0.29% |
| Czech Republic | 36.74% | 3.35% | 0.71% | 0.04% | 0.96% | 0.01% |
| Denmark | 39.77% | 3.07% | 0.01% | 0.02% | 0.53% | 0.04% |
| Estonia | 15.04% | 1.80% | 0.03% | 0.03% | 0.34% | 0.10% |
| Finland | 31.86% | 0.14% | 0.47% | 0.07% | 0.03% | 0.06% |
| France | 19.52% | 1.35% | 0.18% | 0.07% | 0.21% | 0.02% |
| Germany | 19.82% | 0.62% | 0.59% | 0.11% | 0.17% | 0.11% |
| Greece | 32.65% | 5.57% | 0.06% | 0.07% | 0.65% | 0.01% |
| Hungary | 25.17% | 8.74% | 0.17% | 0.03% | 1.09% | 0.01% |
| Ireland | 26.39% | 3.58% | 4.44% | 0.12% | 0.53% | 0.00% |
| Italy | 41.07% | 4.18% | 0.50% | 0.04% | 0.36% | 0.01% |
| Latvia | 24.57% | 0.21% | 0.00% | 0.04% | 0.10% | 0.03% |
| Lithuania | 33.39% | 2.26% | 2.52% | 0.02% | 0.58% | 0.03% |
| Luxembourg | 17.98% | 1.10% | 0.24% | 0.16% | 0.26% | 0.00% |
| Malta | 13.14% | 1.10% | 0.00% | 0.12% | 0.47% | 0.00% |
| Netherlands | 20.53% | 0.35% | 0.30% | 0.14% | 0.16% | 0.13% |
| Poland | 37.77% | 1.50% | 0.12% | 0.06% | 0.31% | 0.02% |
| Portugal | 15.35% | 5.31% | 0.08% | 0.04% | 0.53% | 0.01% |
| Romania | 38.00% | 7.45% | 0.11% | 0.02% | 0.72% | 0.01% |
| Slovakia | 26.56% | 6.42% | 0.01% | 0.02% | 1.01% | 0.00% |
| Slovenia | 16.76% | 5.26% | 0.00% | 0.00% | 1.42% | 0.03% |
| Spain | 22.06% | 3.48% | 1.72% | 0.05% | 0.54% | 0.01% |
| Sweden | 20.35% | 1.82% | 0.26% | 0.10% | 0.30% | 0.01% |
| Iceland | 24.55% | 2.17% | 0.10% | 0.00% | 0.36% | 0.00% |
| Liechtenstein | 9.09% | 1.14% | 0.00% | 0.00% | 1.14% | 0.00% |
| Norway | 31.20% | 7.08% | 0.11% | 0.03% | 1.20% | 0.01% |
| Total EU | 27.67% | 1.10% | 0.63% | 0.07% | 0.20% | 0.04% |
| Total EEA | 27.68% | 1.11% | 0.62% | 0.07% | 0.21% | 0.04% |
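To make the derivation of a figure like the percentages in this SLI concrete, here is a minimal sketch of computing a removal share from raw counts. The `removal_share` helper and the sample numbers are hypothetical; only the two-decimal-place rounding convention is taken from the table.

```python
def removal_share(ttp_removals: int, all_removals: int) -> str:
    """Share of TTP-specific removals out of all related removals, as 'NN.NN%'."""
    if all_removals == 0:  # guard for Member States with no removals at all
        return "0.00%"
    return f"{100 * ttp_removals / all_removals:.2f}%"

# Hypothetical example: 188 TTP-specific channel removals out of 1,237 total.
print(removal_share(188, 1237))  # -> 15.20%
```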
Measure 14.3
Relevant Signatories will convene via the Permanent Task-force to agree upon and publish a list and terminology of TTPs employed by malicious actors, which should be updated on an annual basis.
QRE 14.3.1
Signatories will report on the list of TTPs agreed in the Permanent Task-force within 6 months of the signing of the Code and will update this list at least every year. They will also report about the common baseline elements, objectives and benchmarks for the policies and measures.
The final list of TTPs agreed within the Permanent Task-force in H2 2022 was used by Signatories as part of their reports from then on, as intended. The Permanent Task-force will continue to examine and update the list as necessary in light of technical advancements and evolving disinformation tactics.
Commitment 15
Relevant Signatories that develop or operate AI systems and that disseminate AI-generated and manipulated content through their services (e.g. deepfakes) commit to take into consideration the transparency obligations and the list of manipulative practices prohibited under the proposal for Artificial Intelligence Act.
We signed up to the following measures of this commitment
Measure 15.1 Measure 15.2
In line with this commitment, did you deploy new implementation measures (e.g. changes to your terms of service, new tools, new policies, etc)?
If yes, list these implementation measures here
Do you plan to put further implementation measures in place in the next 6 months to substantially improve the maturity of the implementation of this commitment?
If yes, which further implementation measures do you plan to put in place in the next 6 months?
Measure 15.1
Relevant signatories will establish or confirm their policies in place for countering prohibited manipulative practices for AI systems that generate or manipulate content, such as warning users and proactively detecting such content.
QRE 15.1.1
In line with EU and national legislation, Relevant Signatories will report on their policies in place for countering prohibited manipulative practices for AI systems that generate or manipulate content.
YouTube's Community Guidelines include the following relevant policies:
- Spam, Deceptive Practices, and Scams policies, which prohibit, for example, spam, scams, and other deceptive practices that take advantage of the YouTube community;
- Impersonation;
- Fake Engagement.
Measure 15.2
Relevant Signatories will establish or confirm their policies in place to ensure that the algorithms used for detection, moderation and sanctioning of impermissible conduct and content on their services are trustworthy, respect the rights of end-users and do not constitute prohibited manipulative practices impermissibly distorting their behaviour in line with Union and Member States legislation.
QRE 15.2.1
Relevant Signatories will report on their policies and actions to ensure that the algorithms used for detection, moderation and sanctioning of impermissible conduct and content on their services are trustworthy, respect the rights of end-users and do not constitute prohibited manipulative practices in line with Union and Member States legislation.
Commitment 16
Relevant Signatories commit to operate channels of exchange between their relevant teams in order to proactively share information about cross-platform influence operations, foreign interference in information space and relevant incidents that emerge on their respective services, with the aim of preventing dissemination and resurgence on other services, in full compliance with privacy legislation and with due consideration for security and human rights risks.
We signed up to the following measures of this commitment
Measure 16.1 Measure 16.2
In line with this commitment, did you deploy new implementation measures (e.g. changes to your terms of service, new tools, new policies, etc)?
If yes, list these implementation measures here
Google’s Threat Intelligence Group (GTIG) published its Q3 2025 and Q4 2025 Quarterly Bulletins, which provide updates on coordinated influence operation campaigns terminated on Google’s platforms.
Do you plan to put further implementation measures in place in the next 6 months to substantially improve the maturity of the implementation of this commitment?
If yes, which further implementation measures do you plan to put in place in the next 6 months?
Measure 16.1
Relevant Signatories will share relevant information about cross-platform information manipulation, foreign interference in information space and incidents that emerge on their respective services for instance via a dedicated sub-group of the permanent Task-force or via existing fora for exchanging such information.
QRE 16.1.1
Relevant Signatories will disclose the fora they use for information sharing as well as information about learnings derived from this sharing.
See Google’s disclosure policies on handling security vulnerabilities, aimed at developers and security professionals.
SLI 16.1.1
Number of actions taken as a result of the collaboration and information sharing between signatories. Where they have such information, they will specify which Member States that were affected (including information about the content being detected and acted upon due to this collaboration).
YouTube
The publicly available TAG Bulletins that were published for the reporting period show:
- The number of actions taken on YouTube channels involved in Coordinated Influence Operation Campaigns.
- The languages of the uploaded content that were part of campaigns.
- Brief descriptions of the campaigns.
- Instances when industry partners supported YouTube’s actions by providing leads.
Certain campaigns may have uploaded content in multiple languages, or in countries outside the EEA using EEA languages. Note that a single coordinated influence campaign may involve many languages, and the presence of content in an EEA Member State language does not necessarily entail a particular focus on that Member State.
The TAG Bulletin and periodic blog posts are Google’s, including YouTube’s, primary public source of information on coordinated influence operations and TTP-related issues.
The EU Code of Conduct on Disinformation Rapid Response System (RRS) is a collaborative initiative involving both non-platform and platform Signatories of the Code of Conduct, providing a means for cooperation and communication between them ahead of, during, and after election periods.
The RRS allows non-platform Signatories of the Code of Conduct to report time-sensitive content or accounts that they deem may present serious or systemic concerns to the integrity of the electoral process, and enables discussion with the platform Signatories in light of their respective policies.
The disclosures below also include reporting through the RRS of allegedly illegal content. Although the Article 16 Digital Services Act (DSA) mechanism should be used by non-platform Signatories to report allegedly illegal content, Google reviews such notifications, too, as part of the RRS, provided the non-platform Signatory has already used the Article 16 DSA mechanism to submit them and shares the appropriate notification reference with Google through the RRS.
Search
- Czech Republic - no notifications were received through RRS.
- Ireland - no notifications were received through RRS.
- The Netherlands - no notifications were received through RRS.
- Portugal - no notifications were received through RRS.
- Moldova - no notifications were received through RRS.
YouTube
- Czech Republic - no notifications were received through RRS.
- Ireland - 31 notifications were received through RRS;
  - 24 flags were found to be non-violative;
  - 7 flags led to the removal of content or accounts.
- The Netherlands - no notifications were received through RRS.
- Portugal - no notifications were received through RRS.
- Moldova - 142 notifications were received through RRS;
  - 86 flags were found to be non-violative;
  - 56 flags led to the removal of content or accounts.
Measure 16.2
Relevant Signatories will pay specific attention to and share information on the tactical migration of known actors of misinformation, disinformation and information manipulation across different platforms as a way to circumvent moderation policies, engage different audiences or coordinate action on platforms with less scrutiny and policy bandwidth.
QRE 16.2.1
As a result of the collaboration and information sharing between them, Relevant Signatories will share qualitative examples and case studies of migration tactics employed and advertised by such actors on their platforms as observed by their moderation team and/or external partners from Academia or fact-checking organisations engaged in such monitoring.
The most recent examples of specific tactics, techniques, and procedures (TTPs) used to lure victims, as well as how Google collaborates and shares information, can be found in Google’s TAG Blog and Threat Intelligence website.
Empowering Users
Commitment 17
In light of the European Commission's initiatives in the area of media literacy, including the new Digital Education Action Plan, Relevant Signatories commit to continue and strengthen their efforts in the area of media literacy and critical thinking, also with the aim to include vulnerable groups.
We signed up to the following measures of this commitment
Measure 17.1 Measure 17.2 Measure 17.3
In line with this commitment, did you deploy new implementation measures (e.g. changes to your terms of service, new tools, new policies, etc)?
If yes, list these implementation measures here
Do you plan to put further implementation measures in place in the next 6 months to substantially improve the maturity of the implementation of this commitment?
If yes, which further implementation measures do you plan to put in place in the next 6 months?
Measure 17.1
Relevant Signatories will design and implement or continue to maintain tools to improve media literacy and critical thinking, for instance by empowering users with context on the content visible on services or with guidance on how to evaluate online content.
QRE 17.1.1
Relevant Signatories will outline the tools they develop or maintain that are relevant to this commitment and report on their deployment in each Member State.
SLI 17.1.1
Relevant Signatories will report, at the Member State level, on metrics pertinent to assessing the effects of the tools described in the qualitative reporting element for Measure 17.1, which will include: the total count of impressions of the tool; and information on the interactions/engagement with the tool.
| Country | Impressions of information panels | Impressions on labels indicating altered or synthetic content |
|---|---|---|
| Austria | 35,436,900 | 97,074,600 |
| Belgium | 139,752,900 | 99,382,200 |
| Bulgaria | 52,956,700 | 61,140,600 |
| Croatia | 52,579,800 | 41,299,300 |
| Cyprus | 4,353,000 | 15,044,500 |
| Czech Republic | 117,762,400 | 108,281,500 |
| Denmark | 18,210,000 | 50,458,400 |
| Estonia | 16,565,900 | 19,797,300 |
| Finland | 16,550,200 | 55,616,500 |
| France | 860,771,700 | 700,025,700 |
| Germany | 1,817,072,400 | 1,053,238,700 |
| Greece | 28,401,500 | 86,025,500 |
| Hungary | 44,661,000 | 54,791,100 |
| Ireland | 66,024,000 | 88,340,500 |
| Italy | 355,847,700 | 692,892,600 |
| Latvia | 46,580,400 | 38,949,600 |
| Lithuania | 49,898,800 | 37,585,600 |
| Luxembourg | 2,629,900 | 7,255,000 |
| Malta | 2,572,500 | 6,504,700 |
| Netherlands | 457,391,700 | 325,040,900 |
| Poland | 199,452,000 | 490,280,400 |
| Portugal | 23,057,900 | 120,349,000 |
| Romania | 89,976,900 | 163,318,700 |
| Slovakia | 23,796,300 | 38,073,100 |
| Slovenia | 15,106,100 | 18,283,300 |
| Spain | 353,322,900 | 752,917,000 |
| Sweden | 94,865,900 | 112,427,500 |
| Iceland | 1,098,900 | 4,603,500 |
| Liechtenstein | 181,600 | 462,300 |
| Norway | 23,963,100 | 76,312,900 |
| Total EU | 4,985,597,400 | 5,334,393,800 |
| Total EEA | 5,010,841,000 | 5,415,772,500 |
Measure 17.2
Relevant Signatories will develop, promote and/or support or continue to run activities to improve media literacy and critical thinking such as campaigns to raise awareness about Disinformation, as well as the TTPs that are being used by malicious actors, among the general public across the European Union, also considering the involvement of vulnerable communities.
QRE 17.2.1
Relevant Signatories will describe the activities they launch or support and the Member States they target and reach. Relevant signatories will further report on actions taken to promote the campaigns to their user base per Member States targeted.
SLI 17.2.1
Relevant Signatories report on number of media literacy and awareness raising activities organised and or participated in and will share quantitative information pertinent to show the effects of the campaigns they build or support at the Member State level.
| Country | Impressions from YouTube's media literacy campaigns |
|---|---|
| Austria | 3,107,142 |
| Belgium | 2,038,464 |
| Bulgaria | 2,506,221 |
| Croatia | 1,941,695 |
| Cyprus | 200,595 |
| Czech Republic | 4,772,598 |
| Denmark | 1,832,197 |
| Estonia | 251,623 |
| Finland | 1,853,365 |
| France | 28,332,778 |
| Germany | 28,667,395 |
| Greece | 4,168,868 |
| Hungary | 4,028,295 |
| Ireland | 1,854,044 |
| Italy | 22,611,323 |
| Latvia | 410,984 |
| Lithuania | 976,023 |
| Luxembourg | 197,541 |
| Malta | 409,677 |
| Netherlands | 6,223,229 |
| Poland | 16,142,521 |
| Portugal | 4,463,198 |
| Romania | 7,302,666 |
| Slovakia | 2,024,229 |
| Slovenia | 687,317 |
| Spain | 23,417,687 |
| Sweden | 3,712,435 |
| Iceland | 239,963 |
| Liechtenstein | 23,369 |
| Norway | 1,559,768 |
| Total EU | 174,134,110 |
| Total EEA | 175,957,210 |
Measure 17.3
For both of the above Measures, and in order to build on the expertise of media literacy experts in the design, implementation, and impact measurement of tools, relevant Signatories will partner or consult with media literacy experts in the EU, including for instance the Commission's Media Literacy Expert Group, ERGA's Media Literacy Action Group, EDMO, its country-specific branches, or relevant Member State universities or organisations that have relevant expertise.
QRE 17.3.1
Relevant Signatories will describe how they involved and partnered with media literacy experts for the purposes of all Measures in this Commitment.
Commitment 18
Relevant Signatories commit to minimise the risks of viral propagation of Disinformation by adopting safe design practices as they develop their systems, policies, and features.
We signed up to the following measures of this commitment
Measure 18.2 Measure 18.3
In line with this commitment, did you deploy new implementation measures (e.g. changes to your terms of service, new tools, new policies, etc)?
If yes, list these implementation measures here
Do you plan to put further implementation measures in place in the next 6 months to substantially improve the maturity of the implementation of this commitment?
If yes, which further implementation measures do you plan to put in place in the next 6 months?
Measure 18.2
Relevant Signatories will develop and enforce publicly documented, proportionate policies to limit the spread of harmful false or misleading information (as depends on the service, such as prohibiting, downranking, or not recommending harmful false or misleading information, adapted to the severity of the impacts and with due regard to freedom of expression and information); and take action on webpages or actors that persistently violate these policies.
QRE 18.2.1
Relevant Signatories will report on the policies or terms of service that are relevant to Measure 18.2 and on their approach towards persistent violations of these policies.
SLI 18.2.1
Relevant Signatories will report on actions taken in response to violations of policies relevant to Measure 18.2, at the Member State level. The metrics shall include: Total number of violations and Meaningful metrics to measure the impact of these actions (such as their impact on the visibility of or the engagement with content that was actioned upon).
| Country | Number of videos removed | Number of videos removed with 0 views | Number of videos removed with 1-10 views | Number of videos removed with 11-100 views | Number of videos removed with 101-1,000 views | Number of videos removed with 1,001-10,000 views | Number of videos removed with >10,000 views |
|---|---|---|---|---|---|---|---|
| Austria | 85 | 5 | 18 | 17 | 23 | 14 | 8 |
| Belgium | 34 | 4 | 11 | 10 | 5 | 3 | 1 |
| Bulgaria | 48 | 18 | 10 | 4 | 7 | 8 | 1 |
| Croatia | 7 | 0 | 3 | 2 | 2 | 0 | 0 |
| Cyprus | 17 | 3 | 5 | 1 | 4 | 2 | 2 |
| Czech Republic | 55 | 10 | 18 | 10 | 9 | 5 | 3 |
| Denmark | 12 | 1 | 7 | 0 | 2 | 1 | 1 |
| Estonia | 78 | 1 | 3 | 3 | 8 | 63 | 0 |
| Finland | 22 | 3 | 9 | 4 | 2 | 4 | 0 |
| France | 235 | 29 | 69 | 55 | 41 | 29 | 12 |
| Germany | 799 | 89 | 188 | 161 | 171 | 125 | 65 |
| Greece | 24 | 2 | 5 | 3 | 4 | 6 | 4 |
| Hungary | 16 | 4 | 2 | 4 | 4 | 2 | 0 |
| Ireland | 84 | 17 | 25 | 18 | 13 | 8 | 3 |
| Italy | 76 | 8 | 29 | 18 | 11 | 5 | 5 |
| Latvia | 34 | 3 | 7 | 5 | 10 | 7 | 2 |
| Lithuania | 17 | 1 | 3 | 5 | 3 | 3 | 2 |
| Luxembourg | 3 | 0 | 2 | 0 | 1 | 0 | 0 |
| Malta | 2 | 0 | 2 | 0 | 0 | 0 | 0 |
| Netherlands | 216 | 35 | 82 | 44 | 32 | 8 | 15 |
| Poland | 115 | 26 | 33 | 19 | 15 | 13 | 9 |
| Portugal | 50 | 7 | 14 | 13 | 12 | 4 | 0 |
| Romania | 36 | 7 | 12 | 7 | 4 | 3 | 3 |
| Slovakia | 9 | 4 | 2 | 2 | 0 | 1 | 0 |
| Slovenia | 15 | 4 | 2 | 1 | 4 | 4 | 0 |
| Spain | 648 | 76 | 154 | 149 | 145 | 85 | 39 |
| Sweden | 52 | 10 | 20 | 10 | 7 | 5 | 0 |
| Iceland | 1 | 0 | 0 | 0 | 0 | 0 | 1 |
| Liechtenstein | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| Norway | 30 | 15 | 6 | 5 | 3 | 1 | 0 |
| Total EU | 2,789 | 367 | 735 | 565 | 539 | 408 | 175 |
| Total EEA | 2,820 | 382 | 741 | 570 | 542 | 409 | 176 |
Measure 18.3
Relevant Signatories will invest and/or participate in research efforts on the spread of harmful Disinformation online and related safe design practices, will make findings available to the public or report on those to the Code's taskforce. They will disclose and discuss findings within the permanent Task-force, and explain how they intend to use these findings to improve existing safe design practices and features or develop new ones.
QRE 18.3.1
Relevant Signatories will describe research efforts, both in-house and in partnership with third-party organisations, on the spread of harmful Disinformation online and relevant safe design practices, as well as actions or changes as a result of this research. Relevant Signatories will include where possible information on financial investments in said research. Wherever possible, they will make their findings available to the general public.
Commitment 19
Relevant Signatories using recommender systems commit to make them transparent to the recipients regarding the main criteria and parameters used for prioritising or deprioritising information, and provide options to users about recommender systems, and make available information on those options.
We signed up to the following measures of this commitment
Measure 19.1 Measure 19.2
In line with this commitment, did you deploy new implementation measures (e.g. changes to your terms of service, new tools, new policies, etc)?
If yes, list these implementation measures here
Do you plan to put further implementation measures in place in the next 6 months to substantially improve the maturity of the implementation of this commitment?
If yes, which further implementation measures do you plan to put in place in the next 6 months?
Measure 19.1
Relevant Signatories will make available to their users, including through the Transparency Centre and in their terms and conditions, in a clear, accessible and easily comprehensible manner, information outlining the main parameters their recommender systems employ.
QRE 19.1.1
Relevant Signatories will provide details of the policies and measures put in place to implement the above-mentioned measures accessible to EU users, especially by publishing information outlining the main parameters their recommender systems employ in this regard. This information should also be included in the Transparency Centre.
- Homepage: A user’s homepage is what they typically see when they first open YouTube.
- Up Next: The Up Next panel appears when a user is watching a video. It suggests additional content based on what they are currently watching and personalised signals (details below).
- Shorts: Shorts are ranked based on their performance and personalisation.
- Watch history: YouTube’s system uses the videos a user watches to give better recommendations, remember where a user left off, and more.
- Search history: YouTube’s system uses what a user searches for on YouTube to influence future recommendations.
- Channel subscriptions: YouTube’s system uses information about the channels a user subscribes to in order to recommend videos they may like.
- Likes: YouTube’s system uses a user’s likes information to try to predict the likelihood that they will be interested in similar videos in the future.
- Dislikes: YouTube’s system uses videos a user dislikes to inform what to avoid recommending in the future.
- 'Not interested' feedback selections: YouTube’s system uses videos a user marks as 'Not interested' to inform what to avoid recommending in the future.
- 'Don’t recommend channel' feedback selections: YouTube’s system uses 'Don’t recommend channel' feedback selections as a signal that the channel content likely is not something a user enjoyed watching.
Additional information about how a user can manage their recommendation settings is outlined here in YouTube’s Help Centre.
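The signals listed above can be thought of as inputs to a ranking function. The following sketch is purely illustrative and is not YouTube's actual system; the signal names, weights, and scoring logic are invented solely to show how positive signals (watch history, subscriptions) can raise a candidate's score while negative feedback (dislikes, 'Not interested') suppresses it.

```python
from dataclasses import dataclass, field

@dataclass
class UserSignals:
    """Illustrative per-user signals of the kinds described above (hypothetical)."""
    watched_topics: set = field(default_factory=set)       # from watch history
    searched_topics: set = field(default_factory=set)      # from search history
    subscribed_channels: set = field(default_factory=set)  # channel subscriptions
    disliked_channels: set = field(default_factory=set)    # dislikes / 'Don't recommend channel'
    not_interested_topics: set = field(default_factory=set)

def score(video_topic: str, channel: str, s: UserSignals) -> float:
    """Toy ranking score: positive signals raise it, negative feedback zeroes it out."""
    if channel in s.disliked_channels or video_topic in s.not_interested_topics:
        return 0.0  # negative feedback: avoid recommending
    value = 1.0     # baseline (invented weight)
    if video_topic in s.watched_topics:
        value += 2.0
    if video_topic in s.searched_topics:
        value += 1.0
    if channel in s.subscribed_channels:
        value += 3.0
    return value

signals = UserSignals(watched_topics={"chess"}, subscribed_channels={"ChessChannel"})
print(score("chess", "ChessChannel", signals))  # 6.0
```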
Measure 19.2
Relevant Signatories will provide options for the recipients of the service to select and to modify at any time their preferred options for relevant recommender systems, including giving users transparency about those options.
SLI 19.2.1
Relevant Signatories will provide aggregated information on effective user settings, such as the number of times users have actively engaged with these settings within the reporting period or over a sample representative timeframe, and clearly denote shifts in configuration patterns.
| Country | |
|---|---|
| Austria | |
| Belgium | |
| Bulgaria | |
| Croatia | |
| Cyprus | |
| Czech Republic | |
| Denmark | |
| Estonia | |
| Finland | |
| France | |
| Germany | |
| Greece | |
| Hungary | |
| Ireland | |
| Italy | |
| Latvia | |
| Lithuania | |
| Luxembourg | |
| Malta | |
| Netherlands | |
| Poland | |
| Portugal | |
| Romania | |
| Slovakia | |
| Slovenia | |
| Spain | |
| Sweden | |
| Iceland | |
| Liechtenstein | |
| Norway | |
Commitment 22
Relevant Signatories commit to provide users with tools to help them make more informed decisions when they encounter online information that may be false or misleading, and to facilitate user access to tools and information to assess the trustworthiness of information sources, such as indicators of trustworthiness for informed online navigation, particularly relating to societal issues or debates of general interest.
We signed up to the following measures of this commitment
Measure 22.7
In line with this commitment, did you deploy new implementation measures (e.g. changes to your terms of service, new tools, new policies, etc)?
If yes, list these implementation measures here
Do you plan to put further implementation measures in place in the next 6 months to substantially improve the maturity of the implementation of this commitment?
If yes, which further implementation measures do you plan to put in place in the next 6 months?
Measure 22.7
Relevant Signatories will design and apply products and features (e.g. information panels, banners, pop-ups, maps and prompts, trustworthiness indicators) that lead users to authoritative sources on topics of particular public and societal interest or in crisis situations.
QRE 22.7.1
Relevant Signatories will outline the products and features they deploy across their services and will specify whether those are available across Member States.
- Panels on topics prone to misinformation: Topics that are prone to misinformation, such as the moon landing, may display an information panel at the top of search results or under a video. These information panels show basic background information, sourced from independent, third-party partners, to give more context on a topic. The panels also link to the third-party partner’s website. YouTube continues to assess and update the topics prone to misinformation that receive additional context from information panels. More details can be found here.
- Election information panels: The election-related features are only available in select countries/regions during election cycles. Users may see candidate information panels, voting information panels, election integrity information panels, or election results information panels. More details can be found here.
SLI 22.7.1
Relevant Signatories will report on the reach and/or user interactions with the products or features, at the Member State level, via the metrics of impressions and interactions (clicks, click-through rates (as relevant to the tools and services in question) and shares (as relevant to the tools and services in question)).
| Country | ||
|---|---|---|
| Austria | ||
| Belgium | ||
| Bulgaria | ||
| Croatia | ||
| Cyprus | ||
| Czech Republic | ||
| Denmark | ||
| Estonia | ||
| Finland | ||
| France | ||
| Germany | ||
| Greece | ||
| Hungary | ||
| Ireland | ||
| Italy | ||
| Latvia | ||
| Lithuania | ||
| Luxembourg | ||
| Malta | ||
| Netherlands | ||
| Poland | ||
| Portugal | ||
| Romania | ||
| Slovakia | ||
| Slovenia | ||
| Spain | ||
| Sweden | ||
| Iceland | ||
| Liechtenstein | ||
| Norway | ||
| Total EU | ||
| Total EEA |
Commitment 23
Relevant Signatories commit to provide users with the functionality to flag harmful false and/or misleading information that violates Signatories policies or terms of service.
We signed up to the following measures of this commitment
Measure 23.1 Measure 23.2
In line with this commitment, did you deploy new implementation measures (e.g. changes to your terms of service, new tools, new policies, etc)?
If yes, list these implementation measures here
Do you plan to put further implementation measures in place in the next 6 months to substantially improve the maturity of the implementation of this commitment?
If yes, which further implementation measures do you plan to put in place in the next 6 months?
Measure 23.1
Relevant Signatories will develop or continue to make available on all their services and in all Member States' languages in which their services are provided a user-friendly functionality for users to flag harmful false and/or misleading information that violates Signatories' policies or terms of service. The functionality should lead to appropriate, proportionate and consistent follow-up actions, in full respect of the freedom of expression.
QRE 23.1.1
Relevant Signatories will report on the availability of flagging systems for their policies related to harmful false and/or misleading information across EU Member States and specify the different steps that are required to trigger the systems.
Measure 23.2
Relevant Signatories will take the necessary measures to ensure that this functionality is duly protected from human or machine-based abuse (e.g., the tactic of 'mass-flagging' to silence other voices).
QRE 23.2.1
Relevant Signatories will report on the general measures they take to ensure the integrity of their reporting and appeals systems, while steering clear of disclosing information that would help would-be abusers find and exploit vulnerabilities in their defences.
- Having a dedicated team to identify and mitigate the impact of sophisticated bad actors on YouTube at scale, while protecting the broader community;
- Partnering with Google’s Threat Intelligence Group (GTIG) and Trust & Safety Teams to monitor malicious actors around the globe, disable their accounts, and remove the content that they post (See QRE 16.1.1 and QRE 16.2.1);
- Legal protections, such as those found in the Digital Services Act;
- Educating users about Community Guidelines violations through its guided policy experience;
- Providing clear communication on appeals processes and notifications, and regular policy updates on its Help Centre; and,
- Investing in automated systems to provide efficient detection of content to be evaluated by human reviewers.
Commitment 24
Relevant Signatories commit to inform users whose content or accounts have been subject to enforcement actions (content/accounts labelled, demoted or otherwise enforced on) taken on the basis of violation of policies relevant to this section (as outlined in Measure 18.2), and provide them with the possibility to appeal against the enforcement action at issue and to handle complaints in a timely, diligent, transparent, and objective manner and to reverse the action without undue delay where the complaint is deemed to be founded.
We signed up to the following measures of this commitment
Measure 24.1
In line with this commitment, did you deploy new implementation measures (e.g. changes to your terms of service, new tools, new policies, etc)?
If yes, list these implementation measures here
Do you plan to put further implementation measures in place in the next 6 months to substantially improve the maturity of the implementation of this commitment?
If yes, which further implementation measures do you plan to put in place in the next 6 months?
Measure 24.1
Relevant Signatories commit to provide users with information on why particular content or accounts have been labelled, demoted, or otherwise enforced on, on the basis of violation of policies relevant to this section, as well as the basis for such enforcement action, and the possibility for them to appeal through a transparent mechanism.
QRE 24.1.1
Relevant Signatories will report on the availability of their notification and appeals systems across Member States and languages and provide details on the steps of the appeals procedure.
- Appeal a Community Guidelines strike;
- Appeal a Community Guidelines video removal;
- Appeal the age restriction of a video;
- Appeal playlist or thumbnail removals;
- Appeal a channel termination.
After a creator submits an appeal
- If YouTube finds that a user’s content followed YouTube’s Community Guidelines, YouTube will reinstate it and remove the strike from their channel. If a user appeals a warning and the appeal is granted, the next offence will be a warning.
- If YouTube finds that a user’s content followed YouTube’s Community Guidelines, but is not appropriate for all audiences, YouTube will apply an age-restriction. If it is a video, it will not be visible to users who are signed out, are under 18 years of age, or have Restricted Mode turned on. If it is a custom thumbnail, it will be removed.
- If YouTube finds that a user’s content was in violation of YouTube’s Community Guidelines, the strike will stay and the video will remain removed from the site. There is no additional penalty for appeals that are rejected.
For more information about YouTube’s median time needed to action a complaint, please see the latest VLOSE/VLOP Transparency Report under the European Union Digital Services Act (EU DSA).
SLI 24.1.1
Relevant Signatories provide information on the number and nature of enforcement actions for policies described in response to Measure 18.2, the numbers of such actions that were subsequently appealed, the results of these appeals, information and, to the extent possible, metrics providing insight into the duration or effectiveness of the appeals process, and publish this information on the Transparency Centre.
| Country | Number of videos removed that were subsequently appealed | Number of videos removed that were then reinstated following a creator's appeal |
|---|---|---|
| Austria | 22 | 5 |
| Belgium | 12 | 4 |
| Bulgaria | 11 | 0 |
| Croatia | 0 | 0 |
| Cyprus | 9 | 1 |
| Czech Republic | 12 | 5 |
| Denmark | 1 | 1 |
| Estonia | 14 | 1 |
| Finland | 9 | 0 |
| France | 98 | 38 |
| Germany | 202 | 54 |
| Greece | 8 | 1 |
| Hungary | 6 | 2 |
| Ireland | 40 | 13 |
| Italy | 60 | 39 |
| Latvia | 11 | 2 |
| Lithuania | 4 | 0 |
| Luxembourg | 1 | 0 |
| Malta | 0 | 0 |
| Netherlands | 55 | 22 |
| Poland | 51 | 13 |
| Portugal | 11 | 2 |
| Romania | 19 | 9 |
| Slovakia | 4 | 3 |
| Slovenia | 5 | 2 |
| Spain | 133 | 36 |
| Sweden | 10 | 2 |
| Iceland | 0 | 0 |
| Liechtenstein | 0 | 0 |
| Norway | 7 | 5 |
| Total EU | 808 | 255 |
| Total EEA | 815 | 260 |
Empowering Researchers
Commitment 26
Relevant Signatories commit to provide access, wherever safe and practicable, to continuous, real-time or near real-time, searchable stable access to non-personal data and anonymised, aggregated, or manifestly-made public data for research purposes on Disinformation through automated means such as APIs or other open and accessible technical solutions allowing the analysis of said data.
We signed up to the following measures of this commitment
Measure 26.1 Measure 26.2 Measure 26.3
In line with this commitment, did you deploy new implementation measures (e.g. changes to your terms of service, new tools, new policies, etc)?
If yes, list these implementation measures here
Do you plan to put further implementation measures in place in the next 6 months to substantially improve the maturity of the implementation of this commitment?
If yes, which further implementation measures do you plan to put in place in the next 6 months?
Measure 26.1
Relevant Signatories will provide public access to non-personal data and anonymised, aggregated or manifestly-made public data pertinent to undertaking research on Disinformation on their services, such as engagement and impressions (views) of content hosted by their services, with reasonable safeguards to address risks of abuse (e.g. API policies prohibiting malicious or commercial uses).
QRE 26.1.1
Relevant Signatories will describe the tools and processes in place to provide public access to non-personal data and anonymised, aggregated and manifestly-made public data pertinent to undertaking research on Disinformation, as well as the safeguards in place to address risks of abuse.
QRE 26.1.2
Relevant Signatories will publish information related to data points available via Measure 26.1, as well as details regarding the technical protocols to be used to access these data points, in the relevant help centre. This information should also be reachable from the Transparency Centre. At minimum, this information will include definitions of the data points available, technical and methodological information about how they were created, and information about the representativeness of the data.
- Search: Access to an API for limited scraping with a budget for quota;
- YouTube: Permission for scraping limited to metadata.
- Paid product placements:
  - Videos about a product or service made because there is a connection between the creator and the maker of the product or service;
  - Videos created for a company or business in exchange for compensation or free-of-charge products/services;
  - Videos where that company or business’s brand, message, or product is included directly in the content and the company has given the creator money or free-of-charge products to make the video.
- Endorsements: Videos created for an advertiser or marketer that contain a message reflecting the opinions, beliefs, or experiences of the creator.
- Sponsorships: Videos that have been financed in whole or in part by a company without integrating the brand, message, or product directly into the content. Sponsorships generally promote the brand, message, or product of the third party.
SLI 26.1.1
Relevant Signatories will provide quantitative information on the uptake of the tools and processes described in Measure 26.1, such as number of users.
- Researchers accessing the Researcher Program API during the reporting period may have been approved before the reporting period. There can be more than one researcher per application.
| Country | Number of unique researchers accessing the YouTube Researcher API |
|---|---|
| Austria | 2 |
| Belgium | 2 |
| Bulgaria | 0 |
| Croatia | 0 |
| Cyprus | 0 |
| Czech Republic | 1 |
| Denmark | 2 |
| Estonia | 0 |
| Finland | 2 |
| France | 5 |
| Germany | 16 |
| Greece | 0 |
| Hungary | 0 |
| Ireland | 0 |
| Italy | 8 |
| Latvia | 0 |
| Lithuania | 0 |
| Luxembourg | 0 |
| Malta | 0 |
| Netherlands | 4 |
| Poland | 0 |
| Portugal | 0 |
| Romania | 1 |
| Slovakia | 0 |
| Slovenia | 0 |
| Spain | 26 |
| Sweden | 0 |
| Iceland | 0 |
| Liechtenstein | 0 |
| Norway | 0 |
| Total EU | 69 |
| Total EEA | 69 |
Measure 26.2
Relevant Signatories will provide real-time or near real-time, machine-readable access to non-personal data and anonymised, aggregated or manifestly-made public data on their service for research purposes, such as accounts belonging to public figures such as elected officials, news outlets and government accounts, subject to an application process which is not overly cumbersome.
QRE 26.2.1
Relevant Signatories will describe the tools and processes in place to provide real-time or near real-time access to non-personal data and anonymised, aggregated and manifestly-made public data for research purposes as described in Measure 26.2.
QRE 26.2.2
Relevant Signatories will describe the scope of manifestly-made public data as applicable to their services.
QRE 26.2.3
Relevant Signatories will describe the application process in place in order to gain access to the non-personal data and anonymised, aggregated and manifestly-made public data described in Measure 26.2.
- Review and confirm the applicant’s eligibility;
- Submit an application, which requires a Google account;
- If approved, the applicant gains permission to access public data relevant to their research.
- YouTube verifies the applicant is an academic researcher affiliated with an accredited, higher-learning institution;
- The Researcher creates an API project in the Google Cloud Console and enables the relevant YouTube APIs. They can learn more by visiting the enabled APIs page;
- The Researcher applies with their institutional email (e.g. with a .edu suffix), includes as much detail as possible, and confirms that all of their information is accurate.
SLI 26.2.1
Relevant Signatories will provide meaningful metrics on the uptake, swiftness, and acceptance level of the tools and processes in Measure 26.2, such as: Number of monthly users (or users over a sample representative timeframe), Number of applications received, rejected, and accepted (over a reporting period or a sample representative timeframe), Average response time (over a reporting period or a sample representative timeframe).
- Cells with '0' indicate that no applications from that country were received, approved, rejected, or under review during the reporting period.
- Median application resolution time is the median number of days from application creation to application resolution, which may include back-and-forth communication with the applicant. This metric does not reflect YouTube’s first response to the applicant.
| Country | Applications Received | Applications Approved | Applications Rejected | Applications under Review | Median application resolution time |
|---|---|---|---|---|---|
| Austria | 2 | 2 | 0 | 0 | - |
| Belgium | 0 | 0 | 0 | 0 | - |
| Bulgaria | 0 | 0 | 0 | 0 | - |
| Croatia | 0 | 0 | 0 | 0 | - |
| Cyprus | 0 | 0 | 0 | 0 | - |
| Czech Republic | 3 | 2 | 1 | 0 | - |
| Denmark | 2 | 2 | 0 | 0 | - |
| Estonia | 0 | 0 | 0 | 0 | - |
| Finland | 3 | 3 | 0 | 0 | - |
| France | 0 | 0 | 0 | 0 | - |
| Germany | 10 | 8 | 1 | 1 | - |
| Greece | 0 | 0 | 0 | 0 | - |
| Hungary | 0 | 0 | 0 | 0 | - |
| Ireland | 1 | 0 | 1 | 0 | - |
| Italy | 2 | 2 | 0 | 0 | - |
| Latvia | 0 | 0 | 0 | 0 | - |
| Lithuania | 0 | 0 | 0 | 0 | - |
| Luxembourg | 0 | 0 | 0 | 0 | - |
| Malta | 0 | 0 | 0 | 0 | - |
| Netherlands | 4 | 4 | 0 | 0 | - |
| Poland | 1 | 0 | 1 | 0 | - |
| Portugal | 0 | 0 | 0 | 0 | - |
| Romania | 0 | 0 | 0 | 0 | - |
| Slovakia | 0 | 0 | 0 | 0 | - |
| Slovenia | 0 | 0 | 0 | 0 | - |
| Spain | 50 | 31 | 18 | 1 | - |
| Sweden | 1 | 1 | 0 | 0 | - |
| Iceland | 0 | 0 | 0 | 0 | - |
| Liechtenstein | 0 | 0 | 0 | 0 | - |
| Norway | 0 | 0 | 0 | 0 | - |
| Total EU | 79 | 55 | 22 | 2 | 10.0 days |
| Total EEA | 79 | 55 | 22 | 2 | 10.0 days |
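As a worked illustration of the median-application-resolution-time metric defined in the notes above, the sketch below computes the median of per-application day counts. The (created, resolved) date pairs are invented for demonstration and are not taken from this report.

```python
from datetime import date
from statistics import median

# Hypothetical (application created, application resolved) date pairs
applications = [
    (date(2025, 9, 1), date(2025, 9, 8)),    # 7 days to resolve
    (date(2025, 9, 3), date(2025, 9, 13)),   # 10 days to resolve
    (date(2025, 9, 10), date(2025, 9, 24)),  # 14 days to resolve
]

# Days from creation to resolution for each application
resolution_days = [(resolved - created).days for created, resolved in applications]
print(median(resolution_days))  # 10
```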
Measure 26.3
Relevant Signatories will implement procedures for reporting the malfunctioning of access systems and for restoring access and repairing faulty functionalities in a reasonable time.
QRE 26.3.1
Relevant Signatories will describe the reporting procedures in place to comply with Measure 26.3 and provide information about their malfunction response procedure, as well as about malfunctions that would have prevented the use of the systems described above during the reporting period and how long it took to remediate them.
Commitment 28
COOPERATION WITH RESEARCHERS Relevant Signatories commit to support good faith research into Disinformation that involves their services.
We signed up to the following measures of this commitment
Measure 28.1 Measure 28.2 Measure 28.3 Measure 28.4
In line with this commitment, did you deploy new implementation measures (e.g. changes to your terms of service, new tools, new policies, etc)?
If yes, list these implementation measures here
- In October 2025, Google announced the recipients of the 2025 Google Academic Research Awards (GARA), committing $5.6 million to support 56 projects led by 84 researchers across 12 countries. Each recipient received up to $100,000 in funding and is paired with a Google research sponsor.
Do you plan to put further implementation measures in place in the next 6 months to substantially improve the maturity of the implementation of this commitment?
If yes, which further implementation measures do you plan to put in place in the next 6 months?
Measure 28.1
Relevant Signatories will ensure they have the appropriate human resources in place in order to facilitate research, and should set-up and maintain an open dialogue with researchers to keep track of the types of data that are likely to be in demand for research and to help researchers find relevant contact points in their organisations.
QRE 28.1.1
Relevant Signatories will describe the resources and processes they deploy to facilitate research and engage with the research community, including e.g. dedicated teams, tools, help centres, programs, or events.
Eligible EU researchers can apply for access to publicly available data across some of Google’s products, including Search and YouTube, through the Google Researcher Program. Search and YouTube will provide eligible researchers (including non-academics who meet predefined eligibility criteria) with access to limited metadata scraping for public data. This program aims to enhance the public’s understanding of Google’s services and their impact.
- YouTube provides a contact email alias to researchers who have been granted access to the program;
- YouTube API Code Samples at GitHub.
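For orientation, a minimal sketch of how an approved researcher might request public video metadata via the documented YouTube Data API v3 `videos.list` endpoint. No request is sent here; `YOUR_API_KEY` is a placeholder, the helper name is our own, and actual access and quota are governed by the program terms described above.

```python
import urllib.parse

API_ROOT = "https://www.googleapis.com/youtube/v3"

def video_metadata_url(video_ids, api_key, parts=("snippet", "statistics")):
    """Build a YouTube Data API v3 videos.list URL for public video metadata."""
    query = urllib.parse.urlencode({
        "part": ",".join(parts),    # which public metadata fields to return
        "id": ",".join(video_ids),  # one or more video IDs, comma-separated
        "key": api_key,             # API key from the researcher's Cloud Console project
    })
    return f"{API_ROOT}/videos?{query}"

# The resulting URL would be passed to an HTTP client (not done here):
url = video_metadata_url(["dQw4w9WgXcQ"], "YOUR_API_KEY")
print(url)
```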
Measure 28.2
Relevant Signatories will be transparent on the data types they currently make available to researchers across Europe.
QRE 28.2.1
Relevant Signatories will describe what data types European researchers can currently access via their APIs or via dedicated teams, tools, help centres, programs, or events.
Measure 28.3
Relevant Signatories will not prohibit or discourage genuinely and demonstrably public interest good faith research into Disinformation on their platforms, and will not take adversarial action against researcher users or accounts that undertake or participate in good-faith research into Disinformation.
QRE 28.3.1
Relevant Signatories will collaborate with EDMO to run an annual consultation of European researchers to assess whether they have experienced adversarial actions or are otherwise prohibited or discouraged from running such research.
Measure 28.4
As part of the cooperation framework between the Signatories and the European research community, relevant Signatories will, with the assistance of the EDMO, make funds available for research on Disinformation, for researchers to independently manage and to define scientific priorities and transparent allocation procedures based on scientific merit.
QRE 28.4.1
Relevant Signatories will disclose the resources made available for the purposes of Measure 28.4 and procedures put in place to ensure the resources are independently managed.
Crisis and Elections Response
Elections 2025
[Note: Signatories are requested to provide information relevant to their particular response to the threats and challenges they observed on their service(s). They ensure that the information below provides an accurate and complete report of their relevant actions. As operational responses to crisis/election situations can vary from service to service, an absence of information should not be considered a priori a shortfall in the way a particular service has responded. Impact metrics are accurate to the best of signatories’ abilities to measure them].
Threats observed or anticipated
Mitigations in place
- Enforcing Google policies and using AI models to fight abuse at scale: Google has long-standing policies that inform how it approaches areas like manipulated media, hate and harassment, and incitement to violence, along with policies around demonstrably false claims that could undermine democratic processes, for example in YouTube’s Community Guidelines. To help enforce Google policies, Google’s AI models are enhancing its abuse-fighting efforts. With recent advances in Google’s Large Language Models (LLMs), Google is building faster and more adaptable enforcement systems that enable it to remain nimble and take action even more quickly when new threats emerge.
- Working with the wider ecosystem: Since Google’s inaugural commitment of €25 million to help launch the European Media & Information Fund, an effort designed to strengthen media literacy and information quality across Europe, 121 projects have been funded across 28 countries so far.
- Ads disclosures: Google expanded its Political Content Policies in November 2023 to require advertisers to disclose when their election ads include synthetic content that inauthentically depicts real or realistic-looking people or events. Google’s ads policies already prohibit the use of manipulated media to mislead people, like deep fakes or doctored content. In September 2025, Google updated the Political Content Policies restricting political advertising in the European Union.
- Content labels on YouTube: YouTube’s Misinformation Policies prohibit technically manipulated content that misleads users and could pose a serious risk of egregious harm. YouTube also requires creators to disclose when they have created realistic altered or synthetic content, and displays a label indicating when the content being watched is synthetic. For sensitive content, including election-related content, that contains realistic altered or synthetic material, the label appears on the video itself and in the video description.
- Provide users with additional context: 'About This Image' in Search helps people assess the credibility and context of images found online. We continue looking at ways to integrate integrity signals more directly throughout the Search experience, with a view to enhancing user experience and providing users with the context needed to make informed decisions about the information they see online. For example, we are looking at embedding image provenance into Google Search features in order to enable users to check image provenance more seamlessly.
- Industry collaboration: Google is a member of the Coalition for Content Provenance and Authenticity (C2PA) and supports its standard, a cross-industry effort to help provide more transparency and context for people on AI-generated content.
- High-quality information on YouTube: For news and information related to elections, YouTube’s systems prominently surface high-quality content on the YouTube homepage, in search results and in the ‘Up Next’ panel. YouTube also displays information panels at the top of search results and below videos to provide additional context. For example, YouTube may surface various election information panels above search results or on videos related to election candidates, parties or voting.
- Ongoing transparency on Election Ads: Starting September 2025, Google restricted political advertising in the European Union under new regulations. Since mid-August 2025, advertisers have been asked to declare if they intend to run political advertising. EU Election Ads previously shown in the Political Ads Transparency Report will remain publicly accessible in the Ads Transparency Centre, subject to retention policies.
- Security tools for campaign and election teams: Google offers free services like its Advanced Protection Program — Google’s strongest set of cyber protections — and Project Shield, which provides unlimited protection against Distributed Denial of Service (DDoS) attacks. Google also partners with Possible, The International Foundation for Electoral Systems (IFES) and Deutschland sicher im Netz (DSIN) to scale account security training and to provide security tools including Titan Security Keys, which defend against phishing attacks and prevent bad actors from accessing users’ Google Accounts.
- Tackling coordinated influence operations: Google’s Threat Intelligence Group (GTIG) helps identify, monitor and tackle emerging threats, ranging from coordinated influence operations to cyber espionage campaigns against high-risk entities. Google reports on actions taken in its quarterly bulletin, and meets regularly with government officials and others in the industry to share threat information and suspected election interference. Mandiant also helps organisations build holistic election security programs and harden their defences with comprehensive solutions, services and tools, including proactive exposure management, proactive intelligence threat hunts, cyber crisis communication services and threat intelligence tracking of information operations. A recent publication from the team gives an overview of the global election cybersecurity landscape, designed to help election organisations tackle a range of potential threats.
Policies and Terms and Conditions
Outline any changes to your policies
Policy - 50.1.1
Changes (such as newly introduced policies, edits, adaptation in scope or implementation) - 50.1.2
Rationale - 50.1.3
Integrity of Services
Outline approaches pertinent to this chapter, highlighting similarities/commonalities and differences with regular enforcement.
Specific Action applied - 50.4.1
Description of intervention - 50.4.2
Indication of impact - 50.4.3
Specific Action applied - 50.4.4
Description of intervention - 50.4.5
Indication of impact - 50.4.6
Empowering Users
Outline approaches pertinent to this chapter, highlighting similarities/commonalities and differences with regular enforcement.
Specific Action applied - 50.5.1
Description of intervention - 50.5.2
YouTube’s Top News and Breaking News shelves surface at the top of search results, prominently featuring content from high-quality news sources, which may include information about EU elections.
Indication of impact - 50.5.3
Specific Action applied - 50.5.4
Description of intervention - 50.5.5
Indication of impact - 50.5.6
Empowering the Research Community
Outline approaches pertinent to this chapter, highlighting similarities/commonalities and differences with regular enforcement.
Specific Action applied - 50.6.1
Description of intervention - 50.6.2
Indication of impact - 50.6.3
Crisis 2025
[Note: Signatories are requested to provide information relevant to their particular response to the threats and challenges they observed on their service(s). They ensure that the information below provides an accurate and complete report of their relevant actions. As operational responses to crisis/election situations can vary from service to service, an absence of information should not be considered a priori a shortfall in the way a particular service has responded. Impact metrics are accurate to the best of signatories’ abilities to measure them].
Threats observed or anticipated
War in Ukraine
- Continued online services manipulation and coordinated influence operations;
- Advertising and monetisation linked to state-backed disinformation about Russia and Ukraine;
- Threats to security and protection of digital infrastructure.
Israel-Gaza conflict
- Humanitarian and relief efforts;
- Platforms and partnerships to protect our services from coordinated influence operations, hate speech, and graphic and terrorist content.
Mitigations in place
War in Ukraine
- Elevate access to high-quality information across Google services;
- Protect Google users from harmful disinformation;
- Continue to monitor and disrupt cyber threats;
- Explore ways to provide assistance to support the affected areas more broadly.
Israel-Gaza conflict
- Natal (Israel Trauma and Resiliency Centre): In the early days of the war, calls to Natal’s support hotline went from around 300 a day to 8,000 a day. With Google’s funding, they were able to scale their support to patients by 450%, including multidisciplinary treatment and mental and psychosocial support to direct and indirect victims of trauma due to terror and war in Israel.
- [See two-year detailed report] After more than two years and thanks to Google’s support, International Medical Corps continues to deliver lifesaving health and humanitarian services across Gaza. In addition to the two field hospitals it has been operating in Deir al-Balah and Al Zawaida, it opened a third field hospital in Gaza City in November 2025, significantly expanding access to critical care for civilians in the north. As of late January 2026, International Medical Corps has:
- Provided 533,119 outpatient consultations;
- Performed more than 19,771 surgeries;
- Supported 9,238 deliveries, including 1,930 caesarean sections;
- Screened 154,473 children under 5 and pregnant and lactating women for malnutrition; and much more.
Policies and Terms and Conditions
Outline any changes to your policies
Policy - 51.1.1
Changes (such as newly introduced policies, edits, adaptation in scope or implementation) - 51.1.2
Rationale - 51.1.3
Policy - 51.1.4
Changes (such as newly introduced policies, edits, adaptation in scope or implementation) - 51.1.5
Rationale - 51.1.6
Integrity of Services
Outline approaches pertinent to this chapter, highlighting similarities/commonalities and differences with regular enforcement.
Specific Action applied - 51.4.1
Description of intervention - 51.4.2
Indication of impact - 51.4.3
- Removed over 160,000 videos and over 12,000 channels.
- Blocked over 5.9 million videos and over 1,000 channels.
Since June 2025, YouTube’s enforcement continues within its standard enforcement systems, which detect violations of its content policies, including those pertaining to misinformation, hate speech, and graphic violence. This data can be found in the Removal section of YouTube's Community Guidelines Transparency Report.
Specific Action applied - 51.4.4
Description of intervention - 51.4.5
- Per YouTube’s Hate Speech Policy, content that promotes violence or hatred against groups based on their ethnicity, nationality, race or religion is not allowed on YouTube. This includes Jewish, Muslim, and other religious or ethnic communities.
- Per YouTube’s Violent Extremist Policy, content that praises, promotes or in any way aids violent criminal organisations is prohibited. Additionally, content produced by designated terrorist organisations, such as a Foreign Terrorist Organisation (U.S.) or an organisation identified by the United Nations, is not allowed on YouTube. This includes content produced by Hamas and Palestinian Islamic Jihad (PIJ).
- In addition, YouTube has a dedicated button underneath every video on YouTube to flag content with the option to mark it as 'promotes terrorism.'
- Per YouTube’s Violent or Graphic Content Policies, YouTube prohibits violent or gory content intended to shock or disgust viewers. Additionally, content encouraging others to commit violent acts against individuals or a defined group of people, including the Jewish, Muslim and other religious communities, is not allowed on YouTube.
- Per YouTube’s Harassment Policies, content that promotes harmful conspiracy theories or targets individuals based on their protected group status is not allowed on YouTube. Additionally, content that realistically simulates deceased minors or victims of deadly or well-documented major violent events describing their death or violence experienced, is not allowed on YouTube.
- Per YouTube’s Misinformation Policies, content containing certain types of misinformation that can cause real-world harm, including certain types of misattributed content, is not allowed on YouTube.
Indication of impact - 51.4.6
- Removed over 140,000 videos and over 6,000 channels;
- Removed over 500 million comments.
Since June 2025, YouTube’s enforcement continues within its standard enforcement systems, which detect violations of its content policies, including those pertaining to misinformation, hate speech, and graphic violence. This data can be found in the Removal section of YouTube's Community Guidelines Transparency Report.
Empowering Users
Outline approaches pertinent to this chapter, highlighting similarities/commonalities and differences with regular enforcement.
Specific Action applied - 51.5.1
Description of intervention - 51.5.2
Indication of impact - 51.5.3
Specific Action applied - 51.5.4
Description of intervention - 51.5.5
Indication of impact - 51.5.6
Specific Action applied - 51.5.7
Description of intervention - 51.5.8
Indication of impact - 51.5.9
Specific Action applied - 51.5.10
Description of intervention - 51.5.11
YouTube’s Top News and Breaking News shelves surface at the top of search results related to the attacks in Israel and on the homepage, prominently featuring content from high-quality news sources.
Indication of impact - 51.5.12
Empowering the Research Community
Outline approaches pertinent to this chapter, highlighting similarities/commonalities and differences with regular enforcement.