TikTok

Report March 2026

TikTok’s mission is to inspire creativity and bring joy. With more than 200 million people across Europe coming to TikTok every month, including 178 million in the EU, it’s natural for people to hold different opinions. That’s why we focus on a shared set of facts when it comes to issues that affect people’s safety. A safe, authentic, and trustworthy experience is essential to achieving our goals. Transparency plays a key role in building that trust, allowing online communities and society to assess how TikTok meets its regulatory obligations. As a signatory to the Code of Conduct on Disinformation (the Code), TikTok is committed to sharing clear insights into the actions we take.

TikTok takes disinformation extremely seriously. We are committed to preventing its spread, promoting authoritative information, and supporting media literacy initiatives that strengthen community resilience.

We prioritise proactive content moderation, with the vast majority of violative content removed before it is reported. In H2 2025, more than 98% of videos violating our Integrity and Authenticity policies were removed proactively worldwide.

We continue to address emerging behaviours and risks through our Digital Services Act (DSA) compliance programme, under which the Code has operated since July 2025.

Our actions under the Code demonstrate TikTok’s strong commitment to combating disinformation while ensuring transparency and accountability to our community and regulators.

Please see the sections below for information about our work under specific commitments, or download the report as a PDF.


Commitment 24
Relevant Signatories commit to inform users whose content or accounts has been subject to enforcement actions (content/accounts labelled, demoted or otherwise enforced on) taken on the basis of violation of policies relevant to this section (as outlined in Measure 18.2), and provide them with the possibility to appeal against the enforcement action at issue and to handle complaints in a timely, diligent, transparent, and objective manner and to reverse the action without undue delay where the complaint is deemed to be founded.
We signed up to the following measures of this commitment
Measure 24.1
In line with this commitment, did you deploy new implementation measures (e.g. changes to your terms of service, new tools, new policies, etc)?
No
If yes, list these implementation measures here
N/A
Do you plan to put further implementation measures in place in the next 6 months to substantially improve the maturity of the implementation of this commitment?
N/A
If yes, which further implementation measures do you plan to put in place in the next 6 months?
N/A
Measure 24.1
Relevant Signatories commit to provide users with information on why particular content or accounts have been labelled, demoted, or otherwise enforced on, on the basis of violation of policies relevant to this section, as well as the basis for such enforcement action, and the possibility for them to appeal through a transparent mechanism.


QRE 24.1.1
Relevant Signatories will report on the availability of their notification and appeals systems across Member States and languages and provide details on the steps of the appeals procedure.
Users in all EU member states are notified by an in-app notification in their relevant local language where the following action is taken:
  • removal of, or other restriction of access to, their content;
  • a ban of the account;
  • restriction of their access to a feature (such as LIVE); or
  • restriction of their ability to monetise. 

Such notifications are provided in near real time after action has been taken (i.e. generally within seconds, or at most a few minutes). 

Where we have taken any of these decisions, an in-app inbox notification sets out the violation deemed to have taken place, along with an option for users to “disagree” and submit an appeal. Users can submit appeals within 180 days of being notified of the decision they want to appeal. Further information, including how to appeal a decision, is set out here.

All appeals are queued for review by our specialised human moderators, ensuring that context is adequately taken into account in reaching a determination. Users can monitor the status and view the results of their appeal within their in-app inbox. 

Users who disagree with the result of their appeal can share feedback with us via the in-app "report a problem" function. We continuously take this feedback into consideration in order to identify areas of improvement within the appeals process.

SLI 24.1.1
Relevant Signatories provide information on the number and nature of enforcement actions for policies described in response to Measure 18.2, the numbers of such actions that were subsequently appealed, the results of these appeals, information, and to the extent possible metrics, providing insight into the duration or effectiveness of processing of appeals process, and publish this information on the Transparency Centre.
Methodology of data measurement:

The number of appeals/overturns is based on the country in which the appealed/overturned video was posted. These numbers relate only to our Misinformation, Civic and Election Integrity, and Edited Media and AI-Generated Content (AIGC) policies.

In the table below, "appeals" is the number of appeals of videos removed for violation of the policy, "overturns" is the number of those appeals that were upheld, and "success rate" is the share of appealed removals that were overturned.

| Country | Misinformation: appeals | Misinformation: overturns | Misinformation: success rate | Civic & Election Integrity: appeals | Civic & Election Integrity: overturns | Civic & Election Integrity: success rate | Edited Media & AIGC: appeals | Edited Media & AIGC: overturns | Edited Media & AIGC: success rate |
|---|---|---|---|---|---|---|---|---|---|
| Austria | 885 | 683 | 77.20% | 152 | 128 | 84.20% | 645 | 574 | 89.00% |
| Belgium | 1,047 | 877 | 83.80% | 171 | 145 | 84.80% | 590 | 521 | 88.30% |
| Bulgaria | 1,375 | 983 | 71.50% | 71 | 60 | 84.50% | 333 | 278 | 83.50% |
| Croatia | 133 | 106 | 79.70% | 13 | 12 | 92.30% | 256 | 229 | 89.50% |
| Cyprus | 155 | 122 | 78.70% | 13 | 11 | 84.60% | 246 | 213 | 86.60% |
| Czech Republic | 1,119 | 936 | 83.60% | 110 | 96 | 87.30% | 479 | 418 | 87.30% |
| Denmark | 546 | 441 | 80.80% | 88 | 69 | 78.40% | 567 | 540 | 95.20% |
| Estonia | 181 | 137 | 75.70% | 23 | 15 | 65.20% | 362 | 308 | 85.10% |
| Finland | 424 | 346 | 81.60% | 61 | 47 | 77.00% | 429 | 388 | 90.40% |
| France | 7,543 | 6,307 | 83.60% | 362 | 309 | 85.40% | 3,308 | 2,905 | 87.80% |
| Germany | 14,569 | 10,635 | 73.00% | 1,475 | 1,191 | 80.70% | 10,927 | 9,547 | 87.40% |
| Greece | 1,206 | 1,022 | 84.70% | 168 | 150 | 89.30% | 461 | 395 | 85.70% |
| Hungary | 362 | 285 | 78.70% | 189 | 163 | 86.20% | 242 | 205 | 84.70% |
| Ireland | 856 | 716 | 83.60% | 85 | 77 | 90.60% | 416 | 388 | 93.30% |
| Italy | 4,912 | 3,583 | 72.90% | 367 | 309 | 84.20% | 2,796 | 2,399 | 85.80% |
| Latvia | 253 | 227 | 89.70% | 23 | 21 | 91.30% | 544 | 487 | 89.50% |
| Lithuania | 275 | 201 | 73.10% | 28 | 25 | 89.30% | 470 | 411 | 87.40% |
| Luxembourg | 60 | 50 | 83.30% | 7 | 6 | 85.70% | 86 | 81 | 94.20% |
| Malta | 33 | 25 | 75.80% | 6 | 6 | 100.00% | 62 | 54 | 87.10% |
| Netherlands | 4,467 | 3,223 | 72.20% | 330 | 246 | 74.50% | 4,138 | 3,671 | 88.70% |
| Poland | 5,049 | 3,488 | 69.10% | 284 | 238 | 83.80% | 1,910 | 1,563 | 81.80% |
| Portugal | 823 | 648 | 78.70% | 76 | 58 | 76.30% | 291 | 253 | 86.90% |
| Romania | 8,645 | 4,885 | 56.50% | 1,219 | 777 | 63.70% | 2,528 | 2,208 | 87.30% |
| Slovakia | 358 | 287 | 80.20% | 19 | 15 | 78.90% | 337 | 290 | 86.10% |
| Slovenia | 125 | 104 | 83.20% | 11 | 5 | 45.50% | 168 | 148 | 88.10% |
| Spain | 5,671 | 4,159 | 73.30% | 374 | 328 | 87.70% | 3,988 | 3,608 | 90.50% |
| Sweden | 1,039 | 831 | 80.00% | 163 | 130 | 79.80% | 830 | 701 | 84.50% |
| Iceland | 36 | 33 | 91.70% | 7 | 6 | 85.70% | 34 | 33 | 97.10% |
| Liechtenstein | 4 | 2 | 50.00% | 0 | 0 | 0.00% | 3 | 3 | 100.00% |
| Norway | 547 | 429 | 78.40% | 105 | 87 | 82.90% | 527 | 477 | 90.50% |
| Total EU | 62,111 | 45,307 | 72.90% | 5,888 | 4,637 | 78.80% | 37,409 | 32,783 | 87.60% |
| Total EEA | 62,698 | 45,771 | 73.00% | 6,000 | 4,730 | 78.80% | 37,973 | 33,296 | 87.70% |
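The appeal success rates above follow directly from the appeal and overturn counts (success rate = overturns ÷ appeals). A minimal sketch of that derivation, using a hypothetical `success_rate` helper and a small sample of the misinformation-policy figures from the table:

```python
# Illustrative sketch only: recompute appeal success rates from the
# misinformation-policy columns of the table above.
# success rate = overturns / appeals, expressed as a percentage.
appeals_data = {
    # country: (appeals, overturns)
    "Austria": (885, 683),
    "Germany": (14_569, 10_635),
    "Total EU": (62_111, 45_307),
}

def success_rate(appeals: int, overturns: int) -> float:
    """Share of appealed removals that were overturned, rounded to one decimal."""
    if appeals == 0:
        # Convention used in the table (e.g. Liechtenstein, 0 appeals -> 0.00%).
        return 0.0
    return round(100 * overturns / appeals, 1)

for country, (appeals, overturns) in appeals_data.items():
    print(f"{country}: {success_rate(appeals, overturns)}%")
# Austria: 77.2%, Germany: 73.0%, Total EU: 72.9% -- matching the table.
```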