TikTok

Report September 2025

TikTok’s mission is to inspire creativity and bring joy. With a global community of more than a billion users, it’s natural for people to hold different opinions. That’s why we focus on a shared set of facts when it comes to issues that affect people’s safety. A safe, authentic, and trustworthy experience is essential to achieving our goals. Transparency plays a key role in building that trust, allowing online communities and society to assess how TikTok meets its regulatory obligations. As a signatory to the Code of Conduct on Disinformation (the Code), TikTok is committed to sharing clear insights into the actions we take.

TikTok takes disinformation extremely seriously. We are committed to preventing its spread, promoting authoritative information, and supporting media literacy initiatives that strengthen community resilience.

We prioritise proactive content moderation, with the vast majority of violative content removed before it is viewed or reported. In H1 2025, more than 97% of videos violating our Integrity and Authenticity policies were removed proactively worldwide.

We continue to address emerging behaviours and risks through our Digital Services Act (DSA) compliance programme, under which the Code has operated since July 2025. This includes a range of measures to protect users, detailed on our European Online Safety Hub. Our actions under the Code demonstrate TikTok’s strong commitment to combating disinformation while ensuring transparency and accountability to our community and regulators.

Our full executive summary can be read by downloading the PDF version of our report.

Commitment 24
Relevant Signatories commit to inform users whose content or accounts has been subject to enforcement actions (content/accounts labelled, demoted or otherwise enforced on) taken on the basis of violation of policies relevant to this section (as outlined in Measure 18.2), and provide them with the possibility to appeal against the enforcement action at issue and to handle complaints in a timely, diligent, transparent, and objective manner and to reverse the action without undue delay where the complaint is deemed to be founded.
We signed up to the following measures of this commitment:
Measure 24.1
In line with this commitment, did you deploy new implementation measures (e.g. changes to your terms of service, new tools, new policies, etc)?
No
If yes, list these implementation measures here
N/A
Do you plan to put further implementation measures in place in the next 6 months to substantially improve the maturity of the implementation of this commitment?
N/A
If yes, which further implementation measures do you plan to put in place in the next 6 months?
N/A
Measure 24.1
Relevant Signatories commit to provide users with information on why particular content or accounts have been labelled, demoted, or otherwise enforced on, on the basis of violation of policies relevant to this section, as well as the basis for such enforcement action, and the possibility for them to appeal through a transparent mechanism.


QRE 24.1.1
Relevant Signatories will report on the availability of their notification and appeals systems across Member States and languages and provide details on the steps of the appeals procedure.
Users in all EU member states receive an in-app notification, in their relevant local language, where any of the following actions is taken:
  • removal of, or restriction of access to, their content;
  • a ban of their account;
  • restriction of their access to a feature (such as LIVE); or
  • restriction of their ability to monetise. 

Such notifications are provided in near real time after action has been taken (i.e. generally within several seconds or up to a few minutes at most). 

Where we have taken any of these decisions, an in-app inbox notification sets out the violation deemed to have taken place, along with an option for users to “disagree” and submit an appeal. Users can submit appeals within 180 days of being notified of the decision they want to appeal. Further information, including how to appeal a decision, is set out here.

All such appeals are queued for review by our specialised human moderators to ensure that context is adequately taken into account in reaching a determination. Users can monitor the status and view the results of their appeal within their in-app inbox.

Users who do not agree with the outcome of their appeal can share further feedback with us using the in-app "report a problem" function. We continuously take this feedback into consideration in order to identify areas of improvement within the appeals process.

SLI 24.1.1
Relevant Signatories provide information on the number and nature of enforcement actions for policies described in response to Measure 18.2, the numbers of such actions that were subsequently appealed, the results of these appeals, information, and to the extent possible metrics, providing insight into the duration or effectiveness of processing of appeals process, and publish this information on the Transparency Centre.
Methodology of data measurement:

The number of appeals/overturns is attributed to the country in which the appealed/overturned video was posted. These figures relate only to our Misinformation, Civic and Election Integrity, and Edited Media and AIGC policies.
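The appeal success rate figures in the table below can be read as the share of appeals that resulted in the original removal decision being overturned, i.e. the number of overturns divided by the number of appeals, rounded to one decimal place. For example, Austria’s figures under the misinformation policy give:

$$\text{appeal success rate} = \frac{\text{overturned appeals}}{\text{appeals filed}} = \frac{422}{609} \approx 69.3\%$$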
For each country, the table lists, for each of the three policies (Misinformation; Civic and Election Integrity; Synthetic and Manipulated Media), the number of appeals of videos removed for violating that policy, the number of those appeals that were overturned, and the resulting appeal success rate.

Country | Misinformation: appeals | Misinformation: overturns | Misinformation: success rate | Civic and Election Integrity: appeals | Civic and Election Integrity: overturns | Civic and Election Integrity: success rate | Synthetic and Manipulated Media: appeals | Synthetic and Manipulated Media: overturns | Synthetic and Manipulated Media: success rate
Austria | 609 | 422 | 69.30% | 160 | 124 | 77.50% | 27 | 24 | 88.90%
Belgium | 809 | 674 | 83.30% | 246 | 196 | 79.70% | 55 | 48 | 87.30%
Bulgaria | 582 | 283 | 48.60% | 58 | 46 | 79.30% | 21 | 21 | 100.00%
Croatia | 91 | 55 | 60.40% | 14 | 11 | 78.60% | 7 | 2 | 28.60%
Cyprus | 92 | 59 | 64.10% | 20 | 15 | 75.00% | 17 | 11 | 64.70%
Czech Republic | 1,453 | 468 | 32.20% | 162 | 137 | 84.60% | 72 | 39 | 54.20%
Denmark | 311 | 226 | 72.70% | 102 | 84 | 82.40% | 40 | 32 | 80.00%
Estonia | 84 | 49 | 58.30% | 15 | 10 | 66.70% | 8 | 7 | 87.50%
Finland | 207 | 139 | 67.10% | 72 | 58 | 80.60% | 27 | 21 | 77.80%
France | 6,935 | 6,296 | 90.80% | 709 | 639 | 90.10% | 421 | 396 | 94.10%
Germany | 12,837 | 8,939 | 69.60% | 2,844 | 2,327 | 81.80% | 716 | 542 | 75.70%
Greece | 705 | 425 | 60.30% | 173 | 139 | 80.30% | 55 | 37 | 67.30%
Hungary | 228 | 131 | 57.50% | 133 | 102 | 76.70% | 6 | 4 | 66.70%
Ireland | 948 | 765 | 80.70% | 108 | 97 | 89.80% | 36 | 32 | 88.90%
Italy | 4,266 | 3,523 | 82.60% | 1,188 | 1,048 | 88.20% | 143 | 132 | 92.30%
Latvia | 110 | 77 | 70.00% | 20 | 13 | 65.00% | 42 | 19 | 45.20%
Lithuania | 101 | 84 | 83.20% | 16 | 15 | 93.80% | 22 | 14 | 63.60%
Luxembourg | 35 | 29 | 82.90% | 9 | 7 | 77.80% | 5 | 3 | 60.00%
Malta | 28 | 24 | 85.70% | 0 | 0 | 0.00% | 0 | 0 | 0.00%
Netherlands | 1,732 | 1,441 | 83.20% | 290 | 236 | 81.40% | 92 | 77 | 83.70%
Poland | 5,004 | 2,065 | 41.30% | 423 | 332 | 78.50% | 126 | 87 | 69.00%
Portugal | 600 | 393 | 65.50% | 154 | 129 | 83.80% | 18 | 14 | 77.80%
Romania | 5,175 | 1,539 | 29.70% | 1,066 | 855 | 80.20% | 158 | 78 | 49.40%
Slovakia | 569 | 140 | 24.60% | 20 | 17 | 85.00% | 27 | 19 | 70.40%
Slovenia | 96 | 48 | 50.00% | 7 | 6 | 85.70% | 12 | 10 | 83.30%
Spain | 3,231 | 2,844 | 88.00% | 464 | 416 | 89.70% | 143 | 130 | 90.90%
Sweden | 658 | 550 | 83.60% | 231 | 176 | 76.20% | 48 | 40 | 83.30%
Iceland | 13 | 11 | 84.60% | 4 | 4 | 100.00% | 2 | 2 | 100.00%
Liechtenstein | 2 | 2 | 100.00% | 0 | 0 | 0.00% | 0 | 0 | 0.00%
Norway | 278 | 228 | 82.00% | 80 | 68 | 85.00% | 32 | 28 | 87.50%
Total EU | 47,496 | 31,688 | 66.70% | 8,704 | 7,235 | 83.10% | 2,344 | 1,839 | 78.50%
Total EEA | 47,789 | 31,929 | 66.80% | 8,788 | 7,307 | 83.10% | 2,378 | 1,869 | 78.60%