TikTok

Report September 2025

TikTok’s mission is to inspire creativity and bring joy. With a global community of more than a billion users, it’s natural for people to hold different opinions. That’s why we focus on a shared set of facts when it comes to issues that affect people’s safety. A safe, authentic, and trustworthy experience is essential to achieving our goals. Transparency plays a key role in building that trust, allowing online communities and society to assess how TikTok meets its regulatory obligations. As a signatory to the Code of Conduct on Disinformation (the Code), TikTok is committed to sharing clear insights into the actions we take.

TikTok takes disinformation extremely seriously. We are committed to preventing its spread, promoting authoritative information, and supporting media literacy initiatives that strengthen community resilience.

We prioritise proactive content moderation, with the vast majority of violative content removed before it is viewed or reported. In H1 2025, more than 97% of videos violating our Integrity and Authenticity policies were removed proactively worldwide.

We continue to address emerging behaviours and risks through our Digital Services Act (DSA) compliance programme, under which the Code has operated since July 2025. This includes a range of measures to protect users, detailed on our European Online Safety Hub. Our actions under the Code demonstrate TikTok’s strong commitment to combating disinformation while ensuring transparency and accountability to our community and regulators.

Our full executive summary can be read by downloading our full report.

Commitment 23
Relevant Signatories commit to provide users with the functionality to flag harmful false and/or misleading information that violates Signatories policies or terms of service.
We signed up to the following measures of this commitment:
Measure 23.1 Measure 23.2
In line with this commitment, did you deploy new implementation measures (e.g. changes to your terms of service, new tools, new policies, etc.)?
Yes
If yes, list these implementation measures here
  • In line with our DSA requirements, we continued to provide our community in the European Union with a dedicated ‘Report Illegal Content’ channel, enabling users to alert us to content they believe breaches the law, along with an appeals process for users who disagree with the outcome.

Do you plan to put further implementation measures in place in the next 6 months to substantially improve the maturity of the implementation of this commitment?
N/A
If yes, which further implementation measures do you plan to put in place in the next 6 months?
N/A
Measure 23.1
Relevant Signatories will develop or continue to make available on all their services and in all Member States languages in which their services are provided a user-friendly functionality for users to flag harmful false and/or misleading information that violates Signatories' policies or terms of service. The functionality should lead to appropriate, proportionate and consistent follow-up actions, in full respect of the freedom of expression.
QRE 23.1.1
Relevant Signatories will report on the availability of flagging systems for their policies related to harmful false and/or misleading information across EU Member States and specify the different steps that are required to trigger the systems.
We provide users with simple, intuitive ways to report or flag content in-app for any breach of our Terms of Service or Community Guidelines, including harmful misinformation, in each EU Member State and in official languages of the European Union. Users can report a video in two ways:
  • By ‘long-pressing’ (i.e., pressing and holding) on the video content and selecting the “Report” option.
  • By selecting the “Share” button available on the right-hand side of the video content and then selecting the “Report” option.
The user is then shown categories of reporting reasons from which to select (which align with the harms our Community Guidelines seek to address). In 2024, we updated this feature to make the “Misinformation” categories more intuitive and allow users to report with increased granularity. 

In line with our DSA requirements, we continued to provide our community in the European Union with a dedicated ‘Report Illegal Content’ channel and appeals process, enabling users to alert us to content they believe breaches the law.

People can report TikTok content or accounts without needing to sign in or have an account, either by accessing the Report function through the “More options (…)” menu on videos or profiles in their browser, or through our “Report Inappropriate content” webform, which is available in our Help Centre. Harmful misinformation can be reported across content features such as video, comment, search, hashtag, sound, or account.