TikTok

Report September 2025

TikTok’s mission is to inspire creativity and bring joy. With a global community of more than a billion users, it’s natural for people to hold different opinions. That’s why we focus on a shared set of facts when it comes to issues that affect people’s safety. A safe, authentic, and trustworthy experience is essential to achieving our goals. Transparency plays a key role in building that trust, allowing online communities and society to assess how TikTok meets its regulatory obligations. As a signatory to the Code of Conduct on Disinformation (the Code), TikTok is committed to sharing clear insights into the actions we take.

TikTok takes disinformation extremely seriously. We are committed to preventing its spread, promoting authoritative information, and supporting media literacy initiatives that strengthen community resilience.

We prioritise proactive content moderation, with the vast majority of violative content removed before it is viewed or reported. In H1 2025, more than 97% of videos violating our Integrity and Authenticity policies were removed proactively worldwide.

We continue to address emerging behaviours and risks through our Digital Services Act (DSA) compliance programme, under which the Code has operated since July 2025. This includes a range of measures to protect users, detailed on our European Online Safety Hub. Our actions under the Code demonstrate TikTok’s strong commitment to combating disinformation while ensuring transparency and accountability to our community and regulators.

Our full executive summary can be read by downloading the PDF version of this report.


Commitment 1
Relevant signatories participating in ad placements commit to defund the dissemination of disinformation, and improve the policies and systems which determine the eligibility of content to be monetised, the controls for monetisation and ad placement, and the data to report on the accuracy and effectiveness of controls and services around ad placements.
We signed up to the following measures of this commitment
Measure 1.1, Measure 1.2, Measure 1.3, Measure 1.4, Measure 1.5, Measure 1.6
In line with this commitment, did you deploy new implementation measures (e.g. changes to your terms of service, new tools, new policies, etc)?
Yes
If yes, list these implementation measures here
  • Continued to improve and enforce our five granular harmful misinformation ad policies in the EEA. As mentioned in our H2 2024 report, the policies cover:
    • Medical Misinformation
    • Dangerous Misinformation
    • Synthetic and Manipulated Media
    • Dangerous Conspiracy Theories 
    • Climate Disinformation 
  • Continued to engage in the Task-force and its working groups and subgroups, such as the working subgroup on Elections (Crisis Response).

Do you plan to put further implementation measures in place in the next 6 months to substantially improve the maturity of the implementation of this commitment?
N/A

If yes, which further implementation measures do you plan to put in place in the next 6 months?
N/A

Measure 1.1
Relevant Signatories involved in the selling of advertising, inclusive of media platforms, publishers and ad tech companies, will deploy, disclose, and enforce policies with the aims of:
  • first, avoiding the publishing and carriage of harmful Disinformation to protect the integrity of advertising supported businesses;
  • second, taking meaningful enforcement and remediation steps to avoid the placement of advertising next to Disinformation content or on sources that repeatedly violate these policies; and
  • third, adopting measures to enable the verification of the landing / destination pages of ads and origin of ad placement.
TikTok did not subscribe to this measure as outlined in the January 2025 subscription document.
QRE 1.1.1
Signatories will disclose and outline the policies they develop, deploy, and enforce to meet the goals of Measure 1.1 and will link to relevant public pages in their help centres.
N/A
SLI 1.1.2
Please insert the relevant data
N/A