TikTok

Report March 2026

TikTok’s mission is to inspire creativity and bring joy. With more than 200 million people across Europe coming to TikTok every month, including 178 million in the EU, it’s natural for people to hold different opinions. That’s why we focus on a shared set of facts when it comes to issues that affect people’s safety. A safe, authentic, and trustworthy experience is essential to achieving our goals. Transparency plays a key role in building that trust, allowing online communities and society to assess how TikTok meets its regulatory obligations. As a signatory to the Code of Conduct on Disinformation (the Code), TikTok is committed to sharing clear insights into the actions we take.

TikTok takes disinformation extremely seriously. We are committed to preventing its spread, promoting authoritative information, and supporting media literacy initiatives that strengthen community resilience.

We prioritise proactive content moderation, with the vast majority of violative content removed before it is reported. In H2 2025, more than 98% of videos violating our Integrity and Authenticity policies were removed proactively worldwide.

We continue to address emerging behaviours and risks through our Digital Services Act (DSA) compliance programme, under which the Code has operated since July 2025.

Our actions under the Code demonstrate TikTok’s strong commitment to combating disinformation while ensuring transparency and accountability to our community and regulators.

Please see the sections below for information about our work under specific commitments, or download the report as a PDF.


Commitment 16
Relevant Signatories commit to operate channels of exchange between their relevant teams in order to proactively share information about cross-platform influence operations, foreign interference in information space and relevant incidents that emerge on their respective services, with the aim of preventing dissemination and resurgence on other services, in full compliance with privacy legislation and with due consideration for security and human rights risks.
We signed up to the following measures of this commitment:
Measure 16.1, Measure 16.2
In line with this commitment, did you deploy new implementation measures (e.g. changes to your terms of service, new tools, new policies, etc)?
Yes
If yes, list these implementation measures here
We continue to engage in the subgroups set up for insights sharing between signatories and the Commission. For example, we participated in cross-industry forums such as EU elections roundtables in markets including Czechia, the Netherlands, Ireland, and Estonia.
Do you plan to put further implementation measures in place in the next 6 months to substantially improve the maturity of the implementation of this commitment?
N/A
If yes, which further implementation measures do you plan to put in place in the next 6 months?
N/A
Measure 16.1
Relevant Signatories will share relevant information about cross-platform information manipulation, foreign interference in information space and incidents that emerge on their respective services for instance via a dedicated sub-group of the permanent Task-force or via existing fora for exchanging such information.
QRE 16.1.1
Relevant Signatories will disclose the fora they use for information sharing as well as information about learnings derived from this sharing.

Central to our strategy for identifying and removing covert influence operations (CIO) on our platform is working with external stakeholders, including civil society, and drawing on user reports. This approach helps us - and others - disrupt these networks' operations in their early stages. In addition to continuously enhancing our in-house capabilities, we proactively review our peers' publicly disclosed findings and swiftly take any necessary action in line with our policies.

To provide more regular and detailed updates about the CIO we disrupt, we publish a dedicated Transparency Report on covert influence operations, which is available in TikTok's Transparency Centre. This report also includes information about operations that we previously removed and that have attempted to return to our platform with new accounts. The insights and metrics it contains aim to inform industry peers and the research community, and we in turn review insights and metrics published by industry peers to cross-check for similar behaviour on TikTok.

We continue to engage in the subgroups set up for insights sharing between signatories and the Commission. For example, we participated in cross-industry forums such as EU elections roundtables in markets including Czechia, the Netherlands, Ireland, and Estonia.

As detailed in other chapters of this report, we have robust monetisation integrity policies in place and have established joint operating procedures between our specialist CIO investigations teams and monetisation integrity teams to collaborate on investigations of CIOs involving monetised products.