TikTok

Report March 2026

TikTok’s mission is to inspire creativity and bring joy. With more than 200 million people across Europe coming to TikTok every month, including 178 million in the EU, it’s natural for people to hold different opinions. That’s why we focus on a shared set of facts when it comes to issues that affect people’s safety. A safe, authentic, and trustworthy experience is essential to achieving our goals. Transparency plays a key role in building that trust, allowing online communities and society to assess how TikTok meets its regulatory obligations. As a signatory to the Code of Conduct on Disinformation (the Code), TikTok is committed to sharing clear insights into the actions we take.

TikTok takes disinformation extremely seriously. We are committed to preventing its spread, promoting authoritative information, and supporting media literacy initiatives that strengthen community resilience.

We prioritise proactive content moderation, with the vast majority of violative content removed before it is reported. In H2 2025, more than 98% of videos violating our Integrity and Authenticity policies were removed proactively worldwide.

We continue to address emerging behaviours and risks through our Digital Services Act (DSA) compliance programme, under which the Code has operated since July 2025.

Our actions under the Code demonstrate TikTok’s strong commitment to combating disinformation while ensuring transparency and accountability to our community and regulators.

Please see the sections below for information about our work under specific commitments, or download the report as a PDF.


Commitment 17
In light of the European Commission's initiatives in the area of media literacy, including the new Digital Education Action Plan, Relevant Signatories commit to continue and strengthen their efforts in the area of media literacy and critical thinking, also with the aim to include vulnerable groups.
We signed up to the following measures of this commitment:
Measure 17.1 Measure 17.2 Measure 17.3
In line with this commitment, did you deploy new implementation measures (e.g. changes to your terms of service, new tools, new policies, etc)?
Yes

If yes, list these implementation measures here
We have 14 ongoing media literacy and critical thinking skills campaigns in Europe (12 in the EU/EEA: Denmark, Finland, France, Germany, Ireland, Italy, Romania, Spain, Sweden, the Netherlands, Poland, and Portugal; 2 in the wider European region: Georgia and Moldova).

  • We ran 8 temporary media literacy election integrity campaigns ahead of elections in the region, most in collaboration with our fact-checking and media literacy partners:
    • 7 in the EU
      • Czechia (parliamentary election): Demagog.cz
      • Portugal (local election): Polígrafo
      • Estonia (local election): Lead Stories
      • Ireland (presidential election): The Journal
      • Netherlands (parliamentary election)
      • Denmark (local and municipal election): Sikker Digital
      • Portugal (presidential election): Polígrafo
    • 1 in Norway (parliamentary election)
  • Following wildfires in Portugal and Spain, we launched an in-app guide to help users interact with sensitive content during natural disasters. The guide links to TikTok's tragic event support guide and to authoritative third-party resources (PT) (ES) with information about aid and relief support. The intervention is available in all in-app languages.
  • Following protests in France, we launched an in-app guide to provide users with guidance on interacting with sensitive content when events are unfolding rapidly. The guide links to TikTok's Community Guidelines and Well-being Guide.
  • Continued our in-app interventions, including video tags, search interventions, and in-app information centres, available to EEA users in 23 official EU languages plus Norwegian and Icelandic, around elections, the Israel-Hamas Conflict, Holocaust Education, and the War in Ukraine.
  • Continued to support mental well-being awareness and literacy, and to combat misinformation with reliable content, through the WHO's Fides network, a diverse community of trusted healthcare professionals and content creators in a number of countries, including France.
  • We launched a $2 million AI Literacy fund in partnership with more than 20 civil society organisations across 12 markets worldwide. The ad-credit fund is designed to support the creation of educational content that will appear in For You feeds. This initiative launched alongside several new product updates to help our community spot, shape, and understand AI-generated content.
  • Brought greater transparency about our systems and our integrity and authenticity efforts to our community by sharing regular insights and updates. In H2 2025, we launched a new:
    • Transparency Center Global Elections Hub, including dedicated coverage of elections across Europe, the Middle East, and Africa. The Hub outlines the policies, product features, and moderation practices that help protect platform integrity during elections. Throughout this reporting period, we regularly updated the Hub with information on our safety efforts in markets with active elections, including Croatia, Germany, the Netherlands, Portugal, Poland, and Ireland.
Do you plan to put further implementation measures in place in the next 6 months to substantially improve the maturity of the implementation of this commitment?
N/A
If yes, which further implementation measures do you plan to put in place in the next 6 months?
N/A
Measure 17.1
Relevant Signatories will design and implement or continue to maintain tools to improve media literacy and critical thinking, for instance by empowering users with context on the content visible on services or with guidance on how to evaluate online content.

QRE 17.1.1
Relevant Signatories will outline the tools they develop or maintain that are relevant to this commitment and report on their deployment in each Member State.
In addition to actioning content that violates our Integrity and Authenticity policies, we continue to dedicate resources to: expanding our in-app measures that show users additional context on certain content (e.g., natural disasters and rapidly unfolding events); redirecting them to authoritative information; and making these tools available to EEA users in 23 official EU languages plus Norwegian and Icelandic.

We work with external experts to combat harmful misinformation. For example, we work with the World Health Organisation (WHO) on medical information and with our global fact-checking partners, taking their feedback, as well as user feedback, into account to continually identify new topics and to consider which tools are best suited to raising awareness of each topic.

We deploy a combination of in-app user intervention tools on topical issues such as elections, the Israel-Hamas Conflict, Holocaust Education, Mpox, and the War in Ukraine.

Video notice tags. 

A video notice tag is an information bar at the bottom of a video that is automatically applied to videos containing a specific word or hashtag (or set of hashtags). The bar is clickable and invites users to “Learn more about [the topic]”. Users are directed to an in-app guide or a reliable third-party resource, as appropriate.

Search intervention. 

If users search for terms associated with a topic, they will be presented with a banner encouraging them to verify the facts and providing a link to a trusted source of information. Search interventions are not deployed for search terms that violate our Community Guidelines, which are actioned according to our policies.