TikTok

Report March 2026

TikTok’s mission is to inspire creativity and bring joy. With more than 200 million people across Europe coming to TikTok every month, including 178 million in the EU, it’s natural for people to hold different opinions. That’s why we focus on a shared set of facts when it comes to issues that affect people’s safety. A safe, authentic, and trustworthy experience is essential to achieving our goals. Transparency plays a key role in building that trust, allowing online communities and society to assess how TikTok meets its regulatory obligations. As a signatory to the Code of Conduct on Disinformation (the Code), TikTok is committed to sharing clear insights into the actions we take.

TikTok takes disinformation extremely seriously. We are committed to preventing its spread, promoting authoritative information, and supporting media literacy initiatives that strengthen community resilience.

We prioritise proactive content moderation, with the vast majority of violative content removed before it is reported. In H2 2025, more than 98% of videos violating our Integrity and Authenticity policies were removed proactively worldwide.

We continue to address emerging behaviours and risks through our Digital Services Act (DSA) compliance programme, under which the Code has operated since July 2025.

Our actions under the Code demonstrate TikTok’s strong commitment to combating disinformation while ensuring transparency and accountability to our community and regulators.

Please see the sections below for information about our work under specific commitments, or download the report as a PDF.


Elections 2025
[Note: Signatories are requested to provide information relevant to their particular response to the threats and challenges they observed on their service(s). They ensure that the information below provides an accurate and complete report of their relevant actions. As operational responses to crisis/election situations can vary from service to service, an absence of information should not be considered a priori a shortfall in the way a particular service has responded. Impact metrics are accurate to the best of signatories’ abilities to measure them].
Threats observed or anticipated
Ireland Election 2025

We have comprehensive measures in place to anticipate and address the risks associated with electoral processes, including the risks associated with election misinformation in the context of the Irish presidential election held on 24 October 2025.

In advance of the election, a dedicated election task force was established to proactively assess potential risks. Through cross-functional consultations, the team identified key threats, including the spread of AI-generated deepfakes and misinformation, and developed response strategies to mitigate them before they could gain traction on the platform.

Throughout the election, we monitored for and actioned inauthentic behaviour, and removed content that violated our Community Guidelines.

Czechia Parliamentary Election 2025
 
We have comprehensive measures in place to anticipate and address risks associated with electoral processes, including risks associated with election misinformation in the context of the Czech parliamentary election held on 3 and 4 October 2025. In advance of the election, a core election task force was formed, and consultations between cross-functional teams helped to identify and design response strategies.

TikTok did not observe major threats during the Czech election. Throughout the election, we monitored for and actioned inauthentic behaviour and removed content that violated our Community Guidelines.

Netherlands Election 2025

We have comprehensive measures in place to anticipate and address risks associated with electoral processes, including risks associated with election misinformation in the context of the Dutch parliamentary election held on 29 October 2025. In advance of the election, a core election task force was formed, and consultations between cross-functional teams helped to identify and design response strategies.

TikTok did not observe major threats during the Dutch election. Throughout the election, we monitored for and actioned inauthentic behaviour and removed content that violated our Community Guidelines.



Mitigations in place
Ireland Election 2025:

Enforcing our policies

(I) Monitoring capabilities

We have dedicated Trust and Safety professionals working to keep our platform safe. As they usually do, our teams worked alongside technology to ensure that we were consistently enforcing our rules to detect and remove misinformation, covert influence operations, and other content and behaviour that can increase during an election period. In advance of the election, we had proactive data monitoring, trend detection, and regular monitoring of election keywords and accounts.


(II) Mission Control Centre: internal cross-functional collaboration


As part of our advance preparations ahead of the Irish presidential election, we established a dedicated Mission Control Centre (MCC) bringing together employees from multiple specialist teams within our safety department. Through the MCC, our teams were able to provide consistent and dedicated coverage of potential election-related issues in the run-up to, during, and immediately after the election.


(III) Integrity and Authenticity policies


We prioritise proactive content moderation, with the vast majority of violative content removed before it is viewed or reported. In H2 2025, more than 98% of videos violating our Integrity and Authenticity policies were removed proactively worldwide.


(IV) Fact-checking


Our global fact-checking programme is a critical part of our layered approach to detecting harmful misinformation in the context of elections. The core objective of the programme is to leverage the expertise of external fact-checking organisations to help assess the accuracy of potentially harmful claims that are difficult to verify.


Within Europe, we partner with 13 fact-checking organisations who provide fact-checking coverage in 25 languages (22 official EU languages plus Russian, Ukrainian and Turkish). Reuters serves as the fact-checking partner for Ireland.


(V) Deterring covert influence operations


We prohibit covert influence operations and remain constantly vigilant against attempts to use deceptive behaviours and manipulate our platform. We proactively seek and continuously investigate leads for potential influence operations. We're also working with government authorities and encourage them to share any intelligence so that we can work together to ensure election integrity. More detail on our policy against covert influence operations is published on our website.


(VI) Tackling misleading AI-generated content


Creators are required to label any realistic AI-generated content (AIGC) and we have an AI-generated content label to help people do this. TikTok has an ‘Edited Media and AI-Generated Content (AIGC)’ policy, which prohibits AIGC showing fake authoritative sources or crisis events, or falsely showing public figures in certain contexts, including being bullied, making an endorsement, or being endorsed.


(VII) Government, Politician, and Political Party Accounts (GPPPAs)


Many political leaders, ministers, and political parties have a presence on TikTok. These politicians and parties play an important role on our platform: we believe that verified accounts belonging to politicians and institutions provide the electorate with another route to access their representatives, and additional trusted voices in the shared fight against misinformation.


We strongly recommend GPPPAs have their accounts verified by TikTok. Verified badges help users make informed choices about the accounts they choose to follow. It is also an easy way for notable figures to let users know they’re seeing authentic content, and it helps to build trust among high-profile accounts and their followers.


Directing people to trusted sources


(I) Investing in media literacy

We invest in media literacy campaigns as a counter-misinformation strategy. From 24 September to 25 October 2025, we ran an in-app Election Centre to provide users with up-to-date information about the 2025 Irish presidential election. The centre contained a section about spotting misinformation, which included videos created in partnership with The Journal's fact-checking unit.

External engagement at the national and EU levels

(I) Rapid Response System: external collaboration with COCD Signatories

Throughout the election period, our teams maintained communication with cross-functional partners as part of the COCD Rapid Response System (RRS). We received 10 reports via the RRS relating to AIGC, misinformation, and impersonation, all of which were rapidly addressed. Actions included account bans and content removals for violations of our Community Guidelines.

(II) Engagement with local experts

To further promote election integrity, and inform our approach to the Irish election, we organised an Election Speaker Series with our local fact-checking partner, Reuters, who shared their insights and market expertise with our internal teams.

Czech Parliamentary Election 2025:

Enforcing our policies

(I) Monitoring capabilities


We have dedicated Trust and Safety professionals working to keep our platform safe. As they usually do, our teams worked alongside technology to ensure that we were consistently enforcing our rules to detect and remove misinformation, covert influence operations, and other content and behaviour that can increase during an election period. In advance of the election, we had proactive data monitoring, trend detection, and regular monitoring of election keywords and accounts.


(II) Mission Control Centre: internal cross-functional collaboration


As part of our advance preparations ahead of the Czech election, we established a dedicated Mission Control Centre (MCC) bringing together employees from multiple specialist teams within our safety department. Through the MCC, our teams provided consistent and dedicated coverage of potential election-related issues in the run-up to, and during, the election.

(III) Integrity and Authenticity policies

We prioritise proactive content moderation, with the vast majority of violative content removed before it is viewed or reported.

(IV) Fact-checking


Our global fact-checking programme is a critical part of our layered approach to detecting harmful misinformation in the context of elections. The core objective of the programme is to leverage the expertise of external fact-checking organisations to help assess the accuracy of potentially harmful claims that are difficult to verify.


Within Europe, we partner with 13 fact-checking organisations who provide fact-checking coverage in 25 languages (22 official EU languages plus Russian, Ukrainian and Turkish). Lead Stories serves as the fact-checking partner for Czechia.


(V) Deterring covert influence operations


We prohibit covert influence operations and remain constantly vigilant against attempts to use deceptive behaviours and manipulate our platform. We proactively seek and continuously investigate leads for potential influence operations. We're also working with government authorities and encourage them to share any intelligence so that we can work together to ensure election integrity. More detail on our policy against covert influence operations is published on our website.


(VI) Tackling misleading AI-generated content


Creators are required to label any realistic AI-generated content (AIGC) and we have an AI-generated content label to help people do this. TikTok has an ‘Edited Media and AI-Generated Content (AIGC)’ policy, which prohibits AIGC showing fake authoritative sources or crisis events, or falsely showing public figures in certain contexts, including being bullied, making an endorsement, or being endorsed.


(VII) Government, Politician, and Political Party Accounts (GPPPAs)


Many political leaders, ministers, and political parties have a presence on TikTok. These politicians and parties play an important role on our platform: we believe that verified accounts belonging to politicians and institutions provide the electorate with another route to access their representatives, and additional trusted voices in the shared fight against misinformation.


We strongly recommend GPPPAs have their accounts verified by TikTok. Verified badges help users make informed choices about the accounts they choose to follow. It is also an easy way for notable figures to let users know they’re seeing authentic content, and it helps to build trust among high-profile accounts and their followers.

Directing people to trusted sources

(I) Investing in media literacy

We invest in media literacy campaigns as a counter-misinformation strategy. We engaged with the local fact-checking organisation Demagog.cz to develop, review, and launch two videos as part of a media literacy campaign.


External engagement at the national and EU levels


(I) Rapid Response System: external collaboration with COCD Signatories 


The COCD Rapid Response System (RRS) was used to exchange information among civil society organisations, fact-checkers, and online platforms. TikTok received one report via the RRS. Throughout the election period, the team consistently prioritised RRS requests and ensured timely, accurate support for cross-functional partners.


(II) Engagement with local experts


To further promote election integrity, and inform our approach to the Czech election, we organised an Election Speaker Series with our local fact-checking partner, Lead Stories, who shared their insights and market expertise with our internal teams.

Netherlands Election 2025:

Enforcing our policies

(I) Monitoring capabilities


We have dedicated Trust and Safety professionals working to keep our platform safe. As they usually do, our teams worked alongside technology to ensure that we were consistently enforcing our rules to detect and remove misinformation, covert influence operations, and other content and behaviour that can increase during an election period. In advance of the election, we had proactive data monitoring, trend detection, and regular monitoring of election keywords and accounts.


(II) Mission Control Centre: internal cross-functional collaboration


As part of our advance preparations ahead of the Dutch election, we established a dedicated Mission Control Centre (MCC) bringing together employees from multiple specialist teams within our safety department. Through the MCC, our teams provided consistent and dedicated coverage of potential election-related issues in the run-up to, and during, the election.


(III) Integrity and Authenticity policies


We prioritise proactive content moderation, with the vast majority of violative content removed before it is viewed or reported.


(IV) Fact-checking


Our global fact-checking programme is a critical part of our layered approach to detecting harmful misinformation in the context of elections. The core objective of the programme is to leverage the expertise of external fact-checking organisations to help assess the accuracy of potentially harmful claims that are difficult to verify.


Within Europe, we partner with 13 fact-checking organisations who provide fact-checking coverage in 25 languages (22 official EU languages plus Russian, Ukrainian and Turkish). Deutsche Presse-Agentur (dpa) serves as the fact-checking partner for the Netherlands.


(V) Deterring covert influence operations


We prohibit covert influence operations and remain constantly vigilant against attempts to use deceptive behaviours and manipulate our platform. We proactively seek and continuously investigate leads for potential influence operations. We're also working with government authorities and encouraging them to share any intelligence so that we can work together to ensure election integrity. More detail on our policy against covert influence operations is published on our website.


(VI) Tackling misleading AI-generated content


Creators are required to label any realistic AI-generated content (AIGC) and we have an AI-generated content label to help people do this. TikTok has an ‘Edited Media and AI-Generated Content (AIGC)’ policy, which prohibits AIGC showing fake authoritative sources or crisis events, or falsely showing public figures in certain contexts, including being bullied, making an endorsement, or being endorsed.


(VII) Government, Politician, and Political Party Accounts (GPPPAs)


Many political leaders, ministers, and political parties have a presence on TikTok. These politicians and parties play an important role on our platform: we believe that verified accounts belonging to politicians and institutions provide the electorate with another route to access their representatives, and additional trusted voices in the shared fight against misinformation.


We strongly recommend GPPPAs have their accounts verified by TikTok. Verified badges help users make informed choices about the accounts they choose to follow. It is also an easy way for notable figures to let users know they’re seeing authentic content, and it helps to build trust among high-profile accounts and their followers.


Directing people to trusted sources


(I) Investing in media literacy


We invest in media literacy campaigns as a counter-misinformation strategy.


External engagement at the national and EU levels


(I) Rapid Response System: external collaboration with COCD Signatories 


The COCD Rapid Response System (RRS) was used to exchange information among civil society organisations, fact-checkers, and online platforms. TikTok received one report via the RRS, concerning content that violated our AIGC policies. Throughout the election period, the team consistently prioritised RRS requests and ensured timely, accurate support for cross-functional partners.


(II) Engagement with local experts


To further promote election integrity, and inform our approach to the Dutch election, we organised an Election Speaker Series with our fact-checking partner, dpa, who shared their insights and market expertise with our internal teams.


Policies and Terms and Conditions
Outline any changes to your policies
N/A