Ireland Elections 2025:
Enforcing our policies
(I) Monitoring capabilities
We have dedicated Trust and Safety professionals working to keep our platform safe. As is standard practice, our teams worked alongside technology to consistently enforce our rules, detecting and removing misinformation, covert influence operations, and other content and behaviour that can increase during an election period. In advance of the election, we carried out proactive data monitoring, trend detection, and regular monitoring of election keywords and accounts.
(II) Mission Control Centre: internal cross-functional collaboration
As part of our advance preparations ahead of the Irish presidential election, we established a dedicated Mission Control Centre (MCC), bringing together employees from multiple specialist teams within our safety department. Through the MCC, our teams provided consistent and dedicated coverage of potential election-related issues in the run-up to, during, and immediately after the election.
(III) Integrity and Authenticity policies
We prioritise proactive content moderation, with the vast majority of violative content removed before it is viewed or reported. In H2 2025, more than 98% of videos violating our Integrity and Authenticity policies were removed proactively worldwide.
(IV) Fact-checking
Our global fact-checking programme is a critical part of our layered approach to detecting harmful misinformation in the context of elections. The core objective of the programme is to leverage the expertise of external fact-checking organisations to help assess the accuracy of potentially harmful claims that are difficult to verify.
Within Europe, we partner with 13 fact-checking organisations that provide fact-checking coverage in 25 languages (22 official EU languages plus Russian, Ukrainian and Turkish).
Reuters serves as the fact-checking partner for Ireland.
(V) Deterring covert influence operations
We prohibit covert influence operations and remain constantly vigilant against attempts to use deceptive behaviours and manipulate our platform. We proactively seek and continuously investigate leads for potential influence operations. We're also working with government authorities and encourage them to share any intelligence so that we can work together to ensure election integrity. More detail on our policy against covert influence operations is published on our
website.
(VI) Tackling misleading AI-generated content
Creators are required to label any realistic AI-generated content (AIGC), and we provide an AI-generated content label to help them do this. TikTok has an ‘Edited Media and AI-Generated Content (AIGC)’ policy, which prohibits AIGC depicting fake authoritative sources or crisis events, or falsely showing public figures in certain contexts, including being bullied, making an endorsement, or being endorsed.
(VII) Government, Politician, and Political Party Accounts (GPPPAs)
Many political leaders, ministers, and political parties have a presence on TikTok. These politicians and parties play an important role on our platform: we believe that verified accounts belonging to politicians and institutions give the electorate another route to access their representatives, and additional trusted voices in the shared fight against misinformation.
We strongly recommend GPPPAs have their accounts
verified by TikTok. Verified badges help users make informed choices about the accounts they choose to follow. It is also an easy way for notable figures to let users know they’re seeing authentic content, and it helps to build trust among high-profile accounts and their followers.
Directing people to trusted sources
(I) Investing in media literacy
We invest in media literacy campaigns as a counter-misinformation strategy. From 24 September to 25 October 2025, we ran an in-app Election Centre to provide users with up-to-date information about the 2025 Irish presidential election. The centre contained a section on spotting misinformation, which included videos created in partnership with The Journal's fact-checking unit.
External engagement at the national and EU levels
(I) Rapid Response System: external collaboration with COCD Signatories
Throughout the election period, our teams maintained communication with cross-functional partners as part of the COCD Rapid Response System (RRS). We received 10 reports via the RRS relating to AIGC, misinformation, and impersonation, all of which were rapidly addressed. Actions included account bans and content removals for violations of our Community Guidelines.
(II) Engagement with local experts
To further promote election integrity, and inform our approach to the Irish Election, we organised an Election Speaker Series with local fact-checking partner Reuters who shared their insights and market expertise with our internal teams.
Czech Parliamentary Elections 2025:
Enforcing our policies
(I) Monitoring capabilities
We have dedicated Trust and Safety professionals working to keep our platform safe. As is standard practice, our teams worked alongside technology to consistently enforce our rules, detecting and removing misinformation, covert influence operations, and other content and behaviour that can increase during an election period. In advance of the election, we carried out proactive data monitoring, trend detection, and regular monitoring of election keywords and accounts.
(II) Mission Control Centre: internal cross-functional collaboration
As part of our advance preparations ahead of the Czech election, we established a dedicated Mission Control Centre (MCC), bringing together employees from multiple specialist teams within our safety department. Through the MCC, our teams provided consistent and dedicated coverage of potential election-related issues in the run-up to, and during, the election.
(III) Integrity and Authenticity policies
We prioritise proactive content moderation, with the vast majority of violative content removed before it is viewed or reported.
(IV) Fact-checking
Our global fact-checking programme is a critical part of our layered approach to detecting harmful misinformation in the context of elections. The core objective of the programme is to leverage the expertise of external fact-checking organisations to help assess the accuracy of potentially harmful claims that are difficult to verify.
Within Europe, we partner with 13 fact-checking organisations that provide fact-checking coverage in 25 languages (22 official EU languages plus Russian, Ukrainian and Turkish). Lead Stories serves as the fact-checking partner for Czechia.
(V) Deterring covert influence operations
We prohibit covert influence operations and remain constantly vigilant against attempts to use deceptive behaviours and manipulate our platform. We proactively seek and continuously investigate leads for potential influence operations. We're also working with government authorities and encourage them to share any intelligence so that we can work together to ensure election integrity. More detail on our policy against covert influence operations is published on our
website.
(VI) Tackling misleading AI-generated content
Creators are required to label any realistic AI-generated content (AIGC), and we provide an AI-generated content label to help them do this. TikTok has an ‘Edited Media and AI-Generated Content (AIGC)’ policy, which prohibits AIGC depicting fake authoritative sources or crisis events, or falsely showing public figures in certain contexts, including being bullied, making an endorsement, or being endorsed.
(VII) Government, Politician, and Political Party Accounts (GPPPAs)
Many political leaders, ministers, and political parties have a presence on TikTok. These politicians and parties play an important role on our platform: we believe that verified accounts belonging to politicians and institutions give the electorate another route to access their representatives, and additional trusted voices in the shared fight against misinformation.
We strongly recommend GPPPAs have their accounts
verified by TikTok. Verified badges help users make informed choices about the accounts they choose to follow. It is also an easy way for notable figures to let users know they’re seeing authentic content, and it helps to build trust among high-profile accounts and their followers.
Directing people to trusted sources
(I) Investing in media literacy
We invest in media literacy campaigns as a counter-misinformation strategy. We engaged with the local fact-checking organisation Demagog.cz to develop, review, and launch two videos as part of a media literacy campaign.
External engagement at the national and EU levels
(I) Rapid Response System: external collaboration with COCD Signatories
The COCD Rapid Response System (RRS) was used to exchange information among civil society organisations, fact-checkers, and online platforms. TikTok received one report via the RRS. Throughout the election period, the team consistently prioritised RRS requests and ensured timely, accurate support for cross-functional partners.
(II) Engagement with local experts
To further promote election integrity and inform our approach to the Czech election, we organised an Election Speaker Series with our local fact-checking partner, Lead Stories, who shared their insights and market expertise with our internal teams.
Netherlands Elections 2025:
Enforcing our policies
(I) Monitoring capabilities
We have dedicated Trust and Safety professionals working to keep our platform safe. As is standard practice, our teams worked alongside technology to consistently enforce our rules, detecting and removing misinformation, covert influence operations, and other content and behaviour that can increase during an election period. In advance of the election, we carried out proactive data monitoring, trend detection, and regular monitoring of election keywords and accounts.
(II) Mission Control Centre: internal cross-functional collaboration
As part of our advance preparations ahead of the Dutch election, we established a dedicated Mission Control Centre (MCC), bringing together employees from multiple specialist teams within our safety department. Through the MCC, our teams provided consistent and dedicated coverage of potential election-related issues in the run-up to, and during, the election.
(III) Integrity and Authenticity policies
We prioritise proactive content moderation, with the vast majority of violative content removed before it is viewed or reported.
(IV) Fact-checking
Our global fact-checking programme is a critical part of our layered approach to detecting harmful misinformation in the context of elections. The core objective of the programme is to leverage the expertise of external fact-checking organisations to help assess the accuracy of potentially harmful claims that are difficult to verify.
Within Europe, we partner with 13 fact-checking organisations that provide fact-checking coverage in 25 languages (22 official EU languages plus Russian, Ukrainian and Turkish). Deutsche Presse-Agentur (dpa) serves as the fact-checking partner for the Netherlands.
(V) Deterring covert influence operations
We prohibit covert influence operations and remain constantly vigilant against attempts to use deceptive behaviours and manipulate our platform. We proactively seek and continuously investigate leads for potential influence operations. We're also working with government authorities and encourage them to share any intelligence so that we can work together to ensure election integrity. More detail on our policy against covert influence operations is published on our
website.
(VI) Tackling misleading AI-generated content
Creators are required to label any realistic AI-generated content (AIGC), and we provide an AI-generated content label to help them do this. TikTok has an ‘Edited Media and AI-Generated Content (AIGC)’ policy, which prohibits AIGC depicting fake authoritative sources or crisis events, or falsely showing public figures in certain contexts, including being bullied, making an endorsement, or being endorsed.
(VII) Government, Politician, and Political Party Accounts (GPPPAs)
Many political leaders, ministers, and political parties have a presence on TikTok. These politicians and parties play an important role on our platform: we believe that verified accounts belonging to politicians and institutions give the electorate another route to access their representatives, and additional trusted voices in the shared fight against misinformation.
We strongly recommend GPPPAs have their accounts
verified by TikTok. Verified badges help users make informed choices about the accounts they choose to follow. It is also an easy way for notable figures to let users know they’re seeing authentic content, and it helps to build trust among high-profile accounts and their followers.
Directing people to trusted sources
(I) Investing in media literacy
We invest in media literacy campaigns as a counter-misinformation strategy.
External engagement at the national and EU levels
(I) Rapid Response System: external collaboration with COCD Signatories
The COCD Rapid Response System (RRS) was used to exchange information among civil society organisations, fact-checkers, and online platforms. TikTok received one report via the RRS, concerning content that violated our AIGC policies. Throughout the election period, the team consistently prioritised RRS requests and ensured timely, accurate support for cross-functional partners.
(II) Engagement with local experts
To further promote election integrity and inform our approach to the Dutch election, we organised an Election Speaker Series with our fact-checking partner, dpa, who shared their insights and market expertise with our internal teams.