Polish Elections:
(I) Moderation capabilities
We have thousands of trust and safety professionals dedicated to keeping our platform safe. As is standard practice, our teams worked alongside technology to ensure that we consistently enforced our rules, detecting and removing misinformation, covert influence operations, and other content and behaviour that can increase during an election period. In advance of the election, we had proactive data monitoring, trend detection, and regular monitoring of enriched keywords and accounts.
(II) Mission Control Centre: internal cross-functional collaboration
TikTok established a Mission Control Centre (MCC) in advance of the election, developed risk scenario mapping (covering focused Russian influence operations, AI-generated content (AIGC), misinformation/disinformation, scaled inauthentic behaviour, and hate speech surges), and implemented regular content trend clustering with a rolling containment, correction, and prevention cycle covering key features. As a result, all identified threats were contained or mitigated early, and no credible or substantiated election interference claims emerged.
(III) Countering misinformation
Our misinformation moderators receive enhanced training and tools to detect and remove misinformation and other violative content. We also have teams on the ground who partner with experts to ensure local context and nuance are reflected in our approach.
In the weeks leading up to and including the run-off, we removed 530 videos for violating our civic and election integrity policies, and 2,772 videos for violating our misinformation policies.
(IV) Fact-checking
Our global fact-checking programme is a critical part of our layered approach to detecting harmful misinformation in the context of elections. The core objective of the fact-checking programme is to leverage the expertise of external fact-checking organisations to help assess the accuracy of potentially harmful claims that are difficult to verify.
Within Europe, we partnered with 12 fact-checking organisations that provide fact-checking coverage in 25 languages (22 official EU languages plus Russian, Ukrainian and Turkish). Demagog serves as the fact-checking partner for Poland, providing coverage for the platform.
(V) Deterring covert influence operations
We prohibit covert influence operations and remain constantly vigilant against attempts to use deceptive behaviours to manipulate our platform. We proactively seek and continuously investigate leads for potential influence operations. We're also working with government authorities and encourage them to share any intelligence so that we can work together to ensure election integrity. More detail on our policy against covert influence operations is published on our website.
(VI) Tackling misleading AI-generated content
Creators are required to label any realistic AI-generated content (AIGC), and we have an AI-generated content label to help people do this. TikTok has an ‘Edited Media and AI-Generated Content (AIGC)’ policy, which prohibits AIGC showing fake authoritative sources or crisis events, or falsely showing public figures in certain contexts, including being bullied, making an endorsement, or being endorsed.
(VII) Government, Politician, and Political Party Accounts (GPPPAs)
Many political leaders, ministers, and political parties have a presence on TikTok. These politicians and parties play an important role on our platform: we believe that verified accounts belonging to politicians and institutions provide the electorate with another route to access their representatives, and additional trusted voices in the shared fight against misinformation.
We strongly recommend GPPPAs have their accounts verified by TikTok. Verified badges help users make informed choices about the accounts they choose to follow. Verification is also an easy way for notable figures to let users know they’re seeing authentic content, and it helps to build trust between high-profile accounts and their followers.
Directing people to trusted sources
(I) Investing in media literacy
We invest in media literacy campaigns as a counter-misinformation strategy, working with fact-checkers as part of our Election Centre for Poland. TikTok has partnered with Demagog & FakeNews.pl in Poland to help the community safely navigate the platform and protect themselves against potential misinformation during the elections. We also worked with fact-checkers to launch an Evergreen Media Literacy Campaign.
External engagement at the national and EU levels
(I) Rapid Response System: external collaboration with COPD Signatories
The COPD Rapid Response System (RRS) was utilised to exchange information among civil society organisations, fact-checkers, and online platforms. TikTok received 23 reports through the RRS before the Polish Election, which were rapidly addressed, including NASK and DSA cases. Actions included banning accounts and removing content for violations of our Community Guidelines.
(II) Engagement with local experts
To further promote election integrity, and inform our approach to the Polish Election, we organised an Election Speaker Series with Demagog, who shared their insights and market expertise with our internal teams.
German Elections:
Enforcing our policies
(I) Moderation capabilities
We have thousands of trust and safety professionals dedicated to keeping our platform safe. As is standard practice, our teams worked alongside technology to ensure that we consistently enforced our rules, detecting and removing misinformation, covert influence operations, and other content and behaviour that can increase during an election period. In advance of the election, we had proactive data monitoring, trend detection, and regular monitoring of enriched keywords and accounts.
(II) Mission Control Centre: internal cross-functional collaboration
On 18 November, ahead of the German election, we established a dedicated Mission Control Centre (MCC), bringing together employees from multiple specialist teams within our safety department. Through the MCC, our teams were able to provide consistent and dedicated coverage of potential election-related issues in the run-up to, and during, the election.
(III) Countering misinformation
Our misinformation moderators receive enhanced training and tools to detect and remove misinformation and other violative content. We also have teams on the ground who partner with experts to ensure local context and nuance are reflected in our approach.
In the weeks leading up to and including the election, we removed 3,283 videos for violating our civic and election integrity policies, and 12,781 videos for violating our misinformation policies.
(IV) Fact-checking
Our global fact-checking programme is a critical part of our layered approach to detecting harmful misinformation in the context of elections. The core objective of the fact-checking programme is to leverage the expertise of external fact-checking organisations to help assess the accuracy of potentially harmful claims that are difficult to verify.
(V) Deterring covert influence operations
We prohibit covert influence operations and remain constantly vigilant against attempts to use deceptive behaviours to manipulate our platform. We proactively seek and continuously investigate leads for potential influence operations. We're also working with government authorities and encourage them to share any intelligence so that we can work together to ensure election integrity. More detail on our policy against covert influence operations is published on our website, as well as in our monthly Covert Influence Operations reports.
(VI) Tackling misleading AI-generated content
Creators are required to label any realistic AI-generated content (AIGC), and we have an AI-generated content label to help people do this. TikTok has an ‘Edited Media and AI-Generated Content (AIGC)’ policy, which prohibits AIGC showing fake authoritative sources or crisis events, or falsely showing public figures in certain contexts, including being bullied, making an endorsement, or being endorsed.
(VII) Government, Politician, and Political Party Accounts (GPPPAs)
Many political leaders, ministers, and political parties have a presence on TikTok. These politicians and parties play an important role on our platform: we believe that verified accounts belonging to politicians and institutions provide the electorate with another route to access their representatives, and additional trusted voices in the shared fight against misinformation.
We strongly recommend GPPPAs have their accounts verified by TikTok. Verified badges help users make informed choices about the accounts they choose to follow. Verification is also an easy way for notable figures to let users know they’re seeing authentic content, and it helps to build trust between high-profile accounts and their followers.
Before the German election, we provided all parties represented in federal and state parliaments with written information about our election integrity policies and measures, and offered virtual information sessions for the parties and their candidates. We presented at a security-focused webinar for candidates and parties organised by the Federal Office for Information Security (BSI). We also offered all parties represented in federal and state parliaments verification support for their candidates.
Directing people to trusted sources
(I) Investing in media literacy
We invest in media literacy campaigns as a counter-misinformation strategy. From 16 December 2024 to 3 March 2025, we ran an in-app Election Centre to provide users with up-to-date information about the 2025 German federal election. The centre contained a section about spotting misinformation, which included videos created in partnership with the fact-checking organisation Deutsche Presse-Agentur (dpa). The Election Centre was visited more than 5.7 million times.
External engagement at the national and EU levels
(I) Rapid Response System: external collaboration with COPD Signatories
The COPD Rapid Response System (RRS) was utilised to exchange information among civil society organisations, fact-checkers, and online platforms. TikTok received 4 reports through the RRS before the German election, which were rapidly addressed. Actions included banning accounts and removing content for violations of our Community Guidelines.
(II) Engagement with local experts
To further promote election integrity, and inform our approach to the German Election, we organised an Election Speaker Series with dpa, who shared their insights and market expertise with our internal teams.
(III) Engagement with national authorities and stakeholders
We participated in the two election roundtables hosted by the Federal Ministry of the Interior (BMI), one before and one after the election.
We participated in the election roundtable as well as the stress test hosted by the Federal Network Agency (BNetzA), the German Digital Services Coordinator (DSC). In addition, we held three separate virtual meetings with the BNetzA, also attended by the European Commission, and answered a set of written questions.
We met with the domestic intelligence service (BfV) and the BMI state secretary.
We attended two election-focused virtual meetings with BzKJ (Federal Agency for Child and Youth Protection) and other platforms.
We engaged with the electoral commissioner ("Bundeswahlleiterin") and onboarded them to TikTok. In our Election Centre, we included two videos from the electoral commissioner and linked to their website.
We provided all parties represented in federal and state parliaments with written information about our election integrity measures and what they and their candidates can and cannot do on the platform, and also offered virtual information sessions for the parties and their candidates. We also offered all parties represented in federal and state parliaments verification support for their candidates.
We presented at a security-focused webinar for candidates and parties organised by the Federal Office for Information Security (BSI).
Portuguese Elections:
(I) Moderation capabilities
We have thousands of trust and safety professionals dedicated to keeping our platform safe. As is standard practice, our teams worked alongside technology to ensure that we consistently enforced our rules, detecting and removing misinformation, covert influence operations, and other content and behaviour that can increase during an election period. In advance of the election, we had proactive data monitoring, trend detection, and regular monitoring of enriched keywords and accounts.
(II) Mission Control Centre: internal cross-functional collaboration
On 13 May, ahead of the Portuguese election, we established a dedicated Mission Control Centre (MCC), bringing together employees from multiple specialist teams within our safety department. Through the MCC, our teams were able to provide consistent and dedicated coverage of potential election-related issues in the run-up to, and during, the election.
(III) Countering misinformation
Our misinformation moderators receive enhanced training and tools to detect and remove misinformation and other violative content. We also have teams on the ground who partner with experts to ensure local context and nuance are reflected in our approach.
In the weeks leading up to and including the election (21 April to 18 May), we removed 821 pieces of content for violating our policies on civic and election integrity, misinformation, and AI-generated content. In this same period, we removed over 99% of violative misinformation content before it was reported to us.
(IV) Fact-checking
Our global fact-checking programme is a critical part of our layered approach to detecting harmful misinformation in the context of elections. The core objective of the fact-checking programme is to leverage the expertise of external fact-checking organisations to help assess the accuracy of potentially harmful claims that are difficult to verify.
TikTok collaborates with 12 fact-checking organisations across Europe to evaluate the accuracy of content in most European languages, including Portuguese. Poligrafo serves as the fact-checking partner for Portugal, providing coverage for the platform.
(V) Deterring covert influence operations
We prohibit covert influence operations and remain constantly vigilant against attempts to use deceptive behaviours to manipulate our platform. We proactively seek and continuously investigate leads for potential influence operations. We're also working with government authorities and encourage them to share any intelligence so that we can work together to ensure election integrity. More detail on our policy against covert influence operations is published on our website, as well as in our monthly Covert Influence Operations reports.
(VI) Tackling misleading AI-generated content
Creators are required to label any realistic AI-generated content (AIGC), and we have an AI-generated content label to help people do this. TikTok has an ‘Edited Media and AI-Generated Content (AIGC)’ policy, which prohibits AIGC showing fake authoritative sources or crisis events, or falsely showing public figures in certain contexts, including being bullied, making an endorsement, or being endorsed.
(VII) Government, Politician, and Political Party Accounts (GPPPAs)
Many political leaders, ministers, and political parties have a presence on TikTok. These politicians and parties play an important role on our platform: we believe that verified accounts belonging to politicians and institutions provide the electorate with another route to access their representatives, and additional trusted voices in the shared fight against misinformation.
We strongly recommend GPPPAs have their accounts verified by TikTok. Verified badges help users make informed choices about the accounts they choose to follow. Verification is also an easy way for notable figures to let users know they’re seeing authentic content, and it helps to build trust between high-profile accounts and their followers.
Before the election, we met with the main Portuguese regulatory bodies and political parties' Heads of Communication to (i) provide an overview of TikTok's policies for political accounts, (ii) outline TikTok's approach to election integrity and to data security, (iii) encourage account verification and (iv) enable direct contact to respond to their specific requests.
Directing people to trusted sources
(I) Investing in media literacy
We invest in media literacy campaigns as a counter-misinformation strategy. From 18 April 2025 to 2 June 2025, we ran an in-app Election Centre to provide users with up-to-date information about the 2025 Portuguese election. The centre contained a section about spotting misinformation, which included videos created in partnership with our fact-checking partner Poligrafo. TikTok partnered with Poligrafo in Portugal to help the community safely navigate the platform and protect themselves against potential misinformation during the elections. Poligrafo developed a series of educational videos explaining how users could identify and avoid misinformation, use TikTok’s safety features, and critically evaluate content related to the electoral process. The Portuguese community could find this video series, with practical advice and useful information about the electoral process, in the Election Centre.
External engagement at the national and EU levels
(I) Rapid Response System: external collaboration with COPD Signatories
The COPD Rapid Response System (RRS) was utilised to exchange information among civil society organisations, fact-checkers, and online platforms. TikTok received one report through the RRS during the Portuguese election, which was quickly addressed and resulted in the reported content being deemed “FYF Ineligible”.
(II) Engagement with local experts
To further promote election integrity, and inform our approach to the Portuguese election, we organised an Election Speaker Series with Poligrafo, who shared their insights and market expertise with our internal teams.
(III) Engagement with national authorities and stakeholders
Ahead of the election, our Government Relations team represented TikTok at an official meeting organised by ANACOM with the Portuguese Regulatory Authority for the Media (ERC) and the National Election Commission (CNE). The team also met with the Organization for Security and Co-operation in Europe’s Office for Democratic Institutions and Human Rights (OSCE/ODIHR), in particular their Election Expert Team (EET) deployed for these elections.
As previously referenced, we also met with Portuguese political parties’ Heads of Communication to (i) provide an overview of TikTok's policies for political accounts, (ii) outline TikTok's approach to election integrity and to data security, (iii) encourage account verification and (iv) enable direct contact to respond to their specific requests.
Romanian Elections:
(I) Moderation capabilities
We supported the 2025 Romanian elections by preparing moderators, updating policies, and escalating hate organisation content in time for both election rounds. Our teams worked alongside technology to ensure that we consistently enforced our rules, detecting and removing misinformation, covert influence operations, and other content and behaviour that can increase during an election period. We continue to prioritise and enhance TikTok's automated moderation technology, as it enables faster and more consistent removal of content that violates our rules. We invest in technologies that improve content understanding and predict potential risks so that we can take action on violative content before it's viewed.
We have thousands of trust and safety professionals dedicated to keeping our platform safe. We have 95 Romanian-speaking moderators, which is the largest such team among digital platforms in the country, both in absolute terms and relative to the number of users. We increased resources on our Romanian elections task force by adding more than 120 subject matter experts across multiple teams including Deceptive Behaviour (which includes Covert Influence Operations analysts), Security and Ads Integrity.
(II) Mission Control Centre: internal cross-functional collaboration
In advance of the official campaign period for the Romanian Presidential Election, we established a dedicated Mission Control Centre (MCC), including employees from multiple specialist teams within our safety department. Through the MCC, our teams were able to provide consistent and dedicated coverage of potential election-related issues in the run-up to, and during, the Romanian Presidential Election.
(III) Countering misinformation
Our misinformation moderators receive enhanced training and tools to detect and remove misinformation and other violative content. We also have teams on the ground who partner with experts to ensure local context and nuance are reflected in our approach. We also integrated the most recent insights from our expert partners into our policies and guidelines on misinformation and impersonation. We removed more than 5,500 pieces of election-related content in Romania for violating our policies on misinformation, harassment, and hate speech between March and May 2025.
(IV) Fact-checking
Our global fact-checking programme is a critical part of our layered approach to detecting harmful misinformation in the context of elections. The core objective of the fact-checking programme is to leverage the expertise of external fact-checking organisations to help assess the accuracy of potentially harmful claims that are difficult to verify. TikTok collaborates with 12 fact-checking organisations across Europe to evaluate the accuracy of content in most European languages, including Romanian. LeadStories, a verified member of the International Fact-Checking Network and the European Fact-Checking Standards Network, serves as the fact-checking partner for Romania and provided coverage for the platform, including across weekends.