TikTok

Report September 2025

TikTok’s mission is to inspire creativity and bring joy. With a global community of more than a billion users, it’s natural for people to hold different opinions. That’s why we focus on a shared set of facts when it comes to issues that affect people’s safety. A safe, authentic, and trustworthy experience is essential to achieving our goals. Transparency plays a key role in building that trust, allowing online communities and society to assess how TikTok meets its regulatory obligations. As a signatory to the Code of Conduct on Disinformation (the Code), TikTok is committed to sharing clear insights into the actions we take.

TikTok takes disinformation extremely seriously. We are committed to preventing its spread, promoting authoritative information, and supporting media literacy initiatives that strengthen community resilience.

We prioritise proactive content moderation, with the vast majority of violative content removed before it is viewed or reported. In H1 2025, more than 97% of videos violating our Integrity and Authenticity policies were removed proactively worldwide.

We continue to address emerging behaviours and risks through our Digital Services Act (DSA) compliance programme, under which the Code has operated since July 2025. This includes a range of measures to protect users, detailed on our European Online Safety Hub. Our actions under the Code demonstrate TikTok’s strong commitment to combating disinformation while ensuring transparency and accountability to our community and regulators.

Our full executive summary can be read by downloading our report using the link below.

Download PDF

Elections 2025
[Note: Signatories are requested to provide information relevant to their particular response to the threats and challenges they observed on their service(s). They ensure that the information below provides an accurate and complete report of their relevant actions. As operational responses to crisis/election situations can vary from service to service, an absence of information should not be considered a priori a shortfall in the way a particular service has responded. Impact metrics are accurate to the best of signatories’ abilities to measure them].
Threats observed or anticipated
Polish Elections:
The 2025 Polish Presidential Election was a high-risk election with significant potential for negative exposure. The first round took place on 18 May, the run-off was held on 1 June, and official results were announced on 2 June. Because of the election's significance for Poland's domestic policies and international relations, we activated our Mission Control Centre (MCC) in advance of the election, which allowed us to identify and contain threats early and quickly. Regulators publicly praised TikTok’s collaboration, and national media highlighted TikTok’s "more ambitious" safety posture compared to rival platforms. Some examples of the violative content we successfully disrupted include:
  • Content removals: We proactively removed more than 3,300 pieces of election-related content in Poland for violating our policies on synthetic and manipulated media, misinformation, and civic and election integrity.
  • Covert influence disruption: We removed three new domestic covert influence operation (CIO) networks (totaling 77 accounts and 36,419 followers) that were identified as specifically targeting a Polish audience and manipulating election discourse using fake news accounts and personas. More information relating to these network disruptions is published in our dedicated Covert Influence Operations reports.

German Elections:
We have comprehensive measures in place to anticipate and address the risks associated with electoral processes, including the risks associated with election misinformation in the context of the German federal election held on 23 February 2025. In advance of the election, a core election team was formed and consultations between cross-functional teams helped to identify and design response strategies.

TikTok did not observe major threats during the German election. Some examples of the violative content we successfully disrupted in Germany during January 2025:
  • We removed more than 862,000 pieces of content for violating our Community Guidelines, which includes our policies on civic and election integrity and misinformation.
  • We also removed 712 accounts for impersonating German election candidates and elected officials.
  • We proactively prevented more than 24 million fake likes and more than 18.9 million fake follow requests. We also blocked more than 293,000 spam accounts from being created.
  • We also removed more than 700,000 fake accounts, more than 17 million fake likes, and more than 5.7 million fake followers.

Portuguese Elections:
We have comprehensive measures in place to anticipate and address the risks associated with electoral processes, including the risks associated with election misinformation in the context of the Portuguese legislative election held on 18 May 2025. In advance of the election, a core election team was formed and consultations between cross-functional teams helped to identify and design response strategies.

TikTok did not observe major threats during the Portuguese election. Throughout the election, we monitored for and actioned inauthentic behaviour, and removed content that violated our Community Guidelines. As part of these efforts:
  • Between 12 and 25 May, we removed more than 300 pieces of content for violating our policies on civic and election integrity, misinformation, and AI-generated content. We removed more than 94% of this content before it was reported to us.
  • Between 12 and 25 May, we proactively prevented more than 1,800,000 fake likes and more than 671,000 fake follow requests, and blocked more than 5,400 spam accounts from being created in Portugal. We also removed more than 5,400 fake accounts, more than 880,000 fake likes, and more than 154,000 fake followers.
  • Between 15 and 29 May, we also removed 28 accounts for impersonating Portuguese election candidates and elected officials.

Romanian Election:
As co-chair of the Code of Conduct on Disinformation's Working Group on elections, we take our role in protecting the integrity of elections on our platform very seriously. We have comprehensive measures in place to anticipate and address the risks associated with electoral processes, including the risks associated with election misinformation in the context of the Romanian Presidential Election, which took place on 4 May 2025, with a second round on 18 May 2025. Following the unprecedented annulment of the 2024 results, this was one of the most closely monitored electoral cycles for TikTok to date.

From March to May 2025, we deployed robust detection models, automated moderation, and local partnerships to safeguard our Romanian user base of over 8 million. The following are examples of some of the threats we observed in relation to both election rounds:

  • Covert influence disruption: We removed two new domestic covert influence networks (totaling 87 accounts and 33,296 followers) in April 2025 for manipulating election discourse using fake news accounts and personas. More information relating to these network disruptions is published on our dedicated Covert Influence Operations transparency page.
  • Content removals: We removed over 13,100 pieces of election-related content in Romania for violating our policies on misinformation, civic integrity, and synthetic media - over 93% were taken down before any user report.
  • We received 57 submissions through the COCD Rapid Response System in relation to the Romanian Presidential Election, which were rapidly addressed. Actions included banning or geo-blocking of accounts and content removals for violation of Community Guidelines.

(V) Deterring covert influence operations 

We prohibit covert influence operations and remain constantly vigilant against attempts to use deceptive behaviours and manipulate our platform. We proactively seek and continuously investigate leads for potential influence operations. We're also working with government authorities and encourage them to share any intelligence so that we can work together to ensure election integrity. More detail on our policy against covert influence operations is published on our website.
 
(VI) Tackling misleading AI-generated content 

Creators are required to label any realistic AI-generated content (AIGC), and we have an AI-generated content label to help people do this. TikTok has an ‘Edited Media and AI-Generated Content (AIGC)’ policy, which prohibits AIGC showing fake authoritative sources or crisis events, or falsely showing public figures in certain contexts, including being bullied, making an endorsement, or being endorsed.

(VII) Government, Politician, and Political Party Accounts (GPPPAs)

We classify presidential candidate accounts as a Government, Politician, and Political Party Account (GPPPA). We then apply designated policies to GPPPAs to ensure the right experience, given their important role in civic processes. This includes disabling monetisation features.

We strongly recommend that GPPPAs be verified. Verified badges help users make informed choices about the accounts they choose to follow. It is also an easy way for notable figures to let users know they’re seeing authentic content, and it helps to build trust among high-profile accounts and their followers.

In advance of the elections, TikTok’s Government Relations (GR) team organised dedicated sessions with every political group in Romania to inform them about our policies and to educate political actors about safety measures. TikTok also requested a list of candidates from the Romanian authorities to ensure the GPPPA label could be correctly applied where relevant.
 
Directing people to trusted sources

(I) Investing in media literacy

We invest in media literacy campaigns as a counter-misinformation strategy. TikTok has partnered with the local NGO Funky Citizens in Romania to help the community safely navigate the platform and protect themselves against potential misinformation during the election. Funky Citizens developed a series of educational videos explaining how users could identify and avoid misinformation, use TikTok’s safety features, and critically evaluate content related to the electoral process. The Romanian community could find the video series with practical advice and useful information about the electoral process on Funky Citizens' official TikTok account and the in-app Election Center dedicated to Romania’s elections. These videos were viewed over 45 million times between March 2024 and February 2025.

External engagement at the national and EU levels

(I) Rapid Response System: external collaboration with COCD Signatories

The COCD Rapid Response System (RRS) was utilised to exchange information among civil society organisations, fact-checkers, and online platforms. TikTok received 57 notifications through the RRS in relation to the Romanian Election, which were addressed and actioned; enforcement included the banning or geo-blocking of accounts and content removals for violations of our Community Guidelines.


(II) Engagement with local experts

To further promote election integrity, and inform our approach to the Romanian Presidential Election, we organised an Election Speaker Series with Funky Citizens who shared their insights and market expertise with our internal teams. 

(III) Engagement with national authorities pre-election

Our GR team proactively organised an election-dedicated meeting on 7 February 2025 with ANCOM, the Permanent Electoral Authority, and the Ministry of Research, Innovation and Digitalization to establish points of contact before the elections and to offer access to our reporting tools, including the Romanian Election Center. On 27 February 2025, we held an online meeting with ANCOM and the Permanent Electoral Authority (Autoritatea Electorală Permanentă) on new Romanian regulation.

On 3 March 2025, we participated in an ANCOM roundtable in Bucharest, as well as a series of meetings including an in-person tabletop exercise on the Romanian election.

In the run-up to the 2025 election, and during the election period, we continued to engage with ANCOM and promptly responded to ongoing questions and correspondence.


Mitigations in place
Polish Elections:
(I) Moderation capabilities

We have thousands of trust and safety professionals dedicated to keeping our platform safe. As they usually do, our teams worked alongside technology to ensure that we were consistently enforcing our rules to detect and remove misinformation, covert influence operations, and other content and behaviour that can increase during an election period. In advance of the election we had proactive data monitoring, trend detection and regular monitoring of enriched keywords and accounts.

(II) Mission Control Centre: internal cross-functional collaboration

TikTok established a Mission Control Centre (MCC) in advance of the election, developed risk scenario mapping (covering focused Russian influence operations, AI-generated content (AIGC), misinformation/disinformation, scaled inauthentic behaviour, and hate speech surges), and implemented regular content trend clustering with a rolling containment, correction, and prevention cycle covering key features. As a result, all identified threats were contained or mitigated early, with no credible or substantiated election interference claims emerging.

(III) Countering misinformation

Our misinformation moderators receive enhanced training and tools to detect and remove misinformation and other violative content. We also have teams on the ground who partner with experts to ensure local context and nuance is reflected in our approach.

In the weeks leading up to and including the run-off, we removed 530 videos for violating our civic and election integrity policies, and 2,772 videos for violating our misinformation policies.

(IV) Fact-checking

Our global fact-checking programme is a critical part of our layered approach to detecting harmful misinformation in the context of elections. The core objective of the fact-checking program is to leverage the expertise of external fact-checking organisations to help assess the accuracy of potentially harmful claims that are difficult to verify.

Within Europe, we partner with 12 fact-checking organisations that provide fact-checking coverage in 25 languages (22 official EU languages plus Russian, Ukrainian and Turkish). Demagog serves as the fact-checking partner for Poland, providing coverage for the platform.

(V) Deterring covert influence operations

We prohibit covert influence operations and remain constantly vigilant against attempts to use deceptive behaviours and manipulate our platform. We proactively seek and continuously investigate leads for potential influence operations. We're also working with government authorities and encourage them to share any intelligence so that we can work together to ensure election integrity. More detail on our policy against covert influence operations is published on our website.

(VI) Tackling misleading AI-generated content

Creators are required to label any realistic AI-generated content (AIGC), and we have an AI-generated content label to help people do this. TikTok has an ‘Edited Media and AI-Generated Content (AIGC)’ policy, which prohibits AIGC showing fake authoritative sources or crisis events, or falsely showing public figures in certain contexts, including being bullied, making an endorsement, or being endorsed.

(VII) Government, Politician, and Political Party Accounts (GPPPAs)

Many political leaders, ministers, and political parties have a presence on TikTok. These politicians and parties play an important role on our platform: we believe that verified accounts belonging to politicians and institutions provide the electorate with another route to access their representatives, and additional trusted voices in the shared fight against misinformation.

We strongly recommend GPPPAs have their accounts verified by TikTok. Verified badges help users make informed choices about the accounts they choose to follow. It is also an easy way for notable figures to let users know they’re seeing authentic content, and it helps to build trust among high-profile accounts and their followers.

Directing people to trusted sources

(I) Investing in media literacy

We invest in media literacy campaigns as a counter-misinformation strategy, working with fact checkers as part of our Election Centre for Poland. TikTok has partnered with Demagog & FakeNews.pl in Poland to help the community safely navigate the platform and protect themselves against potential misinformation during the elections. We also worked with fact checkers to launch an Evergreen Media Literacy Campaign.

External engagement at the national and EU levels

(I) Rapid Response System: external collaboration with COCD Signatories

The COCD Rapid Response System (RRS) was utilised to exchange information among civil society organisations, fact-checkers, and online platforms. TikTok received 23 reports through the RRS before the Polish election, including NASK and DSA cases, which were rapidly addressed. Actions included the banning of accounts and content removals for violations of our Community Guidelines.

(II) Engagement with local experts

To further promote election integrity, and inform our approach to the Polish Election, we organised an Election Speaker Series with Demagog who shared their insights and market expertise with our internal teams.

German Elections:

Enforcing our policies

(I) Moderation capabilities

We have thousands of trust and safety professionals dedicated to keeping our platform safe. As they usually do, our teams worked alongside technology to ensure that we were consistently enforcing our rules to detect and remove misinformation, covert influence operations, and other content and behaviour that can increase during an election period. In advance of the election, we had proactive data monitoring, trend detection and regular monitoring of enriched keywords and accounts.

(II) Mission Control Centre: internal cross-functional collaboration


On 18 November 2024, ahead of the German election, we established a dedicated Mission Control Centre (MCC) bringing together employees from multiple specialist teams within our safety department. Through the MCC, our teams were able to provide consistent and dedicated coverage of potential election-related issues in the run-up to, and during, the election.


(III) Countering misinformation


Our misinformation moderators receive enhanced training and tools to detect and remove misinformation and other violative content. We also have teams on the ground who partner with experts to ensure local context and nuance is reflected in our approach.


In January 2025, we removed more than 862,000 pieces of content for violating our Community Guidelines, which includes our policies on civic and election integrity and misinformation.


In the weeks leading up to and including the election, we removed 3,283 videos for violating our civic and election integrity policies, and 12,781 videos for violating our misinformation policies. 


(IV) Fact-checking


Our global fact-checking programme is a critical part of our layered approach to detecting harmful misinformation in the context of elections. The core objective of the fact-checking program is to leverage the expertise of external fact-checking organisations to help assess the accuracy of potentially harmful claims that are difficult to verify.


TikTok collaborates with 12 fact-checking organisations across Europe to evaluate the accuracy of content in most European languages, including German. Deutsche Presse-Agentur (dpa) serves as the fact-checking partner for Germany, providing coverage for the platform.


(V) Deterring covert influence operations


We prohibit covert influence operations and remain constantly vigilant against attempts to use deceptive behaviours and manipulate our platform. We proactively seek and continuously investigate leads for potential influence operations. We're also working with government authorities and encourage them to share any intelligence so that we can work together to ensure election integrity. More detail on our policy against covert influence operations is published on our website, as well as in our monthly Covert Influence Operations reports.


(VI) Tackling misleading AI-generated content


Creators are required to label any realistic AI-generated content (AIGC), and we have an AI-generated content label to help people do this. TikTok has an ‘Edited Media and AI-Generated Content (AIGC)’ policy, which prohibits AIGC showing fake authoritative sources or crisis events, or falsely showing public figures in certain contexts, including being bullied, making an endorsement, or being endorsed.


(VII) Government, Politician, and Political Party Accounts (GPPPAs)


Many political leaders, ministers, and political parties have a presence on TikTok. These politicians and parties play an important role on our platform: we believe that verified accounts belonging to politicians and institutions provide the electorate with another route to access their representatives, and additional trusted voices in the shared fight against misinformation.


We strongly recommend GPPPAs have their accounts verified by TikTok. Verified badges help users make informed choices about the accounts they choose to follow. It is also an easy way for notable figures to let users know they’re seeing authentic content, and it helps to build trust among high-profile accounts and their followers.


Before the German election, we provided all parties represented in federal and state parliaments with written information about our election integrity policies and measures, and offered virtual information sessions for the parties and their candidates. We presented at a security-focused webinar for candidates and parties organised by the Federal Office for Information Security (BSI). We also offered all parties represented in federal and state parliaments verification support for their candidates.


Directing people to trusted sources


(I) Investing in media literacy


We invest in media literacy campaigns as a counter-misinformation strategy. From 16 Dec 2024 to 3 Mar 2025, we launched an in-app Election Centre to provide users with up-to-date information about the 2025 German federal election. The centre contained a section about spotting misinformation, which included videos created in partnership with the fact-checking organisation Deutsche Presse-Agentur (dpa). The Election Center was visited more than 5.7 million times.


External engagement at the national and EU levels


(I) Rapid Response System: external collaboration with COCD Signatories


The COCD Rapid Response System (RRS) was utilised to exchange information among civil society organisations, fact-checkers, and online platforms. TikTok received four reports through the RRS before the German election, which were rapidly addressed. Actions included the banning of accounts and content removals for violations of our Community Guidelines.


(II) Engagement with local experts


To further promote election integrity, and to inform our approach to the German election, we organised an Election Speaker Series with dpa, who shared their insights and market expertise with our internal teams.


(III) Engagement with national authorities and stakeholders


We participated in the two election roundtables hosted by the Federal Ministry of the Interior (BMI), one before and one after the election.


We participated in the election roundtable as well as the stress test hosted by the Federal Network Agency (BNetzA), the German Digital Services Coordinator (DSC). In addition, we held three separate virtual meetings between TikTok and the BNetzA, also attended by the European Commission, and answered a set of written questions.


We met with the domestic intelligence service (BfV) and the BMI state secretary.


We attended two election-focused virtual meetings with BzKJ (Federal Agency for Child and Youth Protection) and other platforms.


We engaged with the electoral commissioner ("Bundeswahlleiterin") and onboarded them to TikTok. In our election center, we included 2 videos from the electoral commissioner and linked to their website.


We provided all parties represented in federal and state parliaments with written information about our election integrity measures and about what they and their candidates can and cannot do on the platform, and offered virtual information sessions for the parties and their candidates. We also offered all parties represented in federal and state parliaments verification support for their candidates.


We presented a security-focused webinar for candidates and parties organised by the Federal Office for Information Security (BSI).

Portuguese Elections:

(I) Moderation capabilities


We have thousands of trust and safety professionals dedicated to keeping our platform safe. As they usually do, our teams worked alongside technology to ensure that we were consistently enforcing our rules to detect and remove misinformation, covert influence operations, and other content and behaviour that can increase during an election period. In advance of the election, we had proactive data monitoring, trend detection, and regular monitoring of enriched keywords and accounts.


(II) Mission Control Centre: internal cross-functional collaboration


On 13 May, ahead of the Portuguese election, we established a dedicated Mission Control Centre (MCC) bringing together employees from multiple specialist teams within our safety department. Through the MCC, our teams were able to provide consistent and dedicated coverage of potential election-related issues in the run-up to, and during, the election.


(III) Countering misinformation

Our misinformation moderators receive enhanced training and tools to detect and remove misinformation and other violative content. We also have teams on the ground who partner with experts to ensure local context and nuance is reflected in our approach.

In the weeks leading up to and including the election (21 April to 18 May), we removed 821 pieces of content for violating our policies on civic and election integrity, misinformation, and AI-generated content. In the same period, we removed over 99% of violative misinformation content before it was reported to us.


(IV) Fact-checking


Our global fact-checking programme is a critical part of our layered approach to detecting harmful misinformation in the context of elections. The core objective of the fact-checking program is to leverage the expertise of external fact-checking organisations to help assess the accuracy of potentially harmful claims that are difficult to verify.


TikTok collaborates with 12 fact-checking organisations across Europe to evaluate the accuracy of content in most European languages, including Portuguese. Poligrafo serves as the fact-checking partner for Portugal, providing coverage for the platform.


(V) Deterring covert influence operations


We prohibit covert influence operations and remain constantly vigilant against attempts to use deceptive behaviours and manipulate our platform. We proactively seek and continuously investigate leads for potential influence operations. We're also working with government authorities and encourage them to share any intelligence so that we can work together to ensure election integrity. More detail on our policy against covert influence operations is published on our website, as well as in our monthly Covert Influence Operations reports.


(VI) Tackling misleading AI-generated content


Creators are required to label any realistic AI-generated content (AIGC), and we have an AI-generated content label to help people do this. TikTok has an ‘Edited Media and AI-Generated Content (AIGC)’ policy, which prohibits AIGC showing fake authoritative sources or crisis events, or falsely showing public figures in certain contexts, including being bullied, making an endorsement, or being endorsed.


(VII) Government, Politician, and Political Party Accounts (GPPPAs)


Many political leaders, ministers, and political parties have a presence on TikTok. These politicians and parties play an important role on our platform: we believe that verified accounts belonging to politicians and institutions provide the electorate with another route to access their representatives, and additional trusted voices in the shared fight against misinformation.


We strongly recommend GPPPAs have their accounts verified by TikTok. Verified badges help users make informed choices about the accounts they choose to follow. It is also an easy way for notable figures to let users know they’re seeing authentic content, and it helps to build trust among high-profile accounts and their followers.


Before the election we met with the main Portuguese regulatory bodies and political parties' Heads of Communication to (i) provide an overview of TikTok's policies for political accounts, (ii) outline TikTok's approach to election integrity and to data security, (iii) encourage account verification and (iv) enable direct contact to respond to their specific requests.


Directing people to trusted sources


(I) Investing in media literacy


We invest in media literacy campaigns as a counter-misinformation strategy. From 18 April 2025 to 2 June 2025, we ran an in-app Election Centre to provide users with up-to-date information about the 2025 Portuguese legislative election. The centre contained a section about spotting misinformation, which included videos created in partnership with our fact-checking partner Poligrafo. TikTok partnered with Poligrafo in Portugal to help the community safely navigate the platform and protect themselves against potential misinformation during the elections. Poligrafo developed a series of educational videos explaining how users could identify and avoid misinformation, use TikTok’s safety features, and critically evaluate content related to the electoral process. The Portuguese community could find the video series, with practical advice and useful information about the electoral process, in the relevant Election Centre.


External engagement at the national and EU levels


(I) Rapid Response System: external collaboration with COCD Signatories


The COCD Rapid Response System (RRS) was utilised to exchange information among civil society organisations, fact-checkers, and online platforms. TikTok received one report through the RRS during the Portuguese election, which was quickly addressed and resulted in the reported content being deemed ineligible for the For You feed ("FYF Ineligible").


(II) Engagement with local experts


To further promote election integrity, and inform our approach to the Portuguese election, we organised an Election Speaker Series with Poligrafo who shared their insights and market expertise with our internal teams.


(III) Engagement with national authorities and stakeholders


Ahead of the election, our Government Relations team represented TikTok at an official meeting organised by ANACOM with the Portuguese Regulatory Authority for the Media (ERC) and the National Election Commission (CNE). The team also met with the Organization for Security and Cooperation in Europe’s Office for Democratic Institutions and Human Rights (OSCE/ODIHR) and, in particular, its Election Expert Team (EET) deployed for these elections.

As previously referenced, we also met with Portuguese political parties’ Heads of Communication to (i) provide an overview of TikTok's policies for political accounts, (ii) outline TikTok's approach to election integrity and to data security, (iii) encourage account verification and (iv) enable direct contact to respond to their specific requests.

Romanian Elections:
(I) Moderation capabilities
We supported the 2025 Romanian elections by preparing moderators, updating policies, and escalating content related to hateful organisations in time for both election rounds. Our teams worked alongside technology to ensure that we consistently enforced our rules to detect and remove misinformation, covert influence operations, and other content and behaviour that can increase during an election period. We continue to prioritise and enhance TikTok's automated moderation technology, as such technology enables faster and more consistent removal of content that violates our rules. We invest in technologies that improve content understanding and predict potential risks so that we can take action on violative content before it's viewed.
We have thousands of trust and safety professionals dedicated to keeping our platform safe. We have 95 Romanian-speaking moderators, the largest such team among digital platforms in the country, both in absolute terms and relative to the number of users. We increased resources on our Romanian elections task force by adding more than 120 subject matter experts across multiple teams, including Deceptive Behaviour (which includes Covert Influence Operations analysts), Security, and Ads Integrity.


(II) Mission Control Centre: internal cross-functional collaboration  

In advance of the official campaign period for the Romanian Presidential Election, we established a dedicated Mission Control Centre (MCC), including employees from multiple specialist teams within our safety department. Through the MCC, our teams were able to provide consistent and dedicated coverage of potential election-related issues in the run-up to, and during, the Romanian Presidential Election.

(III) Countering misinformation

Our misinformation moderators receive enhanced training and tools to detect and remove misinformation and other violative content. We also have teams on the ground who partner with experts to ensure local context and nuance are reflected in our approach. We also integrated the most recent insights from our expert partners into our policies and guidelines on misinformation and impersonation. We removed more than 5,500 pieces of election-related content in Romania for violating our policies on misinformation, harassment, and hate speech between March and May 2025. 

(IV) Fact-checking 

Our global fact-checking programme is a critical part of our layered approach to detecting harmful misinformation in the context of elections. The core objective of the fact-checking programme is to leverage the expertise of external fact-checking organisations to help assess the accuracy of potentially harmful claims that are difficult to verify. TikTok collaborates with 12 fact-checking organisations across Europe to evaluate the accuracy of content in most European languages, including Romanian. LeadStories, which is a verified member of the International Fact-Checking Network and the European Fact-Checking Standards Network, serves as the fact-checking partner for Romania, providing coverage for the platform, including across weekends.

 
Policies and Terms and Conditions
Outline any changes to your policies
N/A
Policy - 50.1.1
N/A
Changes (such as newly introduced policies, edits, adaptation in scope or implementation) - 50.1.2
N/A