TikTok

Report September 2025

TikTok’s mission is to inspire creativity and bring joy. With a global community of more than a billion users, it’s natural for people to hold different opinions. That’s why we focus on a shared set of facts when it comes to issues that affect people’s safety. A safe, authentic, and trustworthy experience is essential to achieving our goals. Transparency plays a key role in building that trust, allowing online communities and society to assess how TikTok meets its regulatory obligations. As a signatory to the Code of Conduct on Disinformation (the Code), TikTok is committed to sharing clear insights into the actions we take.

TikTok takes disinformation extremely seriously. We are committed to preventing its spread, promoting authoritative information, and supporting media literacy initiatives that strengthen community resilience.

We prioritise proactive content moderation, with the vast majority of violative content removed before it is viewed or reported. In H1 2025, more than 97% of videos violating our Integrity and Authenticity policies were removed proactively worldwide.

We continue to address emerging behaviours and risks through our Digital Services Act (DSA) compliance programme, under which the Code has operated since July 2025. This includes a range of measures to protect users, detailed on our European Online Safety Hub. Our actions under the Code demonstrate TikTok’s strong commitment to combating disinformation while ensuring transparency and accountability to our community and regulators.

Our full executive summary can be read by downloading our report using the link below.

Download PDF

Commitment 28
COOPERATION WITH RESEARCHERS Relevant Signatories commit to support good faith research into Disinformation that involves their services.
We signed up to the following measures of this commitment
Measure 28.1 Measure 28.2 Measure 28.3 Measure 28.4
In line with this commitment, did you deploy new implementation measures (e.g. changes to your terms of service, new tools, new policies, etc)?
Yes
If yes, list these implementation measures here
  • Supported new independent research through TikTok's Research Tools (the Research API and Virtual Compute Environment (VCE)); an illustrative Research API query is sketched after this list.
  • Enriched the available data to include more information on stickers and effects (January) and video tags (April), and reached full parity in the data available across the Research API and VCE (May).
  • Added further functionality to the Research API, including a compliance API (launched in June) that improves the data refresh process, helping to ensure that compliance with our Terms of Service (ToS) does not impede researchers' ability to efficiently access data from TikTok's Research API.
  • Continued to make the Commercial Content API available in Europe to bring transparency to paid advertising, advertisers, and other commercial content on TikTok.
  • Continued to offer our Commercial Content Library, a publicly searchable EU ads database with information about paid ads and ad metadata, such as the ad creative, the dates during which the ad was active, the main parameters used for targeting (e.g. age, gender), and the number of people who were served the ad.
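
For illustration, the minimal sketch below shows how a researcher approved for Research API access might authenticate and query public video metadata. It is not an official TikTok sample: the endpoint paths, field names, and query operators shown reflect the publicly documented API as we understand it and should be verified against the current TikTok for Developers documentation.

```python
# Illustrative sketch only (not an official TikTok sample): endpoint paths,
# field names, and query operators should be checked against the current
# TikTok for Developers / Research API documentation before use.
import requests

CLIENT_KEY = "YOUR_CLIENT_KEY"        # issued once Research API access is approved
CLIENT_SECRET = "YOUR_CLIENT_SECRET"

# 1. Obtain a client-credentials access token.
token_resp = requests.post(
    "https://open.tiktokapis.com/v2/oauth/token/",
    headers={"Content-Type": "application/x-www-form-urlencoded"},
    data={
        "client_key": CLIENT_KEY,
        "client_secret": CLIENT_SECRET,
        "grant_type": "client_credentials",
    },
)
access_token = token_resp.json()["access_token"]

# 2. Query public video metadata for a keyword in selected EU markets
#    over a one-month window (dates use the YYYYMMDD format).
query_resp = requests.post(
    "https://open.tiktokapis.com/v2/research/video/query/",
    params={"fields": "id,video_description,create_time,region_code,hashtag_names"},
    headers={"Authorization": f"Bearer {access_token}"},
    json={
        "query": {
            "and": [
                {"operation": "IN", "field_name": "region_code", "field_values": ["DE", "FR"]},
                {"operation": "EQ", "field_name": "keyword", "field_values": ["election"]},
            ]
        },
        "start_date": "20250101",
        "end_date": "20250131",
        "max_count": 100,
    },
)
print(query_resp.json())
```

Responses to queries of this kind are typically paginated; per the documentation, cursor-style fields returned in the response body can be passed back in follow-up requests to retrieve further results.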

Do you plan to put further implementation measures in place in the next 6 months to substantially improve the maturity of the implementation of this commitment?
N/A
If yes, which further implementation measures do you plan to put in place in the next 6 months?
N/A
Measure 28.1
Relevant Signatories will ensure they have the appropriate human resources in place in order to facilitate research, and should set-up and maintain an open dialogue with researchers to keep track of the types of data that are likely to be in demand for research and to help researchers find relevant contact points in their organisations.
QRE 28.1.1
Relevant Signatories will describe the resources and processes they deploy to facilitate research and engage with the research community, including e.g. dedicated teams, tools, help centres, programs, or events.
TikTok is committed to facilitating research and engaging with the research community.

As set out above, TikTok is committed to facilitating research through our Research Tools, Commercial Content APIs and Commercial Content Library, full details of which are available on our TikTok for Developers and Commercial Content Library websites.

We have many teams and individuals across product, policy, data science, outreach and legal working to facilitate research. We believe transparency and accountability are essential to fostering trust with our community. We are committed to transparency in how we operate, moderate and recommend content, empower users, and secure our platform. That's why we opened our global Transparency and Accountability Centers (TACs) for invited guests to see first-hand our work to protect the safety and security of the TikTok platform.

Our TACs are located in Dublin, Los Angeles, Singapore, and Washington, DC. They provide an opportunity for invited academics, businesses, policymakers, politicians, regulators, researchers and many other expert audiences from Europe and around the world to see first-hand how teams at TikTok go about the critically important work of securing our community's safety, data, and privacy. During the reporting period, DubTAC hosted 24 external tours, welcoming over 180 visitors. Notable attendees included Ofcom, the European Commission, and representatives of the Irish Parliament and of the French, Danish, German, and UAE governments. We also welcomed mental health organisations and brand clients, including Coca-Cola and Zalando. In March, we launched Mobile TAC in Brussels during Global Marketing Week and delivered 5 Mobile TAC tours across the EU.

We work closely with our ten regional Advisory Councils, including our European Safety Advisory Council and US Content Advisory Council, and our global Youth Advisory Council, which bring together a diverse array of independent experts from academia and civil society as well as youth perspectives. Advisory Council members provide subject matter expertise and advice on issues relating to user safety, content policy, and emerging issues that affect TikTok and our community, including in the development of our AI-generated content label and a recent campaign to raise awareness around AI labeling and potentially misleading AIGC. These councils are an important way to bring outside perspectives into our company and onto our platform.

In addition to these efforts, we engage with the research community in many other ways in the course of our work.

Our Outreach & Partnerships Management (OPM) Team is dedicated to establishing partnerships and regularly engaging with civil society stakeholders and external experts, including the academic and research community, to ensure their perspectives inform our policy creation, feature development, risk mitigation, and safety strategies. For example, we engaged with global experts, including numerous academics in Europe, in the development of our state-affiliated media policy, Election Misinformation policies, and AI-generated content labels. OPM also plays an important role in our efforts to counter misinformation by identifying, onboarding and managing new partners to our fact-checking programme. In the lead-up to certain elections, we invite suitably qualified external local/regional experts, as part of our Election Speaker Series.Sharing their market expertise with our internal teams provides us with insights to better understand areas that could potentially amount to election manipulation, and informs our approach to the upcoming election.

During this reporting period, we ran 7 Election Speaker Series sessions, 3 in EU Member States and 4 in Albania, Belarus, Greenland, and Kosovo. 
  1. Albania: Internews Kosova (Kallxo)
  2. Belarus: Belarusian Investigative Center
  3. Germany: Deutsche Presse-Agentur (dpa)
  4. Greenland: Logically Facts
  5. Kosovo: Internews Kosova (Kallxo)
  6. Poland: Demagog
  7. Portugal: Poligrafo

TikTok teams and personnel also regularly participate in research-focused events. In H1 2025, we presented at the Political Tech Summit in Berlin (January), hosted Research Tools demos in Warsaw (April), presented at the GNET Annual Conference (May), hosted Research Tools demos in Prague (June), presented at the Warsaw Women in Tech Summit (June), briefed a small group of Irish academic researchers (June), and attended the ICWSM conference in Copenhagen (June).

At the end of June 2025, we sent a 14-strong delegation to GlobalFact12 in Rio de Janeiro, Brazil, where TikTok was a top-tier sponsor. Sponsorship funding supports the IFCN's work serving the fact-checking community and, by providing travel scholarships, makes it possible for fact-checking organizations to attend the conference. The annual conference is the most important industry event for TikTok's Global Fact-Checking Program and covers a broad set of topics related to mis- and disinformation, discussed in main-stage sessions and break-out rooms. In addition to a breakout session on Footnotes, TikTok hosted a networking event with more than 80 people from our partner organizations, including staff from fact-checking partners, media literacy organizations, and TikTok's Safety Advisory Councils.

As well as providing opportunities to share context about our approach and research interests, and to explore collaboration, these events enable us to learn from the important work being done by the research community on a range of topics, including aspects related to harmful misinformation.