Democracy Reporting International

Report March 2026


Executive summary 

Democracy Reporting International's (DRI) Digital Democracy Programme Unit monitors threats to information integrity during political events and electoral periods across Europe and beyond. Our Digital Democracy team conducts social media monitoring, audits AI-powered chatbots for their impact on online political discourse, and formulates policy recommendations for stakeholders across the technology and society ecosystem, including lawmakers, tech platforms, and civil society organisations. Throughout 2025, we continued our active work as signatories of the Code, including by co-chairing the Taskforce on Elections. 

Exposed widespread inauthentic behaviour on TikTok:  DRI continued its research into murky accounts during the reporting period, with a focus on elections in Germany, Poland, and Romania. In 2025, using authenticity indicators and account-level metrics, we identified and reported 482 inauthentic accounts through the Rapid Response System, documenting how these accounts impersonate political actors, amplify partisan messaging, and distort perceptions of political support. Our findings show that murky accounts remain a persistent problem on platforms, particularly during elections, and reinforce the need for stronger detection and enforcement against inauthentic behaviour. 

Advocated for data access for civil society researchers: DRI continued its work on platform transparency and accountability by examining the nature and limits of researcher access to platform data under Article 40 of the DSA. During the reporting period, we undertook litigation against X, published analyses and opinion pieces on barriers to meaningful access, engaged with more than 88 stakeholders through meetings and webinars to raise awareness, and shared practical lessons from using platform tools and pursuing legal remedies. This work aimed to clarify how existing access mechanisms function in practice and to support stronger, more consistent implementation of researcher data access. 

Generated evidence about social media’s impact on elections and political discourse:  DRI carried out social media monitoring across the Austrian, German, Polish, and Sri Lankan elections, identifying toxic narratives, disinformation risks, and harmful online speech affecting both historically marginalised groups and electoral integrity. Using methods such as keyword-based monitoring, sentiment analysis, and computational analysis of online content, we tracked how divisive narratives, discriminatory rhetoric, and polarising campaign strategies spread across platforms during these electoral periods. 

Raised awareness about AI-related election risks and advocated for the transparency of AI-generated content: DRI audited the most popular chatbots during the German federal elections and identified both inaccurate electoral information and unlabelled generative AI content in political communication. DRI also engaged in policy discussions under the EU Code of Practice on Transparency of AI-Generated Content and provided input to the European Commission on transparency, labelling, and accountability standards for generative AI. 

Increased engagement and knowledge exchange on platform transparency and accountability:  Our work during the reporting period remained closely tied to the Code of Conduct framework through DRI’s co-chairing role in the Elections Working Group and our use of the Rapid Response System. We also convened more than six roundtables, webinars, and conferences with researchers, regulators, civil society groups, and other stakeholders on election integrity, platform accountability, and data access. 


Commitment 16
Relevant Signatories commit to operate channels of exchange between their relevant teams in order to proactively share information about cross-platform influence operations, foreign interference in information space and relevant incidents that emerge on their respective services, with the aim of preventing dissemination and resurgence on other services, in full compliance with privacy legislation and with due consideration for security and human rights risks.
We signed up to the following measures of this commitment
Measure 16.1
In line with this commitment, did you deploy new implementation measures (e.g. changes to your terms of service, new tools, new policies, etc)?
If yes, list these implementation measures here
Do you plan to put further implementation measures in place in the next 6 months to substantially improve the maturity of the implementation of this commitment?
If yes, which further implementation measures do you plan to put in place in the next 6 months?
Measure 16.1
Relevant Signatories will share relevant information about cross-platform information manipulation, foreign interference in information space and incidents that emerge on their respective services for instance via a dedicated sub-group of the permanent Task-force or via existing fora for exchanging such information.
QRE 16.1.1
Relevant Signatories will disclose the fora they use for information sharing as well as information about learnings derived from this sharing.
Participating in and establishing fora for sharing information on the tools, tactics, and narratives deployed by disinformation actors is a key facet of DRI’s Digital Democracy work. The following is a list of working groups, webinars, conferences, and roundtables attended during the reporting period, with DRI in the role of either organiser or presenter:  
 
  • DRI co-chairs the Elections Working Group under the EU Code of Practice on Disinformation (transitioned to Code of Conduct in February 2025). 
    Since June 2023, DRI has served as co-chair alongside Globsec and TikTok in a multi-stakeholder forum of over 100 civil society and platform representatives. 
  • Elections, Algorithms, and Accountability: Digital Platforms and the 2025 German Federal Elections | 25.02.2025. 
    DRI convened a high-level roundtable in Berlin ahead of the German federal elections to examine how digital platforms shape electoral discourse under the Digital Services Act and AI Act. Sixteen policymakers, regulators, researchers, and civil society representatives discussed research findings and advocacy pathways to strengthen DSA enforcement. 
  • Modelling Researcher Access to Data Legislation Workshop | 13.03.2025. 
    Expert workshop hosted by the Ada Lovelace Institute on legal frameworks for researcher data access across the UK, US, and EU. DRI presented research findings and contributed comparative policy perspectives. 
  • 2025 Milton Wolf Seminar on Media and Diplomacy | 09.04.2025. 
    Vienna-based seminar convening academic and policy experts for in-depth discussions on technology, media, and politics. DRI presented research findings on digital democracy and platform governance. 
  • DSA Circle of Friends | 14.04.2025. 
    Network meeting of the DSA Research Network addressing freedom of expression, supervision independence, and enforcement of the risk-based approach. Discussions informed stakeholder coordination on DSA implementation. 
  • Berlin Independent Tech Researchers' Meetup | 13.05.2025. 
    Research professionals’ meetup on the evolving digital democracy research landscape, with a focus on assessing the effectiveness of platform mitigation measures. Insights informed future research planning. 
  • The DSA in Court: What We Learned from Suing X | 10.07.2025. 
    Following the Berlin Regional Court ruling in DRI’s case against X, DRI and Gesellschaft für Freiheitsrechte co-hosted a public webinar on implications for researcher data access rights under Article 40(12) DSA. The discussion addressed litigation strategies, enforcement pathways, and civil society use of legal data access mechanisms. 
  • Retrospective Insights: Election Monitoring Efforts to Preserve Information Integrity | 04.09.2025. 
    DRI convened a roundtable with 28 civil society and academic participants to assess digital democracy developments since 2023 and review findings from six national and European elections. Insights informed a meta-analysis outlining future research and advocacy priorities. 
  • The Independent Tech Researchers' Summit | 16–17.09.2025. 
    Berlin summit of independent researchers addressing collaboration with platforms, safeguards against researcher retaliation, and strategies for securing data access. DRI shared election monitoring findings and data access challenges. 
  • #InfluencersAgainstDisinfo: Empowering Online Opinion Leaders to Enhance Democratic Resilience | 17–19.09.2025. 
    Berlin event hosted by the Aspen Institute bringing together experts and content creators to address digital communication and disinformation resilience. DRI shared social media monitoring insights and data access concerns. 
  • Data Access Days | 25.09.2025. 
    Convening under the DSA40 Collaboratory focused on implementation of the Delegated Act on Data Access. DRI shared operational experiences with platform data access tools and litigation efforts. 
  • TED Webinar: Safeguarding Democracy and Elections in the Age of AI | 01.10.2025. 
    Online webinar examining AI’s dual impact on democratic processes, electoral integrity, and governance risks. DRI contributed examples of platform accountability work and multi-stakeholder collaboration. 
  • DisinfoCon 2025 | 11–12.11.2025. 
    Organised with the Embassy of Canada to Germany and Alliance4Europe, DisinfoCon brought together researchers, journalists, policymakers, and civil society actors to discuss decentralised social media, AI accountability, and disinformation resilience. The event hosted 65 in-person participants in Berlin and 48 online. 
  • DRI Media Coverage | 2025. 
    Our research and advocacy garnered significant attention, with our reports and analyses referenced by leading media outlets such as Politico, Euronews, Reuters, CNN, and many more. This coverage extends the impact of our work, shaping public discourse and informing key stakeholders, including policymakers, civil society, and the broader public, as we continue to drive meaningful conversations on critical issues.