CEE Digital Democracy Watch

Report March 2026

We are a Warsaw-based non-profit championing responsible online discourse and advocating for a democratic future where regulation and free expression go hand in hand.

Crisis and Elections Response

Elections 2025

[Note: Signatories are requested to provide information relevant to their particular response to the threats and challenges they observed on their service(s). They ensure that the information below provides an accurate and complete report of their relevant actions. As operational responses to crisis/election situations can vary from service to service, an absence of information should not be considered a priori a shortfall in the way a particular service has responded. Impact metrics are accurate to the best of signatories’ abilities to measure them].

Threats observed or anticipated

In the run-up to the Polish presidential election, one of the central concerns was the absence of a Digital Services Coordinator (DSC) in Poland. This meant that enforcement of DSA obligations for Very Large Online Platforms (VLOPs) relied solely on the European Commission, without a national authority empowered to monitor platform compliance or provide rapid response during campaigns.

Experience from previous Polish electoral cycles highlighted the systemic risks linked to online campaigning: reports documented the use of opaque political advertising, including influencer partnerships and microtargeted content, with limited transparency about funding sources.

Given its proximity to Russia's aggression against Ukraine, Poland became a target of sustained information operations. EU and NATO threat assessments have repeatedly identified Poland as one of the main theatres of Russian disinformation in Europe.

Concerns about low content moderation capacity in local languages were also a risk factor. While major platforms declare the presence of moderators for Polish-language content, transparency reports and expert analyses reveal that actual human moderation resources remain limited compared to the volume of content produced. 

During our 2025 work, we identified further emerging risks to election integrity, such as the mass use of generative AI and low-quality live broadcasts.

Mitigations in place

CEE DDW submitted situational reports regarding the lack of transparency in X’s political advertising system, and the insufficient clarity of ad labelling by Meta. Our organisation submitted detailed feedback on two technical documents implementing the Regulation on the Targeting and Transparency of Political Advertising.

Our recommendations focused on ensuring that labels remain unchanged between account verification and ad broadcast, stressed that public labels must consistently present the original verified data, and called for automatic shutdown mechanisms for restricted ads during election periods.

Our organisation engaged in public advocacy for the implementation of the Regulation on the Targeting and Transparency of Political Advertising into national legal systems, and monitored potential infringements of platform policies across the Central and Eastern European region.

CEE DDW also launched the "Like, Share, Vote" policy report, which calls for greater harmonisation and transparency in the political influencer sector. The findings were presented at launch events in Warsaw and Brussels and during a Media Literacy Expert Group session.

In 2025, CEE DDW monitored political advertising spending and labelling on the VLOPs, particularly in the Central and Eastern European region.

CEE DDW actively researched emerging digital threats to democracy, such as the use of generative AI and low-quality live broadcasts. Our team monitored the behaviour of online actors in the wake of drone incidents and sabotage attempts across the EU. This work is reflected in our awareness-building and reporting, such as "The Hidden Power of TikTok Live", which examined extreme narratives and their potential monetisation through TikTok's Live function.

Crisis 2025

Threats observed or anticipated

Our team actively monitored the behaviour of online actors in the wake of drone incidents across the EU and sabotage attempts.

Mitigations in place