Logically

Report March 2025

Logically is a technology company that delivers powerful Artificial Intelligence and Machine Learning solutions to tackle the toughest modern information challenges. We have developed a suite of products to help government, trust and safety teams, and enterprises gain comprehensive insights into the increasingly complex information landscape. We also operate one of the world’s largest fact-checking teams under our independent subsidiary, Logically Facts, which was created as a separate division in April 2023. 

Our team of award-winning data scientists, engineers, analysts, developers and investigators possess deep domain expertise in the dynamics of misleading content, deceptive behaviour and harmful online narratives. The team is united by the company’s mission to build advanced Artificial Intelligence to give organisations a decisive information advantage in combating threats online. Our OSINT team produces deep-dive investigations and reports on disinformation, including on foreign information manipulation and interference. Logically Facts publishes frequent in-depth fact checks, in addition to more detailed analyses of particular developments. 

Our status as a Signatory to the EU Code of Practice on Disinformation allows us to contribute to the development and post-implementation monitoring of industry-drafted self-regulatory standards to fight threats to information integrity. We have opted into Commitments that are geared towards countering the tactics employed by online threat actors, boosting the impact of fact-checking operations and enhancing media literacy. This report will demonstrate how we adhere to those Commitments. 

Commitment 14: This Commitment asks Signatories to outline the policies they have against the tactics, techniques and procedures (TTPs) employed by actors of disinformation. Logically does not provide a user-to-user service where such TTPs could manifest and so we do not have explicit policies against them. However, while we do not conduct policing actions against malicious actors, we do publish fact-checks and OSINT investigations that spotlight any TTPs employed, thereby providing case studies that can feed evidence-based policies by platforms or governments. The investigations we highlight in this report exemplify our identification of TTPs, such as the increased use of messaging platforms to control narratives and the amplification of content from fringe websites by hostile state actors. 

Commitment 16: This Commitment asks Signatories to provide qualitative examples of cross-platform migration tactics employed by actors of disinformation to circumvent moderation policies, engage different audiences or coordinate action on platforms with less scrutiny. Logically’s case studies demonstrate how different disinformation actors migrated to different platforms depending on the content they were looking to spread. For example, while Telegram and Rumble became new homes from which far-right extremists and Covid-19 conspiracy theorists began disseminating content, climate misinformation began to be spread from blogging sites such as Substack and Medium.

Commitment 17: This Commitment asked Signatories to report on the media literacy activities they undertook throughout the reporting period. Logically Facts partnered with TikTok to provide media literacy training which was accessible in several European countries and the UK ahead of elections. It also conducted a number of other media literacy initiatives in Sweden and the UK.

Commitment 29: This Commitment sought for Signatories to detail their methodologies for tracking and analysing influence operations and disinformation campaigns. In response, Logically cited the investigative methodologies employed in specified case studies, and detailed our work on ethical standards and data governance. We intend to keep our research contributions updated in the Transparency Centre, including via annual reporting under the Code.

Commitment 30: This Commitment asked Signatories to report on actions to facilitate fact-checking organisations’ cross-border collaboration. As a company with a dedicated fact-checking subsidiary, we have cited our involvement in the development of the European Fact-Checking Standards Network (EFCSN), our active cooperation with other organisations on specific fact-checks and subject matters, and our internal structural planning to prioritise such collaborations. We also demonstrated this commitment by participating in important events for the fact-checking community in 2024. 

Commitments 31, 34, 35 and 36: These Commitments sought information on how Signatories are contributing to the development of a repository of fact-checking content, as well as the Transparency Centre. Logically intends to contribute to these as and when we are called upon by the Taskforce. 

Commitment 33: In response to this Commitment to uphold ethical and transparency rules, we have cited our accreditation by the International Fact-Checking Network and our application to the EFCSN. We have also outlined our strict ethics and transparency policies, including our lists of prohibited clients and use cases, as well as the ways that we ensure our independence and non-partisanship. 

Commitment 37: This Commitment asked about the Signatories’ engagement with the Taskforce. Logically has remained steadfast in its engagement with the Taskforce. We continue to be part of four Subgroups, namely on the Empowerment of Fact-Checkers, the Integrity of Services, Generative AI, and on Elections. 

Commitment 38: This Commitment called for Signatories to outline the internal teams dedicated to ensuring compliance with the Code. Logically has indicated the titles of the team members responsible for overseeing compliance, as well as the processes carried out. This included internal cross-functional consultations and reviews of internal documentation and policies. 

2024 was a monumental year in Logically’s growth. We have expanded the scope of our threat detection product, Logically Intelligence®. We developed a new tool, Logically Accelerate, to help fact-checkers and journalists search and analyse short-form videos. We have carried out a number of media literacy initiatives and have increased our resources dedicated to this activity accordingly. As experts in our field, we can help government, trust and safety teams, and enterprises to monitor and mitigate harmful information threats at speed and at scale, and empower the public with accurate information to build societal resilience. We intend to continue expanding and refining these efforts in line with our Commitments under the Code in the next scheduled reporting round.




Commitment 16
Relevant Signatories commit to operate channels of exchange between their relevant teams in order to proactively share information about cross-platform influence operations, foreign interference in information space and relevant incidents that emerge on their respective services, with the aim of preventing dissemination and resurgence on other services, in full compliance with privacy legislation and with due consideration for security and human rights risks.
We signed up to the following measures of this commitment
Measure 16.2
In line with this commitment, did you deploy new implementation measures (e.g. changes to your terms of service, new tools, new policies, etc)?
No
If yes, list these implementation measures here
Do you plan to put further implementation measures in place in the next 6 months to substantially improve the maturity of the implementation of this commitment?
No
If yes, which further implementation measures do you plan to put in place in the next 6 months?
Measure 16.2
Relevant Signatories will pay specific attention to and share information on the tactical migration of known actors of misinformation, disinformation and information manipulation across different platforms as a way to circumvent moderation policies, engage different audiences or coordinate action on platforms with less scrutiny and policy bandwidth.
Logically became a member of the Subgroup on the Integrity of Services in September 2023, and has since attended and contributed to several meetings, including the regular review of the list of TTPs. 
QRE 16.2.1
As a result of the collaboration and information sharing between them, Relevant Signatories will share qualitative examples and case studies of migration tactics employed and advertised by such actors on their platforms as observed by their moderation team and/or external partners from Academia or fact-checking organisations engaged in such monitoring.
Misinformation actors adapt to moderation by shifting platforms, masking their communications (including by writing in languages foreign to themselves), and exploiting policy gaps. This behaviour is highly likely to persist as enforcement evolves and technologies shift. Key examples include:


  • Covid-19 Misinformation. Anti-vaccine groups, facing restrictions on Facebook and YouTube, almost certainly migrated to Telegram and Rumble, using coded language to bypass moderation.

  • QAnon Rebranding. Following their deplatforming in 2020, QAnon affiliates effectively rebranded as child protection advocates and moved to Gab, Parler, and Telegram.

  • 4chan. The forum remains an attractive haven for hostile actors to migrate to once they have engaged with specific online communities. Boards such as /pol/ ("politically incorrect") generate and refine extremist, conspiratorial, or harmful narratives. These coalesce into "ops" (operations), in which users craft misleading or provocative messages and package them as memes, slogans, or fake news to make the narrative more shareable on mainstream social media.

  • State-Backed Disinformation. Russian actors highly likely shifted from Facebook to encrypted apps such as Telegram and to fringe forums, using AI-generated personas. Cross-pollination has been observed between ok.ru, Telegram, and X.

  • Far-Right Extremism. After January 6th, groups such as the Proud Boys migrated to Rumble and Telegram, likely seeking to exploit weak moderation.

  • Climate Misinformation. Climate denialists likely moved to Substack and Medium, then used mainstream platforms to distribute misleading content.


Recommendations for the UK Government and Relevant Signatories


  • Cross-Partner Intelligence Sharing. UK agencies, tech firms, and fact-checkers should collaborate on tracking migration patterns, and share intelligence on threat actor TTPs in order to identify and disrupt narrative migration.

  • Behavioural Analysis. Encourage platforms to detect patterns of evasion, including coded language, proxy accounts, or inauthentic behaviours.

  • Public Awareness. Publish regular reports on hostile information campaign tactics to improve digital resilience, as well as run media literacy campaigns.

  • Stronger Platform Accountability. Encourage standardised moderation policies and transparent enforcement across traditional and social media platforms.

  • Policy Adjustments. Identify and close regulatory loopholes to ensure coverage of less-regulated platforms, or websites masquerading as local media sources.