Titled Influence by Design: Doppelgänger, Sanctions, Meta, and the Social Design Agency (SDA), the report sheds light on how Meta’s advertising infrastructure was used to run over 8,000 political ads aligned with Russian interests, even after international sanctions targeted the responsible entity. This highlights Meta’s shortcomings in moderating paid-for content.
The SDA, a Russian IT firm with strong ties to the Kremlin, is at the heart of this investigation. The agency was previously flagged for running the “Doppelgänger” operation, which impersonated media brands through fake news websites to spread disinformation. A new line of inquiry was opened by a leak of internal SDA documents, first obtained by Delfi Estonia and Süddeutsche Zeitung. We had access to this trove of information and dug deeper. With campaigns targeting France, Germany, Poland, and Italy, the SDA sought to undermine democratic institutions while strategically evading sanctions imposed by the EU, US, and UK.
The report reveals the precision and sophistication of SDA’s methods. Tailored to ongoing political events, their ads employed inflammatory narratives and visually compelling content to maximize reach and impact. Notable examples include rapid-response campaigns designed to exploit incidents like the Hamas-led attack in October 2023, further amplifying divisive narratives.
Notably, the leak reveals that the SDA closely monitored press coverage and reports from platforms and researchers. The agency thoroughly documented how affected countries reported on the operation in order to assess its effectiveness, presumably to demonstrate the impact of its actions to its sponsor.
The findings point to significant gaps in Meta’s compliance with its own content moderation policies and its obligations under the EU’s Digital Services Act (DSA). Despite claiming to enforce robust identity verification and transparency measures for political ads, Meta repeatedly failed to block SDA-linked campaigns.
The report estimates that Meta earned approximately $338,000 between August 2023 and October 2024 from ads run by the SDA — even after the agency was sanctioned. This raises serious questions about Meta’s role in enabling state-sponsored influence operations.
The report underscores the urgent need for stronger oversight and enforcement mechanisms. Platforms like Meta must not only enhance compliance with legal frameworks but also ensure that their systems are not weaponised to erode democratic processes. The findings of Influence by Design serve as a stark reminder of the vulnerabilities that persist in online ecosystems. With democracy at stake, it is imperative for governments, civil society, and platforms to work together to close the gaps that allow malign actors to operate unchecked.
Check First is a leading Finnish software and methodologies company, spearheading adversarial research techniques. We believe that everyone should be able to understand how and why content is presented to them. We advocate for online clarity and accountability, building solutions to attain this goal. Partnering with leading institutions, regulators, NGOs and educators, we aim to curb the spread of disinformation and foreign influence operations.