SIMODS brings together eight leading fact-checking and research organizations from across Europe to create methodologically rigorous tools to track the prevalence of misinformation on major platforms.
While platforms have committed to reducing the spread of misinformation under the Code of Practice on Disinformation, SIMODS aims to develop robust Structural Indicators (SIs) to evaluate the prevalence of misinformation on digital platforms and to analyze whether accounts spreading misinformation benefit from greater visibility and reach.
These indicators will track multiple aspects of misinformation, including its prevalence, sources, monetization, and cross-platform reach. The study will cover six major social media platforms in four European Union languages (French, Polish, Slovak, and Spanish): the Very Large Online Platforms (VLOPs) that are signatories of the Code of Practice on Disinformation, namely Facebook, Instagram, YouTube, TikTok, and LinkedIn, along with Twitter/X.
The SIMODS project represents a major cross-national effort, bringing together the scientific, fact-checking, and data collection expertise of its consortium members. Science Feedback leads and coordinates the project, joined by the fact-checking organizations Newtral (Spain), Demagog.sk (Slovakia), and Pravda (Poland), the AI and Data for Society (AID4So) group at the Universitat Oberta de Catalunya, and our company, CheckFirst, which provides cutting-edge data collection capabilities.
The results of this project will enable monitoring of key aspects of disinformation on a European scale, providing a solid foundation for decision-making. Additionally, this initiative will assist policymakers, platforms, and other stakeholders in developing strategies to limit the spread of misinformation and safeguard public discourse.
The project is funded by the European Media and Information Fund (EMIF), managed by the Calouste Gulbenkian Foundation.
CheckFirst is a leading Finnish software and methodologies company, spearheading adversarial research techniques. We believe that everyone should be able to understand how and why content is presented to them. We advocate for online clarity and accountability, building solutions to achieve this goal. Partnering with leading institutions, regulators, NGOs, and educators, we aim to curb the spread of disinformation and foreign influence manipulation.