Fact-checking as a conversation: an AI perspective

Misinformation is widely considered one of the major challenges of our time, prompting numerous efforts to counter it. Fact-checking, the task of assessing whether a claim is true or false, is seen as key to reducing its impact.

In the first part of this talk, our speaker will present recent and ongoing work on automating this task using natural language processing, moving beyond simply classifying claims as true or false in three respects: incorporating tabular information, neurosymbolic inference, and using a search engine as a source of evidence. In the second part, an alternative approach to combating misinformation via dialogue agents will be presented, along with results on how internet users engage in constructive disagreements and problem-solving deliberation.

Andreas Vlachos is a professor of Natural Language Processing and Machine Learning at the Department of Computer Science and Technology at the University of Cambridge and a Dinesh Dhamija fellow of Fitzwilliam College. Current projects include dialogue modelling, automated fact checking and imitation learning. He has also worked on semantic parsing, natural language generation and summarization, language modelling, information extraction, active learning, clustering and biomedical text mining. His research team is supported by grants from the ERC, EPSRC, ESRC, Facebook, Amazon, Google, Huawei, the Alan Turing Institute and the Isaac Newton Trust.

This talk is part of the IET series.

About us

We empower the fight against disinformation. Check First is a software and methodologies company founded by a developer and two journalists. We focus on creating the best tools and practices for the demanding job of countering and monitoring fakes and influence campaigns, saving time in the process by fostering collaboration. Our greatest asset is the complementarity of our founding team, which lets us approach problems in a comprehensive and innovative way.