Trust Over Time

Release
2020
Technologies
AI, Design System, Social Media, Website, XAI
Themes
Information Perception, Media, Security & Trust

What factors induce trust in online content aggregated by AI systems?

The quantity of information and data we deal with on a day-to-day basis is constantly increasing. But with the multiplication of information sources, the automatic aggregation of content, and claims of fake news, how can we be confident in its reliability?

The EPFL+ECAL Lab, in collaboration with the Idiap Research Institute, launched a project to study how to restore trust among online readers.

According to a 2019 Reuters report, search engines, social media and content aggregators are the primary sources of information for more than half of the global population. This ever-growing trend increases the risk of spreading fake news and disinformation campaigns.

Recent research in computer science and the evolution of journalistic practices make it possible to provide indicators of the reliability of content and the origin of its sources. Transparency is also becoming essential in the face of the possible biases of the artificial intelligence used to aggregate content. But is this enough to strengthen the credibility of information with the public? Beyond the reliability of sources, trust takes precedence: the World News Congress has identified it as the number one success factor for the media. However, trust is often subjective and develops through readers’ perception.

Trust Over Time

Fake news must be flagged through a series of design strategies in order for users to regain trust.

Among the first studies on the subject, the work of the EPFL+ECAL Lab has served to identify indicators of confidence for digital media, to imagine ways of designing them visually, and to assess their actual impact on potential users. The study was conducted with Idiap’s Social Computing Group to bring artificial intelligence, now widely used to combine content from different sources, into this work. The collaboration led to the creation of an experimental platform that made it possible to test specific methods for combining content.

Tests conducted with more than 200 people not only revealed key factors for increasing trust in the media but also generated new knowledge about user perception. The results show that developing algorithms to detect, tag or report fake news is not enough to restore confidence when their outputs are presented in a purely technical way. Instead, these outputs must be translated through a series of design strategies, such as using them to organize information or giving them specific visual forms.
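To make this idea concrete, here is a minimal, hypothetical sketch in TypeScript of one such design strategy: instead of exposing a raw reliability score to readers, the score is mapped to a named tier that drives how articles are grouped and presented on the page. The names (Article, ReliabilityTier, groupByReliability) and the thresholds are illustrative assumptions, not part of the project’s actual platform.

```typescript
// Hypothetical sketch: use an algorithmic reliability signal to organize
// content, rather than displaying the raw score to readers.
// All names and thresholds below are illustrative assumptions.

interface Article {
  title: string;
  source: string;
  reliabilityScore: number; // 0..1, e.g. produced by an aggregation model
}

type ReliabilityTier = "verified" | "needs-context" | "unverified";

// Translate the numeric score into a tier the design system can render
// (badge, colour, position on the page) instead of a figure like "0.73".
function tierOf(article: Article): ReliabilityTier {
  if (article.reliabilityScore >= 0.8) return "verified";
  if (article.reliabilityScore >= 0.5) return "needs-context";
  return "unverified";
}

// Group articles by tier so the layout itself reflects the signal.
function groupByReliability(articles: Article[]): Map<ReliabilityTier, Article[]> {
  const groups = new Map<ReliabilityTier, Article[]>();
  for (const article of articles) {
    const tier = tierOf(article);
    const bucket = groups.get(tier) ?? [];
    bucket.push(article);
    groups.set(tier, bucket);
  }
  return groups;
}

// Example: the tiers, not the raw scores, decide how articles are arranged.
const grouped = groupByReliability([
  { title: "Budget vote passes", source: "Agency A", reliabilityScore: 0.91 },
  { title: "Miracle cure found", source: "Unknown blog", reliabilityScore: 0.22 },
]);
console.log(grouped);
```

The design choice this sketch illustrates is that the algorithmic signal shapes the organization and visual form of the page, rather than being shown to readers as a technical figure.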

On 3 September 2021, the Trust Over Time project was published as part of INTERACT, the 18th edition of the international conference dedicated to human-computer interaction, held in Bari, Italy.

The project shows that trust in information, essential to any democracy, requires an approach to innovation that brings together technology, digital design, perceptual science and professional practice.

Direction

Nicolas Henchoz

Project Management

Delphine Ribes

Research Assistant

Hélène Portier

Art Direction

Lara Défayes

Software Engineering

Delphine Ribes, Yves Kalberer

Hardware & Firmware Engineering

Dr Cédric Duchêne

UX Psychology

Dr Andreas Sonderegger