A new experimental study by UCL researchers has found that adding trust and distrust buttons to social media could discourage the spread of misinformation and make users more mindful of the accuracy of the information they post and share.
Professor Tali Sharot, the study’s co-lead author, said: “Over the past few years, the spread of misinformation, or fake news, has skyrocketed, contributing to the polarisation of the political sphere and affecting people’s beliefs on anything from vaccine safety to climate change to tolerance of diversity. Existing ways to combat this, such as flagging inaccurate posts, have had limited impact.”
She went on to explain that the current social media system encourages people to “like” and “share” posts, but offers no way to reward their accuracy. Posts don’t become popular because they’re accurate; they become popular because of the impact they have on readers, and that is exactly what the like-and-share system rewards.
She added: “Here, we have designed a simple way to incentivize trustworthiness, which we found led to a large reduction in the amount of misinformation being shared.”
She’s referring to the finding that, in the experiments that incentivized accuracy, the reach of false or inaccurate posts was roughly halved.
In an earlier paper, Professor Sharot and her colleagues also found that social media users were more likely to share content they had seen before.
They concluded that repeated exposure to information bred a sense of trustworthiness in users’ minds, and it isn’t hard to see why.
Ingenious Solution: Trust/Distrust Buttons
In their new study, Professor Sharot and her colleagues built a simulated social media platform used by 951 participants. The study spanned six separate experiments and looked at how users interacted with posts.
Participants shared news articles, half of which were inaccurate. Other users could react to these articles in one of five ways (a code sketch of this reaction model follows the list):
- A “like” reaction
- A “dislike” reaction
- A “trust” reaction
- A “distrust” reaction
- Reposting
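The paper describes the platform only at this level of detail, but purely as an illustration, here is a minimal sketch in Python of how such a post-and-reaction model might be represented. All class, field, and function names below are hypothetical, not taken from the study.

```python
from dataclasses import dataclass, field

# Hypothetical reaction model for illustration only; the study does not
# publish its platform's code.
REACTIONS = ("like", "dislike", "trust", "distrust")

@dataclass
class Post:
    text: str
    accurate: bool  # ground truth, known to the experimenters
    counts: dict = field(default_factory=lambda: {r: 0 for r in REACTIONS})
    reposts: int = 0

    def react(self, reaction: str) -> None:
        """Record one of the four button reactions."""
        if reaction not in REACTIONS:
            raise ValueError(f"unknown reaction: {reaction}")
        self.counts[reaction] += 1

    def repost(self) -> None:
        """Record a repost (the fifth way to interact with a post)."""
        self.reposts += 1

    def trust_score(self) -> float:
        """Net trust signal in [-1, 1]: trust minus distrust, normalized."""
        total = self.counts["trust"] + self.counts["distrust"]
        if total == 0:
            return 0.0
        return (self.counts["trust"] - self.counts["distrust"]) / total

# Usage: an inaccurate post accumulating mostly "distrust" reactions.
post = Post(text="Miracle cure found!", accurate=False)
for _ in range(8):
    post.react("distrust")
post.react("trust")
print(round(post.trust_score(), 2))  # -0.78
```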
Across all six experiments, and in almost all cases, participants used the trust and distrust buttons more than the like and dislike buttons.
What’s more, this interaction with the trust/distrust buttons produced a visible effect: to earn “trust” reactions, the users posting articles began sharing fewer inaccurate ones.
The researchers also ran a computational modeling analysis, which revealed something further: participants had begun to weigh the reliability and accuracy of a news story more heavily before deciding to repost it.
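The article doesn’t reproduce the model itself, but the idea can be illustrated with a toy decision rule: treat the probability of reposting as a logistic function of a story’s engagement value and its perceived accuracy, and let the weight on accuracy grow once trust/distrust feedback is in play. The function, parameters, and weights below are assumptions for illustration, not the study’s actual model.

```python
import math

def repost_probability(engagement: float, perceived_accuracy: float,
                       accuracy_weight: float) -> float:
    """Toy logistic model of a user's decision to repost a story.

    engagement         -- how attention-grabbing the story is (0..1)
    perceived_accuracy -- how accurate the user believes it to be (0..1)
    accuracy_weight    -- how much accuracy matters to the decision

    All names and weights are hypothetical, for illustration only.
    """
    score = 2.0 * engagement + accuracy_weight * (perceived_accuracy - 0.5) - 1.0
    return 1.0 / (1.0 + math.exp(-score))

# An attention-grabbing but inaccurate story (engagement 0.9, accuracy 0.2):
print(repost_probability(0.9, 0.2, accuracy_weight=0.5))  # ~0.66: accuracy barely matters
print(repost_probability(0.9, 0.2, accuracy_weight=4.0))  # ~0.40: accuracy now suppresses sharing
```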
This suggests that people changed their thinking patterns without realising it, becoming more skeptical and less gullible.
Finally, the study found that participants who used the trust/distrust buttons left the experiment with more accurate beliefs.
One of the co-lead authors, Ph.D. student Laura Globig, had this to say: “Buttons indicating the trustworthiness of information could easily be incorporated into existing social media platforms, and our findings suggest they could be worthwhile to reduce the spread of misinformation without reducing user engagement.”
She also said: “While it’s difficult to predict how this would play out in the real world with a wider range of influences, given the grave risks of online misinformation, this could be a valuable addition to ongoing efforts to combat misinformation.”