
DL Seminar | The Truth About Fake News: Measuring Vulnerability to Fake News Online

Updated: Apr 25, 2021



By Shuan-Yih (Stanley) Lin and Chin-Yu Chou

Cornell Tech


This reflection was written collectively by Shuan-Yih Lin and Chin-Yu Chou after the Digital Life Initiative Seminar on October 21st, 2020, where presenter Joshua A. Tucker spoke on the topic of The Truth About Fake News: Measuring Vulnerability to Fake News Online.


Since the emergence of social media, fake news of all kinds has been shared and delivered through different channels and platforms. Nevertheless, we still lack a sufficient understanding of how to correctly distinguish real news from fake news, and of how well human beings can make that judgment. On October 21st, Joshua A. Tucker, a New York University professor whose work spans politics, Russian and Slavic studies, data science, and social media, introduced studies of how people distinguish fake news and their implications for crowdsourcing and machine learning models.


Experiment - Understanding Human Behavior in Identifying Fake News


Professor Tucker first took us through his experiment, which aims to understand human behavior around fake news. It addresses three main research questions: (1) How often do people believe fake news when they encounter it in real time? (2) What individual-level characteristics are associated with susceptibility to believing fake news? (3) Can certain task flows reduce the odds that people believe fake news? In the experiment, 90 participants were selected across various demographic characteristics, such as gender, age, education level, and political ideology. The participants were asked to evaluate whether news from mainstream and low-credibility sources was real. Their responses were collected and analyzed against ground truth provided by professional fact-checkers from trustworthy media companies.
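

To make the analysis step concrete, here is a minimal, hypothetical sketch of how participants' ratings could be scored against fact-checker ground truth. The article labels, ratings, and column names are invented for illustration and are not the study's actual data or code.

```python
# Hypothetical sketch (not the study's actual code or data): score each
# participant's ratings against the fact-checkers' ground-truth verdicts.
import pandas as pd

# Illustrative participant responses: one row per participant-article rating.
responses = pd.DataFrame({
    "participant": [1, 1, 2, 2, 3, 3],
    "article":     ["a", "b", "a", "b", "a", "b"],
    "rating":      ["real", "real", "fake", "real", "real", "fake"],
})

# Illustrative ground truth from professional fact-checkers.
ground_truth = pd.DataFrame({
    "article": ["a", "b"],
    "verdict": ["real", "fake"],
})

merged = responses.merge(ground_truth, on="article")
merged["correct"] = merged["rating"] == merged["verdict"]

print(merged["correct"].mean())                         # overall accuracy
print(merged.groupby("participant")["correct"].mean())  # per-participant accuracy
```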


Human Behavior - Poor at Distinguishing Fake News, and Likely to Share It


Are humans good enough at telling real news from fake? According to Professor Tucker’s study, over 60% of ordinary people could correctly identify real news, but over 33% of people were deceived by fake news and judged it to be real. Moreover, people were roughly three times more willing to share news they believed to be real. That is, fake-news believers are more likely to share fake news and thus boost the spread of false information. This makes thorough research into, and understanding of, fake-news identification crucial.


Evidence Searching - Beware of the Misleading “Junk of the Internet”


Traditionally, we assume that searching for external evidence beyond the news article itself would be extremely helpful for determining whether a story is real. Nevertheless, Professor Tucker’s study suggests that searching the Internet actually increases people’s belief in fake news.


Psychologically, one explanation is truth bias: we tend to selectively seek out evidence that corresponds with the material we have just read. However, Professor Tucker suggests it is more likely that the Internet is full of junk, low-quality content that appears to corroborate the fake story and thereby strengthens people’s belief in it. In other words, searching the Internet surprisingly does not help us distinguish fake news and can even make the situation worse.


Ideological Congruence in Politics - The Most Important Factor in Fake News Beliefs


Intuitively, we might expect education level or age to be the most influential characteristics in fake-news identification. However, Professor Tucker’s studies suggest that age and education make little difference in the ability to identify fake news. Highly educated people are indeed less likely to rate fake news as real, but the improvement is less than 5% compared with those who have a high-school education or less.


As for age, we traditionally assume that the elderly are more vulnerable to fake news. However, according to Professor Tucker’s research, people under 30 are in fact the most likely to be deceived by fake news. Although the difference is again less than 5%, it runs counter to what we expected about how demographics shape the ability to identify fake news.


Instead, Professor Tucker points out that political ideological congruence matters most in misleading people into believing fake news, while age and education level have only a very slight effect on the outcome. Around 40% of liberals believe liberal-leaning fake news, compared with approximately 30% of moderate readers and less than 20% of conservatives. A similar pattern holds for conservative-leaning fake news, where conservatives are about twice as likely as liberals to be deceived. In other words, the congruence between a story and a reader’s political ideology strongly affects that reader’s judgment of fake news.


Machine Learning - A Possible Way Toward Fact-Checking and Fake News Detection


We now know that features such as political ideological congruence affect how well people identify fake news. Professor Tucker and his lab members built various machine learning models, including neural networks, that incorporate features such as political knowledge and cognitive reflection to assist fake-news detection. The model achieved 59% accuracy and can be further improved with more data. Compared with popular approaches that train fake-news detection systems on the content of the news, using fact-checking and sentiment analysis, Professor Tucker’s model is less vulnerable to adversarial attacks and takes human behavior into account. It is therefore a more human-oriented model and offers a different direction for developing fake-news detection systems. Machine learning models offer some hope for the future of fake-news detection and real-time fact-checking, but more real-world data is needed to achieve better performance.
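

As a rough illustration of this human-oriented approach (not Professor Tucker’s actual model), the sketch below trains a simple classifier to predict whether a respondent judges an article correctly from respondent-level features such as ideological congruence, political knowledge, and cognitive reflection. All feature names, data, and coefficients are assumptions made up for demonstration.

```python
# Hypothetical sketch: predict whether a respondent judges an article correctly
# from respondent-level features (ideological congruence, political knowledge,
# cognitive reflection). Data and feature effects are simulated, not from the study.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 1000

# Simulated respondent-article pairs.
congruent = rng.integers(0, 2, n)           # 1 = article matches respondent's ideology
pol_knowledge = rng.normal(0, 1, n)         # standardized political-knowledge score
cognitive_reflection = rng.normal(0, 1, n)  # standardized cognitive-reflection score

# Assumed relationship: congruence hurts accuracy, knowledge and reflection help.
logit = 0.3 - 0.8 * congruent + 0.5 * pol_knowledge + 0.4 * cognitive_reflection
correct = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

X = np.column_stack([congruent, pol_knowledge, cognitive_reflection])
X_train, X_test, y_train, y_test = train_test_split(X, correct, random_state=0)

model = LogisticRegression().fit(X_train, y_train)
print("held-out accuracy:", accuracy_score(y_test, model.predict(X_test)))
```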


Social Media Data Access Policy and a Call for Reform


Professor Tucker mentioned that social media has become a significant channel through which news spreads. If researchers could obtain more data from the tech companies, studies of fake-news detection would benefit in general. However, under the current privacy policies of social media companies, we face a dilemma between “sharing more information with researchers” and “protecting the privacy of social media users.” Professor Tucker therefore calls for reforming the data-sharing policies of social media companies. One possibility he mentioned is sharing the information specifically with certified academic researchers while letting social media users know how their data is used. He discusses social media and how such policies could be reformed in more detail in the final chapter of his book Social Media and Democracy [1].


Conclusion


This presentation gave us a new perspective on which features and behaviors affect the judgment of fake news. It was quite surprising to me that the leading factor in judging fake news is not demographic background but political ideological congruence.


The way the machine learning models for fake-news detection were trained was also eye-opening to me. Initially, I thought it was hard to identify fake news from the words of the article alone, since such an approach is vulnerable to edge cases, sarcasm, and adversarial attacks that can fool fact-checking models simply by appending sentences from reliable news sources. However, Professor Tucker and his team trained their models on experimental results, incorporating political knowledge and cognitive reflection, so that the behavioral findings complement the machine learning models.


In conclusion, this presentation deepened our understanding of the factors that influence people’s real-time judgments of fake news, and of a possible method of using derived features (here, political ideological congruence) to assist in building machine learning models. As social media becomes ever more prevalent, we need to pay additional attention to these factors, reduce the impact of fake news by judging it with the right concepts in mind, and take part in building systematic models to end its spread.


References


[1] Persily, N., & Tucker, J. (Eds.). (2020). Social Media and Democracy: The State of the Field, Prospects for Reform (SSRC Anxieties of Democracy). Cambridge: Cambridge University Press. doi:10.1017/9781108890960
