12th June 2021 – (Hong Kong) The anti-extradition protests that began in June 2019 have turned Hong Kong, a city once proud of its rule of law, into a city of serious disorder and riots.

Many anti-government, anti-police, and anti-central government rumours spread during that period and were subsequently proven false, including the case of a woman who claimed to have been seriously injured by police during the conflict in Tsim Sha Tsui in August last year. Her eye was said to have been almost blinded, yet local media discovered that she had fled Hong Kong for Taiwan in September last year, reportedly with her vision intact.

In addition, many claimed that police ‘killed’ protesters during the MTR Prince Edward Station clash on 31st August 2019 and that detainees were raped at San Uk Ling Holding Centre. All of these rumours have subsequently been proven false, as they were created as propaganda or as part of a smear campaign to incite hatred towards the police and the government.

A 37-year-old construction worker named Poon Yung-wai was subsequently found by police to have published false claims that police had sexually assaulted female detainees at San Uk Ling Holding Centre in September 2019. He was charged with inciting protesters to besiege the centre by allegedly spreading rumours on social media, and pleaded not guilty.

A person named Hon Po Sang was claimed by netizens to have disappeared or died at MTR Prince Edward Station on 31st August 2019. Hon, whose real name is Edward Wong Mau Chun, recently released a video claiming that he had been charged by police with eight counts of rioting, criminal damage, and participation in an unlawful assembly, but said that he has since fled overseas. He is reportedly seeking asylum in the U.K.

According to the article ‘The Psychology of Fake News’ by Gordon Pennycook and David Rand, poor truth discernment is linked to a lack of careful reasoning and relevant knowledge, as well as to the use of familiarity and source heuristics. There is also a large disconnect between what people believe and what they are willing to share on social media.

This dissociation is largely driven by inattention rather than by the purposeful sharing of misinformation. Thus, interventions can successfully nudge social media users to focus more on accuracy, and crowdsourced veracity ratings can be leveraged to improve social media ranking algorithms.
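As a rough illustration of how crowdsourced veracity ratings could feed into a ranking algorithm, the sketch below (in Python) blends a post’s engagement score with the crowd’s average accuracy rating so that widely doubted posts are pushed down the feed. The function name, weighting scheme, and 0–1 rating scale are hypothetical assumptions for this example and do not describe any real platform’s system.

# Hypothetical sketch: down-weighting a post's ranking score using
# crowdsourced veracity ratings. The names and weighting scheme are
# illustrative only and do not reflect any real platform's algorithm.

def adjusted_rank_score(engagement_score, crowd_ratings, weight=0.5):
    """Blend an engagement-based score with the crowd's average
    veracity rating (each rating assumed to be on a 0-1 scale)."""
    if not crowd_ratings:
        return engagement_score  # no ratings yet, so leave the score unchanged
    avg_veracity = sum(crowd_ratings) / len(crowd_ratings)
    # Posts the crowd judges as likely false are demoted in the ranking.
    return engagement_score * ((1 - weight) + weight * avg_veracity)

# Example: a viral but widely doubted post ends up ranked below a less
# viral post that the crowd rates as accurate.
print(adjusted_rank_score(1000, [0.2, 0.1, 0.3]))  # roughly 600
print(adjusted_rank_score(800, [0.9, 0.8, 1.0]))   # roughly 760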

Misleading anti-government news, as well as yellow journalism, are related forms of problematic news content and likely sources of political polarisation. Many local Chinese-language media outlets, such as Apple Daily, along with a particular anti-establishment English-language outlet, have been leveraged to drive the widespread sharing and infiltration of online fake news.

When considering the factors that may influence what people believe, it is essential to distinguish between two fundamentally different ways to conceptualise belief in true and false news. One common approach is to focus on truth ‘discernment’, or the extent to which misinformation is believed ‘relative’ to accurate content. Discernment, typically calculated as belief in true news minus belief in false news (akin to ‘sensitivity’ or d’ in signal detection theory), captures the ‘overall’ accuracy of one’s beliefs – and thus gives insight into failures to distinguish between true and false content (‘falling for fake news’).

Another approach is to focus on overall belief, or the extent to which news – regardless of its accuracy – is believed (calculated as the average or sum of belief in true news and belief in false news, akin to calculating ‘bias’ in signal detection theory). Critically, factors that alter overall belief need not impact people’s ability to tell truth from falsehood: increasing or decreasing belief in true and false headlines to an equivalent extent has no effect on the overall accuracy of one’s beliefs (i.e., does not affect truth discernment).
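To make these two measures concrete, the minimal sketch below (in Python) computes discernment and overall belief from one participant’s ratings of true and false headlines. The 0–1 belief scale and the variable names are assumptions made for illustration, not the authors’ actual scoring procedure.

def truth_discernment(belief_true, belief_false):
    """Mean belief in true headlines minus mean belief in false headlines
    (akin to sensitivity, d', in signal detection theory)."""
    return sum(belief_true) / len(belief_true) - sum(belief_false) / len(belief_false)

def overall_belief(belief_true, belief_false):
    """Mean belief across all headlines regardless of accuracy
    (akin to response bias in signal detection theory)."""
    all_ratings = list(belief_true) + list(belief_false)
    return sum(all_ratings) / len(all_ratings)

# A participant who believes true headlines far more than false ones:
true_ratings = [0.9, 0.8, 0.7]   # belief in true headlines
false_ratings = [0.3, 0.2, 0.1]  # belief in false headlines
print(truth_discernment(true_ratings, false_ratings))  # ≈ 0.6 (good discernment)
print(overall_belief(true_ratings, false_ratings))     # ≈ 0.5 (moderate overall belief)

Note that raising belief in the true and the false headlines by the same amount would shift overall belief while leaving discernment unchanged – which is exactly the distinction drawn above.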

A popular narrative is that the failure to discern between true and false news is rooted in political motivations. For example, it has been argued that people are motivated consumers of (mis)information – that they engage in ‘identity-protective cognition’ when faced with politically valenced content, and this leads them to be overly believing of content that is consistent with their anti-government identity and overly sceptical of content that is inconsistent with their anti-government identity. A related theory argues that people place loyalty to their political identities above the truth – and thus fail to discern truth from falsehood in favour of simply believing ideologically concordant information. These accounts contend that a strong causal influence of political motivation on belief is thus the dominant factor explaining why people fall for fake news.

The evidence, however, does not support this account: true but politically discordant news is typically believed much more than false but politically concordant news – politics does not trump truth. Furthermore, greater overall belief in politically consistent news does not necessarily indicate politically motivated reasoning.

The spread of misinformation online presents both a scientific puzzle and a practical challenge. The claim that failure to differentiate false or misleading news from truth is simply a symptom of political polarisation in a ‘post-truth’ world is not an appropriate characterisation. Although people do preferentially believe news that aligns with their politics, this occurs as much or more for true headlines compared with false headlines – and thus people are actually more accurate, not less, when judging headlines that are politically concordant. Rather than being bamboozled by pro-democracy ideology, people often fail to discern truth from fiction because they fail to stop and reflect about the accuracy of what they see on social media.

Accordingly, simple prompts that shift people’s attention to accuracy increase the quality of news that people share on social media. Approaches of this nature, including providing digital literacy tips, are not hindered by the scalability issues that constrain strict fact-checking approaches – and, in fact, can be combined with crowd-sourced fact-checking to maximise efficiency. Human reasoning, when applied appropriately, can be a powerful salve against the lure of misinformation.
