By ANTONIA JÜLICH and CHRISTOPH WINTER 12/02/2015
Twelve people were killed in a terrorist attack because of religious beliefs. Images of a horrendous bloodbath. Relatives grieving for their losses. Most probably you associate these images with the Charlie Hebdo shootings in Paris. Yet it is far more likely that such an attack took place in Baga, Nigeria, where reportedly 2,000 people were killed during the very same week as the attacks in Paris.
If you had been asked a month ago to name the site of a recent terrorist attack, it seems likely that you would first have thought of Sydney, where two people were killed in a terrorist attack. However, on the very same day, 16 December 2014, 132 children were killed at a school in Peshawar, Pakistan. Thus, the more reasonable guess would have been Peshawar.
From time to time, human beings appear to make irrational choices or unreasonable judgements. Does this mean that the conception of homo economicus, the “resourceful, evaluating and maximizing man” (Coleman 1990, p. 5), a human being who makes rational choices based on a careful evaluation of facts and evidence, is outdated? Why did we think of Paris and Sydney rather than Baga and Peshawar?
The availability heuristic implies that events which can be recalled without difficulty are considered not only more important but also more likely to occur than those which cannot be recalled easily (Esgate and Groome, 2005). Since the most recent events – the terrorist attack on Charlie Hebdo – can be recalled rather effortlessly, individuals overestimate the probability of a similar or related incident recurring. In this regard, the role of the media is crucial: by giving disproportionately high attention to certain topics and events, the media lead individuals to overrate the implications of those events. It should also be borne in mind that media companies are often privately funded and profit-oriented. Since individuals are attracted to sensational news with an element of suspense, the media focus more on unexpected events – shark attacks, airplane accidents, homicides, a terrorist attack in a Western country – in order to increase circulation.
For this reason, individuals believe that the likelihood of being killed in a car accident is higher than that of dying from stomach cancer, although, in fact, it is the other way around (Read, 1995). In other words, the more people are interested in a specific incident, the more the media report on it, and vice versa, creating a vicious circle. Due to the availability heuristic, the terrorist attacks in Paris may have led to an overestimation of the likelihood of future terrorist attacks in Western countries and to the perception of excessive security threats. This raises the question: why are individuals biased?
According to Kahneman, human reasoning falls into one of two categories: ‘System 1’ and ‘System 2’. While System 1 thinking is fast, automatic, frequent, emotional, stereotypic and subconscious, System 2 thinking is slow, effortful, infrequent, logical, calculating and conscious (Kahneman, 2011). The vast majority of the decisions individuals make – approximately 95 per cent – are based on System 1 thinking (ibid.). It is not only fast; its decisions are also mostly correct in terms of expected value, making it a very efficient mode of thinking. A football player does not thoroughly analyse whether it would be better to pass the ball to the right or the left wing; he just does it. An employee does not reconsider every day whether to take the bike or the train to work; he just takes the one he always uses, although parameters such as the weather or the train schedule may have changed.
An individual relying mostly on System 1 is arguably better off than one who deeply analyses every single decision using System 2. Hence, the latter should be reserved for important decisions, i.e. those which may involve severe consequences. Although most of the decisions made by System 1 – also referred to as intuition (Kahneman, 2011) – are correct, the system will make mistakes at times. The tendency to overestimate the chance of a terrorist attack in Europe may be seen as a result of System 1 thinking: instead of an in-depth analysis assessing the complexity of marginalised groups, domestic security and the geopolitical status quo, one relies on System 1 and thereby labours under the availability bias. Besides the availability heuristic, there are many other biases – such as the anchoring bias, the endowment effect, loss aversion or the confirmation bias – which lead us to make suboptimal decisions. Human behaviour is not perfect in terms of maximizing utility, due to the limited capacity of the mind; yet whether such decisions are irrational depends on how the notion of irrationality is interpreted.
If one refers only to a single decision, then that decision may be regarded as irrational, and human beings may appear to act irrationally. However, if one considers the bigger picture, keeping in mind the efficiency of System 1, a single false choice may be the consequence of the initial decision to rely on System 1 for choices that do not involve severe implications. Such a decision may rest on the knowledge that System 1 will make mistakes, but that the advantages arising from the speed of this type of thinking prevail. Accordingly, occasional false decisions may lie within the ambit of a long-term, strategic and fully rational choice – a choice made by homo economicus.
Coleman, J 1990, Foundations of Social Theory, Harvard University Press, Cambridge.
Esgate, A & Groome, D 2005, An Introduction to Applied Cognitive Psychology, Psychology Press, Hove, New York.
Kahneman, D 2011, Thinking, Fast and Slow, Allen Lane, London.
Read, JD 1995, ‘The availability heuristic in person identification: The sometimes misleading consequences of enhanced contextual information’, Applied Cognitive Psychology, vol. 9, pp. 91–121.