Report On Availability Heuristics Based On Daniel Kahneman “Thinking Fast And Slow”
Type of paper: Report
Topic: Information, Belief, People, Availability, Psychology, Mind, Validity, Print
Essentially, the availability heuristic refers to the tendency of the human mind to take a judgmental shortcut on the basis of immediately available examples. In this case, a person may be considering a specific topic, person, or event (such as a national election) and draw conclusions on the basis of the information they can remember rather than the fuller picture they cannot recall (Kahneman 7).
The ‘availability’ part alludes to the reachability of information, often, if not always, within the mind. In other words, a person does not necessarily rely on facts unless those are facts they remember; otherwise, they go with whatever information they can readily reach. ‘Heuristic’ refers to the tendency to reduce the complex tasks of assessing probabilities and predicting values into simpler judgments. However, people also tend to be aware of the consequences of such rash judgment. To guard against those consequences, they hedge their conclusions. For example, they say, “I think,” “It is likely,” or “It may be true.” This means that people, whether consciously or subconsciously, are aware of the potential invalidity of the ‘available’ information. In an article by Tversky and Kahneman (1124), the two note that judgments made this way are based on data of generally limited validity, processed only according to heuristic rules. Therefore, people become smart in how they use such information: cautiously.
Often, this available information tends to be the most recent (Kahneman 22). To explain this, Tversky and Kahneman (1124) offer the analogy of an observable object. The more sharply an object is seen, they say, the closer it seems to be. The terms ‘sharp’ and ‘close’ here are not just about the physical world; they also bear metaphorical meanings. Between what one remembers and what one sees, the object seen in the immediate moment looks sharper. To carry the metaphor further, a hungry person would rather eat what they have in hand than wait for a remembered delicious hot dog. Returning to the topic, for the mind the sharpest object (information) is the one it has received most recently, and what it has received most recently is the closest it can reach. Riddle (155) finds evidence of this in how redundant TV images influence the way people form social reality judgments. Unfortunately, as Tversky and Kahneman (1124) further point out, distances can be overestimated when visibility is poor. In other words, the most available information does not exist in isolation; its validity relies on support from other information on the same topic. To rely solely on the most recent (available) information therefore carries a risk of invalidity, because judgments of an object's visibility (including its sharpness) are biased, and biases contribute to illusions of validity (Kahneman 23).
This does not mean that people will settle for the ‘closest’ information they have on an issue even when they do not believe it. There is also the question of effort, which concerns how hard or easy it is to reach information. People resort to ‘close,’ readily available information they do not fully believe only when reaching more accurate information would take considerably more effort.
In other words, in making judgments, people use what they know. However, they sift through their mental files of information and use only what serves their purpose best. Indeed, there is much about the availability heuristic that this brief paper does not examine. Still, at its core, the availability heuristic concerns taking shortcuts to a conclusion.
Kahneman, Daniel. Thinking, Fast and Slow. Reprint ed. New York: Farrar, Straus and Giroux, 2013. Print.
Riddle, Karen. "Always on My Mind: Exploring How Frequent, Recent, and Vivid Television Portrayals Are Used in the Formation of Social Reality Judgments." Media Psychology 13 (2010): 155–179. Print.
Tversky, Amos, and Daniel Kahneman. "Judgment under Uncertainty: Heuristics and Biases." Science 185.4157 (1974): 1124–1131. Print.