Be aware of Bounded Awareness

As often happens, one idea leads to another...

I was reading about prioritization and decision-making heuristics on the Lee Merkhofer Consulting website, and one article there prompted me to look for a more detailed source on Bounded Awareness. I found one: an article by Dolly Chugh and Max Bazerman in Rotman Magazine (here is the link to the PDF; see page 23).

Here's how it begins.
"Economists and psychologists rely on widely-divergent assumptions about human behaviour in constructing their theories. Economists tend to assume that people are fully rational, while psychologists – particularly behavioural decision researchers – identify the systematic ways in which people depart from rationality. [...] In this article, we propose that awareness can also be bounded, and that this occurs when people fail to see, seek, use or share highly relevant, easily-accessible and readily perceivable information during the decision making process."

This applies perfectly to software development and testing, don't you think?

The authors illustrate three common types of Bounded Awareness with examples.

Inattentional Blindness


"..Information sits visible and available in the visual field, yet escapes awareness when competing with a task requiring
other attentional resources.
This phenomenon, known as ‘inattentional blindness,’ has become an important area of study for cognitive and perceptual psychologists. Its consequences extend to real, life-and-death activities. For example, an airplane pilot who is attending to his controls could overlook the presence of another plane on his runway. Similarly, cell phones can divert drivers’ attention, making inattentional blindness a likely contributor to car accidents." 

Change Blindness


"Change-detection researcher Daniel Simons of Carnegie-Mellon University has demonstrated that people fail to notice changes in the information that is visually available to them. Interestingly, they often cannot describe the change that has taken place, but do demonstrate traces of memory of what they saw before the change.

[...]

The possible influence of change blindness in decision making is evident in a study by Petter Johansson and his colleagues, in which participants were asked to choose the more attractive of two faces displayed on a computer screen. As participants moved the cursor to indicate their choice, a flash on the screen distracted them, and the two pictures were reversed. Nonetheless, most subjects continued to move their cursor in the same direction, selecting the picture they originally viewed as the more attractive.
Importantly, they failed to notice the switch and provided reasons to support their unintended decision."

Focalism and the Focusing Illusion


"'Focalism’ is the common tendency to focus too much on a particular event (the ‘focal event’) and too little on other events that are likely to occur concurrently. Timothy Wilson and Daniel Gilbert of the University of Virginia found that individuals overestimate the degree to which their future thoughts will be occupied by the focal event, as well as the duration of their emotional response to the event.

[..]

Using similar logic, David Schkade of UC San Diego and Nobel Laureate Daniel Kahneman of Princeton defined the ‘focusing illusion’ as the human tendency to make judgments based on attention to only a subset of available information, to overweight that information, and to underweight unattended information.

[..]

The implications of focalism are not limited to laboratory studies. The Challenger space shuttle tragedy, for example, can be better understood through this lens. On January 28, 1986, the Challenger was launched at the lowest temperature in its history, leading to a failure of the ‘O-rings’ and an explosion that killed all seven astronauts aboard. Before the launch, the decision makers at NASA examined seven prior launches in which some sort of O-ring failure occurred. No clear pattern between O-rings and temperature emerged from this data, and the launch continued as scheduled. Critically, the decision makers failed to consider 17 previous launches in which no O-ring failure occurred. A logistic regression of all 24 launches would have led to an unambiguous conclusion: the Challenger had more than a 99 per cent chance of malfunction."
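That last point lends itself to a quick numerical illustration. Below is a rough sketch in Python of the kind of analysis the quote describes. The temperatures and failure flags are illustrative numbers I chose to mimic the structure in the quote (seven launches with some O-ring problem, seventeen without), not the official NASA records, and 31 °F is the launch-day temperature typically used in re-analyses of the accident; treat the output as a demonstration of the selection-bias effect, not a reproduction of the original study.

```python
# A sketch of the selection-bias point made above -- NOT an analysis of the
# official NASA records. The temperatures and failure flags are illustrative
# numbers chosen to mimic the structure described in the quote: seven launches
# with some O-ring problem, seventeen without.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Launches where some O-ring problem was observed (temperature in deg F).
failure_temps = np.array([53, 57, 58, 63, 70, 70, 75])
# Launches with no O-ring problem -- the data the decision makers ignored.
ok_temps = np.array([66, 67, 67, 67, 68, 69, 70, 70, 72,
                     73, 75, 76, 76, 78, 79, 80, 81])

# Step 1: look only at the failures, as the launch team did.  The temperatures
# are scattered from the low 50s to the mid 70s -- no obvious pattern.
print("failure-only temperatures:", sorted(failure_temps.tolist()))

# Step 2: pool all 24 launches.  Failures cluster at the cold end, and a
# (nearly unregularized) logistic regression picks that up.
temps = np.concatenate([failure_temps, ok_temps]).reshape(-1, 1)
failed = np.concatenate([np.ones_like(failure_temps), np.zeros_like(ok_temps)])

model = LogisticRegression(C=1e6, max_iter=1000)  # large C ~ plain max likelihood
model.fit(temps, failed)

# Predicted failure probability at 31 deg F, the launch-day temperature
# commonly used in re-analyses of the data.
p_failure = model.predict_proba([[31]])[0, 1]
print(f"predicted P(some O-ring failure at 31 F) ~ {p_failure:.3f}")
# With these illustrative numbers the prediction comes out close to 1,
# in line with the "more than 99 per cent" figure quoted above.
```

The contrast is the whole point: the failure-only view is what the launch team actually examined, while the pooled regression is the analysis the authors say would have produced an unambiguous warning.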


The examples in the article are drawn from economics, management, and psychology experiments.

I invite you to share examples from your own work in software testing. You can post them here in the comments, or link to a post on your own blog.