Posts in Bias
Be aware of Bounded Awareness
As often happens, one idea leads to another...

I was reading about prioritization and decision-making heuristics on the Lee Merkhofer Consulting website, and this article triggered my interest in finding a more detailed source on Bounded Awareness. So I found an article by Dolly Chugh and Max Bazerman in Rotman magazine (here is the link to the PDF; see page 23).

Here's how it begins.
"Economists and psychologists rely on widely-divergent assumptions about human behaviour in constructing their theories. Economists tend to assume that people are fully rational, while psychologists – particularly behavioural decision researchers – identify the systematic ways in which people depart from rationality. [...] In this article, we propose that awareness can also be bounded, and that this occurs when people fail to see, seek, use or share highly relevant, easily-accessible and readily perceivable information during the decision making process."

Applies perfectly to software development and testing, don't you think?

The authors cover, with examples, three common types of Bounded Awareness.

Inattentional Blindness


"..Information sits visible and available in the visual field, yet escapes awareness when competing with a task requiring
other attentional resources.
This phenomenon, known as ‘inattentional blindness,’ has become an important area of study for cognitive and perceptual psychologists. Its consequences extend to real, life-and-death activities. For example, an airplane pilot who is attending to his controls could overlook the presence of another plane on his runway. Similarly, cell phones can divert drivers’ attention, making inattentional blindness a likely contributor to car accidents." 

Change Blindness


"Change-detection researcher Daniel Simons of Carnegie-Mellon University has demonstrated that people fail to notice changes in the information that is visually available to them. Interestingly, they often cannot describe the change that has taken place, but do demonstrate traces of memory of what they saw before the change.

[...]

The possible influence of change blindness in decision making is evident in a study by Petter Johansson and his colleagues, in which participants were asked to choose the more attractive of two faces displayed on a computer screen. As participants moved the cursor to indicate their choice, a flash on the screen distracted them, and the two pictures were reversed. Nonetheless, most subjects continued to move their cursor in the same direction, selecting the picture they originally viewed as the more attractive.
Importantly, they failed to notice the switch and provided reasons to support their unintended decision."

Focalism and the Focusing Illusion


"'Focalism’ is the common tendency to focus too much on a particular event (the ‘focal event’) and too little on other events that are likely to occur concurrently. Timothy Wilson and Daniel Gilbert of the University of Virginia found that individuals overestimate the degree to which their future thoughts will be occupied by the focal event, as well as the duration of their emotional response to the event.

[...]

Using similar logic, David Schkade of UC San Diego and Nobel Laureate Daniel Kahneman of Princeton defined the ‘focusing illusion’ as the human tendency to make judgments based on attention to only a subset of available information, to overweight that information, and to underweight unattended information.

[...]

The implications of focalism are not limited to laboratory studies. The Challenger space shuttle tragedy, for example, can be better understood through this lens. On January 28, 1986, the Challenger was launched at the lowest temperature in its history, leading to a failure of the ‘O-rings’ and an explosion that killed all seven astronauts aboard. Before the launch, the decision makers at NASA examined seven prior launches in which some sort of O-ring failure occurred. No clear pattern between O-rings and temperature emerged from this data, and the launch continued as scheduled. Critically, the decision makers failed to consider 17 previous launches in which no O-ring failure occurred. A logistic regression of all 24 launches would have led to an unambiguous conclusion: the Challenger had more than a 99 per cent chance of malfunction."
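
For the quantitatively curious, here is roughly what such an analysis could look like in Python. This is my own sketch, not the authors' calculation: the temperatures and failure flags below are illustrative placeholders rather than the actual NASA launch records, so the "99 per cent" figure in the quote comes from the real data, not from this toy example.

```python
# A rough sketch of the kind of logistic regression described above:
# modelling O-ring trouble as a function of launch temperature.
# NOTE: the arrays below are illustrative placeholders, NOT the actual
# NASA launch records for the 24 flights discussed in the article.
import numpy as np
from sklearn.linear_model import LogisticRegression

temps = np.array([[66], [70], [69], [68], [67], [72], [73], [70],
                  [57], [63], [70], [78], [67], [53], [67], [75],
                  [70], [81], [76], [79], [75], [76], [58], [61]])  # degrees F
failures = np.array([0, 1, 0, 0, 0, 0, 0, 0,
                     1, 1, 1, 0, 0, 1, 0, 0,
                     1, 0, 0, 0, 1, 0, 1, 0])  # 1 = O-ring distress observed

# C is set high so the fit is effectively unregularized, like a plain logit.
model = LogisticRegression(C=1e6).fit(temps, failures)

# The Challenger launch-day temperature (around 31 F in the usual retelling)
# was far colder than any prior flight, so the model extrapolates well
# outside the observed range.
p_failure = model.predict_proba([[31]])[0, 1]
print(f"Estimated probability of O-ring failure at 31 F: {p_failure:.3f}")
```

The testing point carries over directly: if you only analyze the runs where something went wrong and ignore the runs where nothing did, the pattern you need may stay invisible.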

 

The examples in the article are taken from economics, management, and psychological tests.

I invite you to share examples from your own work in software testing. You can provide them here in the comments, or link to your blog post.
Don't let your distrust of software influence your trust in people
Today's tip is guest-written by Zachary Fisher.

Working as we do, it is easy to let skepticism become our default position on any statement made by any person. At any time. On any subject.

This behavior becomes maladaptive when we see ourselves as the only person who can be trusted. It thrusts the onus of proof onto ourselves and causes us to micro-manage every minute detail of our lives and/or projects. This dysfunction reveals itself in subtle ways: we keep details from others while we try to figure things out, we prefer to create tools rather than relationships, and we don't delegate crucial tasks because someone else won't do them right; basically, we see ourselves as the hub from which anything done right flows.

As managers of resources (people, time, money, etc.), we should walk in the light of some truths:

(1) We can be wrong sometimes
(2) We will be wrong sometimes
(3) Other people can forgive
(4) Other people can help
(5) Grace abounds in humility

So if the project is taking on gargantuan proportions and you're being consumed by fears of catastrophic failure - take a step back and ask yourself if the task seems so great because you secretly envision having to do it all yourself. If so, kudos for being a responsible adult. Now, take a step of faith and give other people the opportunity to prove themselves as trustworthy as you've become.
Stories about users
Not to be confused with user stories, stories about users can help testers develop an understanding of what the users of the software they're testing will value. At a recent IWST, Brett Leonard shared his experience of using stories of users to help him develop empathy for users while doing exploratory testing:
"Our knowledge of stories of users is key to understanding how people derive value from our applications. [...] Stories of users make up the narrative of an application."

By having (and using) stories of users, Brett is able to better focus on value to the end user when he's testing. For Brett, value means not only more actionable bugs, but also more focused test ideas.

During his talk, I was reminded of Atomic Object's announcement of hiring a full-time artist to sketch personas.
Use your 0908 card
Today's tip was submitted by Zach Fisher.

Even business owners are subject to inattentional blindness. Their vast experience leads them to perceive certain scenarios as impossible, improbable, etc. Sometimes their rationale is enough to convince me of the impossibility; other times, it is not. It is at these times that I'm compelled to play my '0908' card. What is the '0908' card?

It happened in September 2008, hence the '0908' moniker. Many of us lived blissfully ignorant of the complex financial forces at work around us. Our inattentional blindness was fed by the fruits of happy-path living: nothing is going wrong NOW. It's not that we were stupid; we just had no compelling reason to suspect otherwise. It wasn't until activities in disparate areas led to a confluence of improbable circumstances, resulting in a global financial meltdown, that it became reasonable to suspect the 'out-of-left-field' scenarios.

You certainly don't want to overplay the '0908' card ("What if a meteor hits our server farm on Feb. 29, 2012?"). Nor do you want the conversation to degrade into a puddle of techno-babble. It may be more practical to ask things like, "I know we don't support Linux distros now, but what do you think about Ubuntu's growing market share?" or "Are your friends getting netbooks like mine are?" Enlightening business owners about those disparate dependencies - invisibly churning within their systems - may head off certain disaster downstream.
Endowment effect
The endowment effect is the tendency to place a higher value on something simply because it's yours. For example, if someone shows you a new test tool, you might say, “Eh, it’s nice. I might use it.” But if you're the one showing off the same tool, you might say, “This tool is the best tool I’ve seen for this type of testing.” Once you own it, its value increases. The same is true for test data, processes you help develop, test ideas, etc.