Test Design with Mind Maps
Today's tip is two-fold.

The first part is a great example of rapid test design with the XMind mind mapping tool, provided as an experience report by Darren McMillan.


  • Mind mapping

    • Increases creativity

    • Reduces test case creation time

    • Increases visibility of the bigger picture

    • Very flexible to changing requirements

    • Can highlight areas of concern (or mark open questions for follow-up).



  • Grouping conditions into types of testing

    • Generates much better test conditions

    • Provides more coverage

    • Using templates of testing types makes you at least consider each type of testing when writing conditions.

    • When re-run, these often result in new conditions being added and defects being found, thanks to the increased awareness.



  • Lean test cases

    • Easy to dump from the map into a test management tool (see the sketch after this list)

    • If available, the folder hierarchy can become your steps

    • Blends in easily with exploratory testing, preventing a script-monkey mentality.



  • Much lower cost to generate and maintain, whilst yielding better results.


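To make the "dump into a test management tool" point concrete, here's a rough sketch of turning a map into lean test cases, assuming the map has been exported as a plain-text indented outline. The outline content and the four-space indent are illustrative assumptions, not a guarantee about XMind's export format:

```python
# A rough sketch: turn a mind-map outline (assumed to be exported as
# plain indented text) into lean test cases where the branch above
# each leaf doubles as the steps.
from dataclasses import dataclass

OUTLINE = """\
Login
    Functional
        Valid credentials
        Invalid password
    Security
        SQL injection in username
"""

@dataclass
class TestCase:
    title: str
    steps: list  # the branch hierarchy, used as lightweight steps

def outline_to_cases(outline: str, indent: int = 4) -> list:
    """Each leaf of the outline becomes one test case; its
    ancestors (the branch it hangs from) become the steps."""
    cases, stack = [], []
    lines = [l for l in outline.splitlines() if l.strip()]
    for i, line in enumerate(lines):
        depth = (len(line) - len(line.lstrip())) // indent
        stack = stack[:depth] + [line.strip()]
        # A node is a leaf when the next line is not nested deeper.
        next_depth = (
            (len(lines[i + 1]) - len(lines[i + 1].lstrip())) // indent
            if i + 1 < len(lines) else 0
        )
        if next_depth <= depth:
            cases.append(TestCase(title=stack[-1], steps=stack[:-1]))
    return cases

for case in outline_to_cases(OUTLINE):
    print(" > ".join(case.steps), "::", case.title)
```

Each leaf becomes a one-line test case with its branch as the steps - lean enough to blend straight into exploratory testing.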

For the second part, I link you back to 2006 and the article "X Marks the Test Case: Using Mind Maps for Software Design" by Rob Sabourin.


  • Mind Maps to Help Define Equivalence Classes

    • Identify the variables

    • Identify classes based on application logic, input, and memory (AIM)

    • Identify invalid classes (a short sketch follows this list)



  • Mind Maps to Identify Usage Scenarios

  • Mind Maps to Identify Quality Factors


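As a quick sketch of how such equivalence classes can land in code, the parameterized test below covers one representative per class, valid and invalid alike. The age field, its 0-130 rule, and validate_age() are hypothetical stand-ins; the classes shown derive from application logic and input (the A and I of AIM), while memory-based classes would need more setup than fits here:

```python
# A minimal sketch of driving equivalence classes from a mind map
# into parameterized checks. validate_age() and the 0-130 rule are
# hypothetical stand-ins for the application under test.
import pytest

def validate_age(value):
    """Stand-in for the application logic being exercised."""
    return isinstance(value, int) and 0 <= value <= 130

# One representative per equivalence class, valid and invalid alike.
CLASSES = [
    (25, True),     # valid: typical value
    (0, True),      # valid: lower boundary
    (130, True),    # valid: upper boundary
    (-1, False),    # invalid: below range
    (131, False),   # invalid: above range
    ("25", False),  # invalid: wrong input type
]

@pytest.mark.parametrize("value,expected", CLASSES)
def test_age_equivalence_classes(value, expected):
    assert validate_age(value) == expected
```
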
Go where the website suggests
In the good old days of the Web 1.0 world, when something didn't go well you'd get the basic 404. On a fancy site, that 404 would perhaps be branded. In today's Web 2.0 world, 404 pages are often smart. They try to figure out what you were doing and suggest places where you might go next. It's a cool feature, and a great way to take what might have been a poor user experience and turn it into something positive.

However, it doesn't always work the way it's intended. When sites suggest things to you, they can make mistakes. Therefore, I follow a heuristic that says: always go where the software suggests I go. Here's an example...

Last week while playing around with the new mint data website, I was greeted with the "City not found" page several times.

[Screenshot: the mint data "City not found" page]


A feature of this page is that it suggests a page for you to visit next. Often, it's a state. However, at one point I was able to get the site to suggest I view spending for the entire USA. Using my heuristic, I followed the link and was greeted with an excellent Java exception.

[Screenshot: the Java exception page]


Rewarded with this exception, I was able to determine that the site is written in Java (using Spring and Hibernate) and runs on Apache, right down to the Apache version. I also had a suspicion of what to do next. After seeing this issue, I started trying custom URLs and was able to trigger several different exceptions, some of which gave me additional insight into the site's structure and possible test ideas.
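
If you want to systematize that kind of follow-up probing, here's a rough sketch that requests a handful of custom URLs and flags responses that leak stack-trace markers. The base URL, paths, and markers are assumptions for illustration - point it only at a site you're authorized to test:

```python
# A rough sketch of probing custom URLs for leaked exceptions.
# The target, paths, and leak markers below are illustrative
# assumptions, not details of the mint data site.
import requests

BASE = "https://example.com"  # hypothetical target
PATHS = ["/spending/US", "/spending/XX", "/spending/%00", "/spending//"]
LEAK_MARKERS = ("Exception", "Traceback", "org.springframework", "org.hibernate")

for path in PATHS:
    resp = requests.get(BASE + path, timeout=10)
    leaked = [m for m in LEAK_MARKERS if m in resp.text]
    # A 5xx status or a stack-trace marker in the body is worth a bug report.
    if resp.status_code >= 500 or leaked:
        print(f"{path}: HTTP {resp.status_code}, leaked: {leaked or 'n/a'}")
```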

Saying no
Some people struggle with saying no. It happens for a lot of reasons:

  • we want to please others

  • we really do want to do what we're committing to

  • we are afraid to say no

  • etc...


Here are some tips for saying no that I've found helpful:

  • Force prioritization: "If I do that, something else needs to fall off my plate. What would you like that to be?"

  • Remind them about long term effects: "If we write this code without writing our tests according to our methodology, in six months you'll regret it when we need to refactor this section of the code."

  • Clarify that you're being asked to do something you can't: "You recognize that you're asking me to make a commitment that I can't really make, correct?"


However, no matter how you sugarcoat it, nothing beats a plain-spoken "No, I can't or I won't do that."

Practice it. No is more difficult than yes, especially if you're saying it to yourself. Trust me: people would rather hear "no" up front than have you say yes and then not get what they wanted.

Test for "ubiquitous" features
This morning I saw mint data for the first time. I love playing with software like this - this is cool stuff. Within a couple of searches (fewer than five) I found a few different issues I'd call bugs. All of them were found using a technique I call testing for "ubiquitous" features.

A ubiquitous feature is one that exists "everywhere." You don't question if it's actually a feature, you just assume it is. An example is using quotes in a search field. After using search engines for the last ten years, you just have some basic assumptions around how search works.

So this morning on mint data, I tested with quotes in my search. Once I performed that search, the interface completely froze up. I had to reload the site to get a new search to work. It was only later that I noticed that this feature is even illustrated in the mint data example search text.

[Screenshot: the default search criteria for mint data]

Other search features caused similar issues. When you think about the application you're testing, it's useful to understand how users will map their existing expectations onto it (an application they likely don't understand yet) and how those expectations will drive what they assume it will do.
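
One way to make this repeatable is to keep a list of "ubiquitous" search idioms and run them all against the search box. Below is a minimal sketch; the endpoint and the q parameter are hypothetical and would need adapting to the application under test:

```python
# A minimal sketch of probing a search box with "ubiquitous" search
# idioms users assume work everywhere. The endpoint and parameter
# name are hypothetical assumptions.
import requests

SEARCH_URL = "https://example.com/search"  # hypothetical endpoint
UBIQUITOUS_QUERIES = [
    '"coffee shops"',    # quoted phrase, like the one that froze the UI
    'coffee -decaf',     # exclusion operator
    'coffee +espresso',  # inclusion operator
    'coff*',             # wildcard
    'coffee OR tea',     # boolean operator
]

for query in UBIQUITOUS_QUERIES:
    resp = requests.get(SEARCH_URL, params={"q": query}, timeout=10)
    # The app may ignore the syntax, but it should never break outright.
    status = "OK" if resp.ok else f"FAILED ({resp.status_code})"
    print(f"{query!r}: {status}")
```

The app doesn't have to support every idiom; the point is that none of them should freeze the interface or require a reload.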