Practicing the art of testing

Some testers practice software testing. They think about it and research it. They write about it in papers or blogs. They practice actual testing techniques while at the dentist's office, the movies, or the airport. They want to achieve a higher level of performance.

In The New Brain, Richard Restak explains how practice affects our minds and what we do:


For the superior performer the goal isn't just repeating the same thing again and again but achieving higher levels of control over every aspect of their performance. That's why they don't find practice boring. Each practice session they are working on doing something better than they did the last time.


As he talks about practice and its effects on the mind, he points out that modern research is starting to show that the best performers are those who practice, not those born with some natural ability. As I read this, I thought about all the times I've spoken with James Bach and how he is always practicing the art of testing and critical thinking.

There's another aspect to the superior performer, the ability not to respond on autopilot:


In order to achieve superior performance in a chosen field, the expert must counteract the natural impulse to gain an automated performance as soon as possible.


Because a superior performer is always trying to make something better, they never react with an automated response. Something that appears to be automated, like a heuristic response, can be deceiving. Great performers use heuristics to help them quickly generate ideas, but they don't rely on them without examining both the context of the situation and the heuristic itself.

For example, the other day I was practicing performance testing. It seems like a silly thing to do, but for me performance testing is incredibly interesting; I think it's going to be a required skill for testers in the future, and I simply want to develop better heuristic models for when I'm executing my testing. If you look at Scott Barber's patterns for scatter chart analysis, you will see that Scott has put a lot of time into developing heuristics he can use to assess performance test results quickly.

While practicing with a performance test tool, I was simply running the same trivial script with different runtime settings to see if I could identify patterns between how my results appeared and which settings I used in the tool. If I can identify those patterns, I'm better equipped to debug script problems down the road, when my tests are no longer trivial and time is much more important. While practicing, I noticed a couple of patterns and generated some interesting ideas for performance test scenario development, so I sent them to Scott.
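If you want a feel for what such a practice run might look like, here is a rough sketch in Python. My actual tool and script were different; the endpoint URL, the user counts, and the request count below are hypothetical placeholders. The idea is simply to hold the script constant, vary one runtime setting at a time, and watch how the shape of the results changes.

```python
# A sketch of a trivial practice run: the same request, executed under
# different runtime settings, so the result patterns can be compared.
# The URL and settings are placeholders, not from any real session.
import statistics
import time
import urllib.request
from concurrent.futures import ThreadPoolExecutor

URL = "http://localhost:8080/"  # hypothetical endpoint; substitute your own

def timed_request(_):
    """Issue one request against URL and return its response time in seconds."""
    start = time.perf_counter()
    with urllib.request.urlopen(URL) as resp:
        resp.read()
    return time.perf_counter() - start

# Keep the script constant; vary one runtime setting at a time (here, the
# number of concurrent "virtual users") and compare how the results shift.
for users in (1, 5, 10):
    with ThreadPoolExecutor(max_workers=users) as pool:
        timings = list(pool.map(timed_request, range(50)))
    print(f"{users:>2} users: "
          f"min={min(timings):.3f}s  "
          f"median={statistics.median(timings):.3f}s  "
          f"max={max(timings):.3f}s")
```

The point of the exercise isn't the numbers themselves but the habit: change one setting, rerun, and ask yourself why the results look different this time.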

Now Scott could have looked at the data I sent him, compared it to his existing heuristic models, and provided me with an automated response about what I was looking at. He did not. He asked what the script I was running looked like. What settings was I using in the tool? Had I thought about what the tool might be doing in the background? He asked about timing. He used a different heuristic, one that says, "When I see data, figure out what it is before applying pattern analysis to it."

If he had learned an automated response when he was first starting out in performance testing, and had never developed heuristics beyond those initial patterns, he would not be considered an expert in performance testing today. He would be a really good performance tester, but most likely not an expert and thought leader in the field. By practicing performance testing, Scott continually tries to achieve higher levels of control over every aspect of his testing.

How do you practice your testing?