Imagine yourself sitting at a computer...

Recently, while working on a problem with James Bach, I was challenged to come up with a series of tests for a system with at least two events that could potentially overlap each other. I came up with the following set of tests:

(Note: The names "process one" and "two" are arbitrary; the processes could in fact be the same process.)


  1. process one, then two

  2. process one and two at the same time

  3. process one, with process two beginning before it's finished, with process one ending before process two

  4. process one, with process two beginning before it's finished, with process two ending before process one
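
To make those orderings concrete, here is a rough sketch, in Python, of how each case might be forced with two worker threads. The names, durations, and delays are made up for illustration; this is not the harness from the actual exercise.

```python
import threading
import time

def worker(name, duration, log):
    """Pretend to be one of the processes: record a start, do 'work', record an end."""
    log.append((name, "start", time.monotonic()))
    time.sleep(duration)            # stand-in for the real work
    log.append((name, "end", time.monotonic()))

def run_case(delay_two, dur_one, dur_two):
    """Start process one immediately and process two after delay_two seconds;
    the durations decide which one finishes first."""
    log = []
    one = threading.Thread(target=worker, args=("one", dur_one, log))
    two = threading.Thread(target=worker, args=("two", dur_two, log))
    one.start()
    time.sleep(delay_two)
    two.start()
    one.join()
    two.join()
    return log

# 1. one, then two:                  two starts only after one has finished
# 2. one and two "at the same time": zero delay (the best this sketch can do)
# 3. overlap, one ends first:        small delay, one is shorter
# 4. overlap, two ends first:        small delay, two is shorter
for case in [(0.3, 0.2, 0.2), (0.0, 0.2, 0.2), (0.1, 0.2, 0.5), (0.1, 0.5, 0.2)]:
    print(run_case(*case))
```

Even in this toy version, case 2 is already suspicious: zero delay between two start() calls is not the same thing as starting at the same time, which is exactly where the debrief went.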

For full disclosure, this isn't the actual set of test cases I first came up with. My initial analysis had some duplicates because I didn't understand the problem and failed to ask clarifying questions, and it included two test cases that didn't really make sense once evaluated.

During the debrief for this exercise, James zeroed in on my test for two processes starting at the same time. His question was, "How would you actually run this test?"

Well, I hadn't really thought about that while I was coming up with the tests. I was thinking about my model of the problem, not about how I would execute the tests. I responded that I might run that test on a multiprocessor system, or on an emulator, or using some tool I didn't know about that facilitates this kind of testing.

James didn't let up: "OK, you have a multiprocessor system. Now how will you actually run this test?"

I've tried to block the next fifteen minutes of our conversation from my memory, which conveniently means I don't have to blog to the world about how dim I can be at times. But I do remember the lesson James taught me. (How convenient!)

When you are thinking about your tests, don't just think about coverage and risk; imagine yourself sitting in front of the computer and actually running the tests.

The problem I faced with James's earlier challenge was that I couldn't actually articulate the details of how I would run that test. How would I start both processes at the same time? How would I know they really started at the same time? What does "the same time" even mean (a millisecond, a clock cycle, something else)? How would I monitor the CPUs to see whether they were both working at the same time? How would I do all of that without interfering with the test itself? Grrr... Details!
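
Looking back, a minimal sketch makes the slipperiness concrete: release two processes from the same barrier and measure how far apart they actually start. This is my own after-the-fact illustration, not something from the exercise, and the skew it reports depends on the scheduler, the clock, and whatever else the machine is doing.

```python
import multiprocessing as mp
import time

def worker(barrier, results, index):
    barrier.wait()                     # both processes block here until the other arrives...
    results[index] = time.monotonic()  # ...then each records when it actually resumed
    # (assumes time.monotonic is comparable across processes, which it is on common platforms)

if __name__ == "__main__":
    barrier = mp.Barrier(2)
    results = mp.Array("d", 2)         # two shared doubles, one start time per process
    procs = [mp.Process(target=worker, args=(barrier, results, i)) for i in range(2)]
    for p in procs:
        p.start()
    for p in procs:
        p.join()
    print(f"start skew: {abs(results[0] - results[1]) * 1e6:.1f} microseconds")
```

The reported skew will typically vary from run to run, which is a small, concrete reminder that "at the same time" needs a definition before it can be a test.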

But the details are important when we design tests. I ran into this again on a recent project: while developing a test strategy for a system with multiple interfaces, some third-party web services, and a feature set that at times couldn't really be tested through the GUI, I made the same mistake. I focused too much on the strategy and not enough on the tactics.

I knew what I wanted to test, but I waited too long to figure out how I would test it. It wasn't until the project manager prompted me that I realized this. She asked how I would actually run my tests.

"Uhh... well, we'll just run them. You know, using the system... err... Is that my cell phone?" Needless to say, I wasn't very happy with my answer. Neither was she.

(Note to the project managers in the crowd: that's a great question to ask.)

It was at that point that we started to develop detailed test methods for our testing (special thanks to fellow blogger John McConda). These were just high-level outlines for running a basic test through the system: the test data we would need, the execution steps, and the potential oracles we might use. Nothing super heavy or really strict, but useful in helping us think about actually sitting in front of the computer and running the tests.
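
For illustration only, the shape of those test methods could be captured in a few lines. The field names and the example below are mine, not the actual template from that project.

```python
from dataclasses import dataclass, field

@dataclass
class TestMethod:
    name: str
    test_data: list = field(default_factory=list)  # what data the test needs and where it comes from
    steps: list = field(default_factory=list)      # how the test is actually driven (API, GUI, etc.)
    oracles: list = field(default_factory=list)    # how we'd recognize a problem in the results

# A hypothetical example for a feature that can't be reached through the GUI.
example = TestMethod(
    name="order settles through the third-party payment service",
    test_data=["an account with a saved payment method",
               "sandbox credentials for the payment service"],
    steps=["create the order through the service API (no GUI path for this feature)",
           "poll the order status until it settles"],
    oracles=["order status matches the payment service's own record",
             "no errors in the application log during settlement"],
)
print(example)
```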

After you think of a test (assuming you don't just execute it right then and there), take the next step and think about how you will execute it. What will you need? What will the testing look like? How long will it take? How will you know when you are done?

Imagine yourself sitting at the computer...