Before you consider yourself "done" testing, go back and double check
Many teams track release status by tracking the status of the tickets in the release. A ticket might be a story, a feature, or some other unit used to tie work to a release number. That is, if a release has 10 tickets in it (whether those tickets are bugs, features, or something else), the release isn't done until those 10 tickets are done. For most teams, testing is part of the ticket workflow.

When you're working in this type of environment, you often do your test planning for the release upfront. You might charter testing for certain tickets together or break up testing across multiple testers by assigning testers specific tickets. When you do this, you always run the risk of someone adding another ticket into the mix before you've finished your testing. While most teams (hopefully) have some communication methods for this type of change, sometimes tickets sneak in.

Whenever I'm testing a release like this, the last thing I do before I call my testing "done" is to go back and make sure no new tickets were added. Most tools make this very easy to do. All I'm trying to do is reconcile my testing efforts with the release before I hand it off to the next stage. Occasionally, I catch something that got slipped in that my testing missed.
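That reconciliation step can be as simple as a set difference between the tickets you planned and tested against and the tickets the tracker currently lists for the release. Here's a minimal sketch; the ticket IDs are hypothetical, and in practice the release list would come from your tracker's API or an exported report rather than being hard-coded:

```python
# Tickets that were in the release when test planning was done
tested = {"PROJ-101", "PROJ-102", "PROJ-103"}

# Tickets the tracker reports in the release right now
# (hypothetical data; normally fetched from your issue tracker)
in_release = {"PROJ-101", "PROJ-102", "PROJ-103", "PROJ-107"}

# Anything in the release that wasn't covered by the test plan
untested = in_release - tested
if untested:
    print("Added since test planning:", sorted(untested))
```

If the difference is empty, your testing still matches the release; if not, you've caught a ticket that slipped in.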
Don't make me think
In the interview linked below, usability expert Tim Altom provides a heuristic for usability testing. When testing he invokes the simple philosophy he picked up from Steve Krug's bestselling book on usability, Don't Make Me Think.

"If anything in the interface will make a user pause and have to think, it should be flagged for possible change," Tim said. "Contrary to popular belief, nobody really wants to think while working with a tool. Thinking is hard work in itself, and should be reserved for the job, not for the tools."


More in this interview/article on usability testing.
Cognitive walkthroughs
Cognitive walkthroughs are an inspection technique that can be done using a strong description of the end user and a few use-case style task scenarios. In the article linked below, usability expert Tim Altom provides the following advice for performing a cognitive walkthrough: "The interface is inspected slowly and deliberately. During inspection, the tester answers four questions at each action."

Those four questions are:

  • Will the user try to achieve the right effect?

  • Will the user notice that the correct action is available?

  • Will the user associate the correct action with the effect to be achieved?

  • If the correct action is performed, will the user see that progress is being made toward solution of the task?


More in this interview/article on usability testing.
Keeping it personal
People can be messy, can't they? They can be unpredictable, and they can disagree with you and your opinions. Because of this, it's easy to shy away from communicating directly with others and to resort to email and documents to present information, especially if the news is bad. A scenario we testers are often faced with!

But communicating directly with someone, either face to face or by phone, is a great time saver. I'll give you an example.

My current client has a very 'organic' way of developing software. Because he has no experience in IT development, a lot of the typical methods of communication just don't work: status reports are left unread on desks, and emails and bug reports are ignored. I was finding it hard to get him to understand that the quality of the software just wasn't there. Until I resorted to 'the face-to-face'.

I called him down to the showroom and showed him all the bugs I had found that morning. He was stunned to see the issues I was finding. It helped him understand how far there was still to go on the project.

This simple face-to-face session saved me a lot of time and him a lot of money.

You may not work in such a 'flexible' environment, but it's still possible to incorporate verbal communication into your testing tasks.

Here are some suggestions:

  • Talk to your developer before writing up important bugs

  • Show your stakeholder bugs to clarify what type of testing is required (Is this bug important to you?)

  • Show your stakeholder the application status instead of writing up a status report

  • Hold a workshop to discuss the application with developers and stakeholders before testing


I'm sure there are lots of other ways. If you can't communicate face to face, phone and even IM are good alternatives.