Thumb vote for priority
Sometimes, when reviewing charters with the team, I look to them for insight into what we should be focused on with our testing. One technique is to walk your list of charters (all twenty of them, or all 200) and have each person weigh in on where they think each one falls in priority. To facilitate this, I sometimes use a thumb vote.

When thumb voting, everyone must vote; there's no sitting on the sidelines. Since I usually use three levels of charter priority (A, B, and C), the thumbs map as follows:

  • thumb up = A

  • thumb sideways = B

  • thumb down = C


What I find is that for most charters, most people on the team are on the same page (or really close). Every now and then you have to make a decision on a close tie (but since you're the test lead, you're used to doing that anyway). Sometimes, however, you see some odd votes, like four up and four down. Those votes normally lead to some really great conversations. Often people misunderstand the charter or have a different understanding of the risk involved. This surfaces those differences.

It's also fun. (In an I'm-a-tester-so-voting-on-charters-is-fun sort of way.)
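
If you track the votes as you go (a whiteboard works, but so does a quick script), it's easy to spot the splits worth talking about. Here's a minimal sketch of that tally in Python; the charter names and votes are made up for illustration, and the real priority call still belongs to the test lead.

    from collections import Counter

    # Map thumb votes to charter priority levels.
    VOTE_TO_PRIORITY = {"up": "A", "sideways": "B", "down": "C"}

    def tally(votes):
        """Return the winning priority and whether the vote was split.

        votes: list of "up" / "sideways" / "down", one per team member.
        A charter is flagged as split when the top two vote counts are
        close, which is usually a sign the charter (or its risk) is
        understood differently across the team.
        """
        counts = Counter(votes)
        ranked = counts.most_common()
        winner, top = ranked[0]
        runner_up = ranked[1][1] if len(ranked) > 1 else 0
        split = (top - runner_up) <= 1 and len(votes) > 2
        return VOTE_TO_PRIORITY[winner], split

    # Hypothetical example: eight testers voting on two charters.
    charters = {
        "Import large data sets": ["up"] * 4 + ["down"] * 4,
        "Verify report footers":  ["down"] * 6 + ["sideways"] * 2,
    }

    for name, votes in charters.items():
        priority, split = tally(votes)
        flag = "  <-- talk about this one" if split else ""
        print(f"{name}: {priority}{flag}")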
Wrapping up a session
When I'm doing a 45-minute test session, I typically reserve the last five minutes for wrap-up. Since I'm usually working with a timer of some sort, I know when I have five minutes to go. Typical activities for wrap-up include:

  • finish what test or activity I'm in the middle of

  • write down what I didn't get to that I think I want to come back to

  • double check to make sure I have enough information for the bugs I need to log

  • double check to make sure I captured all the relevant notes on test data for my session notes (which might mean saving a spreadsheet or something)

  • clean up my test environment (if needed)

  • stop my screen recording and save it off (if running)

  • write down a couple of notes about how I feel (did I need something I didn't have, energy level, frustrations, etc.)


It looks like a lot, but most of the time it's about five minutes. If it runs over, that's fine of course. Technically my hands-on testing is done when I'm finished with the first bullet point. But I like to include all the other stuff in my time estimates (aka my session time) because I feel it's all important for the work I'm doing.
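
For what it's worth, the timer doesn't need to be anything fancy. Here's a bare-bones sketch in Python of the kind of thing I mean; the 45-minute session and five-minute warning are just the numbers from above, not anything magical.

    import time

    # Assumed defaults: a 45-minute session with a 5-minute wrap-up warning.
    SESSION_MINUTES = 45
    WRAP_UP_MINUTES = 5

    def run_session_timer(session_min=SESSION_MINUTES, wrap_up_min=WRAP_UP_MINUTES):
        work_seconds = (session_min - wrap_up_min) * 60
        print(f"Session started: {session_min} minutes on the clock.")
        time.sleep(work_seconds)           # hands-on testing time
        print(f"{wrap_up_min} minutes left -- start wrapping up.")
        time.sleep(wrap_up_min * 60)       # wrap-up time
        print("Session over. Save notes, recordings, and bug details.")

    if __name__ == "__main__":
        run_session_timer()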
Getting a report card
A lot of people struggle with figuring out ways to get better at software testing. A related challenge is finding some way to measure progress. There are a lot of bad metrics out there for measuring tester effectiveness. However, in a recent post on the software-testing list, Cem Kaner offered the following insight that I thought was too good not to share:
Figure out what types of tasks you are responsible for. Figure out the attributes of those tasks. For example, what makes a good bug report? What makes a good risk analysis? What makes a good test plan? Develop your lists with colleagues who you respect.

Have a colleague you trust and respect review samples of your work against your criteria. (Do the same for her.) That review is your report card.

This doesn’t provide a tidy number. But if you’re trying for continuous quality improvement, this probably gives you more and better information.
Perfmon for SQL Server Analysis
At the moment I'm using Perfmon to gather information on a SQL Server instance and an application I'm testing.

To figure out the best counters to use, I headed over to this great post by Brent Ozar, where he explains in detail which Perfmon counters you need and how to set them up.

Here's what he recommends:

Performance Monitor Counters for SQL Server Analysis

These are listed OBJECT first, then COUNTER

  • Memory - Available MBytes

  • Paging File - % Usage

  • Physical Disk - % Disk Time

  • Physical Disk - Avg. Disk Queue Length

  • Physical Disk - Avg. Disk sec/Read

  • Physical Disk - Avg. Disk sec/Write

  • Physical Disk - Disk Reads/sec

  • Physical Disk - Disk Writes/sec

  • Processor - % Processor Time

  • SQLServer:Buffer Manager - Buffer cache hit ratio

  • SQLServer:Buffer Manager - Page life expectancy

  • SQLServer:General Statistics - User Connections

  • SQLServer:Memory Manager - Memory Grants Pending

  • System - Processor Queue Length
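
As an aside, if you'd rather not click each of these into the Perfmon UI by hand, one option (not from Brent's post, just a convenience) is to put the counter paths in a text file and collect them with Windows' typeperf tool. The paths below assume a default SQL Server instance; a named instance shows up as MSSQL$InstanceName: in place of SQLServer:.

    from pathlib import Path

    # Brent's counters expressed as Perfmon counter paths. "_Total" rolls up
    # all instances of Processor / PhysicalDisk / Paging File; the SQLServer:
    # paths assume a default instance (named instances use MSSQL$InstanceName:).
    COUNTERS = [
        r"\Memory\Available MBytes",
        r"\Paging File(_Total)\% Usage",
        r"\PhysicalDisk(_Total)\% Disk Time",
        r"\PhysicalDisk(_Total)\Avg. Disk Queue Length",
        r"\PhysicalDisk(_Total)\Avg. Disk sec/Read",
        r"\PhysicalDisk(_Total)\Avg. Disk sec/Write",
        r"\PhysicalDisk(_Total)\Disk Reads/sec",
        r"\PhysicalDisk(_Total)\Disk Writes/sec",
        r"\Processor(_Total)\% Processor Time",
        r"\SQLServer:Buffer Manager\Buffer cache hit ratio",
        r"\SQLServer:Buffer Manager\Page life expectancy",
        r"\SQLServer:General Statistics\User Connections",
        r"\SQLServer:Memory Manager\Memory Grants Pending",
        r"\System\Processor Queue Length",
    ]

    # Write one counter per line so typeperf can read the file with -cf.
    Path("sql_counters.txt").write_text("\n".join(COUNTERS) + "\n")

    # Then, on the server (15-second samples to a CSV you can open in Excel):
    #   typeperf -cf sql_counters.txt -si 15 -o sql_perf.csv -f CSV
    print("Wrote sql_counters.txt")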


Of course, that's the easy part. The real difficulty is parsing the log file and knowing what to do with the data.

Thankfully, Brent describes how to format the results in Excel. He also describes how to analyze the data, starting with the CPU, then memory, and finally the disk metrics.
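
If you want a quick summary outside of Excel, here's a rough sketch of how you might crunch the same log with Python and pandas. The file name is the hypothetical one from the typeperf example above, and the layout assumed is the standard Perfmon CSV: a timestamp column followed by one column per counter.

    import pandas as pd

    # Hypothetical output file from the collection step above.
    LOG_FILE = "sql_perf.csv"

    # Perfmon/typeperf CSVs have a timestamp in the first column and one
    # column per counter (named \\MACHINE\Object(instance)\Counter).
    df = pd.read_csv(LOG_FILE)
    df = df.rename(columns={df.columns[0]: "timestamp"})
    df["timestamp"] = pd.to_datetime(df["timestamp"])

    # Counter values sometimes arrive as blank strings; force them numeric.
    counters = df.columns.drop("timestamp")
    df[counters] = df[counters].apply(pd.to_numeric, errors="coerce")

    # Simple summary per counter: min / mean / max over the whole run.
    summary = df[counters].agg(["min", "mean", "max"]).T
    print(summary)

    # The kind of question Brent's analysis starts with (CPU first):
    cpu_cols = [c for c in counters if "% Processor Time" in c]
    if cpu_cols:
        print("Average CPU:", df[cpu_cols].mean().round(1).to_dict())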

Personally, I'm going to take the data to my client's DBA and ask what he thinks of it, since he knows the system I'm testing. It's not in my client's best interest for me to spend excessive amounts of time analyzing this kind of data when someone internal can do it far more quickly.
Byte conversion
There are a million little applets and forms out there to do byte conversions, but for whatever reason this is the one I have bookmarked. I'm sure full-time performance testers can do this stuff in their heads, but we part-timers need help every now and then. Perhaps I like it because it has a simple, clean interface and shows me the conversion rates:

1 Byte = 8 Bit
1 Kilobyte = 1024 Bytes
1 Megabyte = 1048576 Bytes
1 Gigabyte = 1073741824 Bytes
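
When there's no browser handy, those same (binary, 1024-based) conversions are only a few lines of Python. Here's a sketch of the kind of helper I mean; the unit names and example values are just for illustration.

    # Binary (1024-based) units, matching the rates listed above.
    UNITS = {"B": 1, "KB": 1024, "MB": 1024**2, "GB": 1024**3, "TB": 1024**4}

    def convert(value, from_unit, to_unit):
        """Convert between byte units, e.g. convert(512, "MB", "GB")."""
        return value * UNITS[from_unit] / UNITS[to_unit]

    print(convert(1, "GB", "B"))     # 1073741824.0
    print(convert(750, "MB", "GB"))  # roughly 0.73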

Do you have one you prefer? Or perhaps another similar simple tool you use when testing?