Frankenstein is an open source tool for testing Swing applications. Its focus is on readable, simple tests. Features include record and playback, test scripts that are independent of widget location, OGNL-based assertions, and HTML reports.
In small companies especially, software testers get to perform lots of different tasks apart from testing. A software tester doesn't just know how to use a new product; they invariably know how to install it, configure it, work around issues, upgrade it, and back up data. Software testers also know about minimum requirements and the application's limitations.
Not surprisingly, a good software tester starts to get a reputation as a bit of a product expert: someone people can come to for product knowledge, or someone who can set up demonstrations as well as perform them. This is very gratifying, but it can also lead to problems if it starts to impact your own work.
This can lead to stress and can often be a cause of resentment. It's best to try and manage these situations early on. I find the following approaches helpful:
1) Learn to let go
Try not to perform tasks for people; instead, teach them the basics and let them go and learn it themselves. If lots of people are asking for the same thing, consider writing a short cheat sheet or posting it on a wiki.
2) Learn to delegate
Always make sure someone else trains up with you. Don’t be the only person with all the knowledge. Make sure you share it by training up other team members.
3) Learn to say no
Sometimes it's better not to take on work that is inevitably going to compromise your deadlines and deliverables. Say no, explaining why you are unable to help. You may upset people initially, but in the long term you are not doing anyone any favors by taking on too much work. If your decision is overridden you may still have to go and do the extra work, but at least people are aware of the compromises being made.
If you are the only person who can fix things and you are overstretched, you are in fact more of a liability than an asset. Spread your knowledge and let both you and your company benefit from it.
When I teach people how to do exploratory testing, a common point of confusion is around what to put in your notes. While I often tell people it depends on you, what you're testing, and the company you're working for - they still want some concrete advice. So I often show some examples from past projects and provide the following template:
- Mission: list out what you're testing with this charter
- Environment: list out meta information related to your test environment (versions, configuration, location, etc...)
- Risk: as you test, list out what risks you're looking for while testing
- Coverage: as you test, list out what areas, features, users, data, or other meaningful dimensions you're covering while testing -- it's worth noting that I also instruct them to list out what they didn't have time to cover...
- Techniques: as you test, list out what you're doing... what techniques you're using, how you develop tests, etc... (in math class, this would be the "show your work" section of the document)
- Status: as you test, list out questions that occur to you that you need to get answered later, possible issues/problems/bugs you find while testing, notes about automation or future test sessions, etc....
- Obstacles: as you test, list out things that get in your way or ideas you have for things that would make your testing more effective -- this can be tools, hardware, information, training, etc...
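The template above is easy to turn into a blank note at the start of each session. Here's a throwaway Python sketch that does just that; the section names come straight from the template, while the one-line hints and the function name are my own paraphrases:

```python
from datetime import datetime, timezone

# Section headings taken from the session-notes template above;
# the parenthetical hints are paraphrased reminders, not part of it.
SECTIONS = [
    ("Mission", "what you're testing with this charter"),
    ("Environment", "versions, configuration, location, etc."),
    ("Risk", "risks you're looking for while testing"),
    ("Coverage", "areas, features, users, data covered -- and not covered"),
    ("Techniques", "what you're doing and how you develop tests"),
    ("Status", "open questions, possible bugs, notes for future sessions"),
    ("Obstacles", "things in your way; ideas to test more effectively"),
]

def new_session_note(tester: str) -> str:
    """Return a blank session note as plain text."""
    stamp = datetime.now(timezone.utc).strftime("%Y-%m-%d %H:%M UTC")
    lines = [f"Session note - {tester} - {stamp}", ""]
    for title, hint in SECTIONS:
        lines.append(f"{title}:  ({hint})")
        lines.append("")
    return "\n".join(lines)

if __name__ == "__main__":
    print(new_session_note("Alex"))
```

You could just as easily keep the skeleton in a plain text file or wiki page; the point is simply to start every session with the same headings so nothing gets forgotten mid-session.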
For those readers who do session based testing on a regular basis, you'll notice I don't capture some of the classic items like setup time and time spent investigating issues. If you need to capture those metrics (or other metrics your team uses), simply add them in. Over time your session notes morph to become your own and you'll develop a format that works for you.
I'd be interested to see what other people capture.
When doing exploratory testing, I like to have a stopwatch running so I can keep track of where I'm at in my session. One of the most common tools I use for this is XNote Stopwatch. It has some useful features, like "Always On Top" and "Transparency", that make it especially handy while testing.

Sometimes when I ask someone what their test charter is, I get a paragraph in response. That's not bad, but I find it often leads to a poorly understood scope definition, which leads to a lack of focus while testing, which leads to a charter that runs way too long or feels unproductive. I have a trick I use to help simplify the mission of the charter when this happens.
Try using the following template:
"My mission is to test <insert risk here> for <insert coverage here>."
Some examples:
- My mission is to test for various boundary errors for Microsoft Word's bullets and numbering feature.
- My mission is to test for accurate error messaging pop-ups for Ford Motor Vehicle's Build and Price website.
- My mission is to test for SQL injection vulnerabilities for application login and administration screens.
- etc...
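The template is really just a fill-in-the-blanks sentence, which a two-line helper can make concrete. This is a hypothetical sketch (the function name is mine); note the worked examples above all phrase the risk as "for <risk>", so the helper does the same:

```python
def charter_mission(risk: str, coverage: str) -> str:
    """Fill in the mission template:
    "My mission is to test for <risk> for <coverage>."
    """
    return f"My mission is to test for {risk} for {coverage}."

# Reproduces one of the example missions above.
print(charter_mission("various boundary errors",
                      "Microsoft Word's bullets and numbering feature"))
```

Forcing the mission through a fixed sentence like this is the whole trick: if you can't fill in the two blanks crisply, the charter's scope probably isn't clear yet.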
You might then still use the original paragraph to help detail out the charter, but getting a clear and concise mission helps me better estimate how much time I'll need to test, and maintain better focus while testing.