Posts in Performance Testing
Stella is a handy web application testing tool developed by Solutious Inc, a small software company based in Montreal. It's a lightweight Ruby tool for both functional and performance testing. Like JMeter, Stella doesn't simulate a browser; it generates HTTP requests and parses the responses. It currently supports automatic parsing of HTML, XML, XHTML, YAML, and JSON.
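To make that request-and-parse cycle concrete, here's a minimal sketch in plain Ruby of what a tool like Stella automates: pick a parser based on the response's content type and hand back structured data. This uses only the standard library and is not Stella's actual API.

```ruby
require "json"
require "rexml/document"

# Pick a parser based on the response's Content-Type header.
def parse_body(content_type, body)
  case content_type
  when /json/
    JSON.parse(body)
  when /xml|html/
    REXML::Document.new(body)
  else
    body # fall back to the raw string (YAML etc. could be added similarly)
  end
end

# Example: a canned JSON response such as a test run might capture.
parsed = parse_body("application/json", '{"status":"ok","load_ms":412}')
puts parsed["load_ms"]  # => 412
```

A real load tool would wrap this in a loop that fires requests concurrently and records timings, but the parsing step is this simple at its core.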

A few months ago, I wrote an introductory article on Stella; you can find it here.
Today's tip comes via Tim Koopmans. Tim recently posted on landing page load time and how tools like BrowserMob can help. Based on his post, I went over and took a look at BrowserMob and ran a couple of tests on my personal website. There were a couple of interesting things I found in the free tool they provide:

  • They provide test results from four locations: Washington DC, Dublin, San Francisco, and Dallas.

  • They provide historical results across test runs.

  • They provide a detailed breakdown of load times by object, by download site.

Based on these detailed results, I was even able to find a couple of 404s showing up in my current WordPress theme. I rather like the simple interface, and I find tools like this can be quite helpful when taking an initial look at a site's load time and where that time is going.
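That per-object breakdown is the kind of check you can also rough out yourself. Here's a small Ruby sketch of the idea: scan a page's HTML for asset references and flag any that come back 404. The status lookup is injected as a lambda so the example runs offline; in practice it would be an HTTP HEAD or GET per asset.

```ruby
require "rexml/document"

# Extract src/href asset URLs from an HTML fragment (must be well-formed).
def asset_urls(html)
  doc = REXML::Document.new(html)
  urls = []
  doc.elements.each("//img")  { |e| urls << e.attributes["src"] }
  doc.elements.each("//link") { |e| urls << e.attributes["href"] }
  urls.compact
end

# Return the subset of asset URLs whose HTTP status (via `status_of`) is 404.
def broken_assets(html, status_of)
  asset_urls(html).select { |url| status_of.call(url) == 404 }
end

page = '<html><body><img src="/logo.png"/><link href="/old.css"/></body></html>'
statuses = { "/logo.png" => 200, "/old.css" => 404 }
puts broken_assets(page, ->(u) { statuses[u] }) # prints /old.css
```

A service like BrowserMob adds per-object timing and multi-location runs on top of this, but the core "find the broken objects" check is straightforward.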
Tie performance to business goals
Following up on the performance pitch post, here are some tips for helping get your technology team talking in the language of your business team:

  • Take the time to define both top-line and bottom-line application business metrics

  • Work to prioritize that list of metrics to better understand what’s most important (creating tiers can help)

  • Identify what processes and transactions will affect those key metrics

When the team finds a possible performance issue later on, they can then translate what might otherwise be a generic metric (we're X seconds slower on transaction Y) into something that has meaning to the business (given that we're X seconds slower on transaction Y, we expect abandonment to go up Z%).

Defining those metrics, prioritizing them, and tying them to transactions doesn't necessarily need to be complicated. For some applications it will be. But for most applications I've tested, I suspect we could have done this over a couple of one-hour workshops using Excel. Don't make it harder than it needs to be.
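The translation step above can even be captured as a trivial calculation. Here's a back-of-the-envelope Ruby sketch; the abandonment rate per extra second is an assumed, illustrative figure, and in practice it would come from your own analytics or published industry studies.

```ruby
# Assumed for illustration: +7% abandonment per extra second of load time.
ABANDONMENT_PER_EXTRA_SECOND = 0.07

# Turn a raw slowdown into a business-facing estimate.
def projected_abandonment_increase(extra_seconds)
  extra_seconds * ABANDONMENT_PER_EXTRA_SECOND
end

# "We're 2 seconds slower on checkout" becomes:
increase = projected_abandonment_increase(2.0)
puts format("expected abandonment up %.0f%%", increase * 100)
```

The arithmetic is trivial; the value is in having agreed on the rate and the key transactions ahead of time, so the translation is ready when a slowdown shows up.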
I'm surprised I haven't posted about soapUI yet given how often I use it. It's one of the first web service testing tools I ever used, and I haven't had much need to look elsewhere. soapUI is great for both functional testing and performance testing. They also have a great service mocking feature.
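soapUI drives all of this through its GUI, but at the wire level a functional SOAP test is just an XML envelope POSTed over HTTP. Here's a minimal Ruby sketch of that layer; the service URL and operation name are hypothetical, and this is not how soapUI itself is scripted.

```ruby
require "net/http"
require "uri"

# Build a bare SOAP 1.1 envelope around an operation and its parameters.
def soap_envelope(operation, args)
  params = args.map { |k, v| "<#{k}>#{v}</#{k}>" }.join
  <<~XML
    <?xml version="1.0"?>
    <soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
      <soap:Body><#{operation}>#{params}</#{operation}></soap:Body>
    </soap:Envelope>
  XML
end

envelope = soap_envelope("GetQuote", symbol: "ACME")
# To actually send it (skipped here so the sketch stays offline):
#   uri = URI("http://example.com/quote-service")
#   Net::HTTP.post(uri, envelope, "Content-Type" => "text/xml",
#                                 "SOAPAction" => "GetQuote")
puts envelope
```

soapUI's value is everything it layers on top of this: generating requests from a WSDL, organizing test suites, assertions on responses, load profiles, and mock services.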

I've never used the pro version, so I can't speak to those features or the support that comes with it. But I will tell you this, the first (and only) time I had an issue with the tool I contacted the developers and they had a build for me the next day with a fix. I was amazed at the turnaround. I'm sure not all fixes can happen that fast, but it still says something about their passion for the tool and the people that use it.

You can check out the full feature set here: