I used to be quite religious about automated testing, but lately I’ve come to realise that sometimes I might as well just run the app and see what happens. Automated tests can’t cover everything. Here are a few interesting ‘untestables’ that I found while developing my Tetris-like game, Hellbound, with JBehave’s Story framework:
Abstractions that are divorced from the business value
The Glyph (that’s the shape which falls down the pit in the game) falls faster with each successive fall; the ‘heartbeat’ which drives the falling accelerates gradually. I could always stub the system clock, but I can’t stub out java.lang.Object.wait(). Having a real heartbeat in the automated tests would be a pain anyway, because it would take ages for the tests to run.
Dave suggests making a Waiter and functionally verifying it just the once, but then it becomes abstracted from the value I want – which is that the game should become steadily more challenging. The only way to find out whether that behaviour is delivered is to play.
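For what it’s worth, a sketch of what that Waiter might look like – the names here are my own inventions, not Hellbound’s actual code. The heartbeat depends on the interface, the real implementation wraps Object.wait() and gets verified once, and tests can pass in a stub. Notice, though, that nothing in it proves the game *feels* steadily more challenging:

```java
// Hypothetical sketch, not Hellbound's real code: a Waiter the heartbeat
// depends on, so tests never have to touch Object.wait() directly.
interface Waiter {
    void pause(long millis) throws InterruptedException;
}

// The real thing, functionally verified just the once.
class RealWaiter implements Waiter {
    private final Object lock = new Object();
    public void pause(long millis) throws InterruptedException {
        synchronized (lock) {
            lock.wait(millis);
        }
    }
}

// The heartbeat shortens its interval on every beat -- which is where
// the "steadily more challenging" behaviour is supposed to come from.
class Heartbeat {
    private final Waiter waiter;
    private long interval;

    Heartbeat(Waiter waiter, long initialInterval) {
        this.waiter = waiter;
        this.interval = initialInterval;
    }

    // Pause for the current interval, then shrink it by 10%,
    // flooring at 50ms so the game stays playable. Returns the
    // interval it just used, which is handy for tests.
    long beat() throws InterruptedException {
        long current = interval;
        waiter.pause(current);
        interval = Math.max(50, interval * 9 / 10);
        return current;
    }
}
```

A stub Waiter that just records the requested pauses lets a test assert the intervals shrink; whether shrinking intervals actually make the game more fun is exactly the bit only playing can tell you.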
Code that gets collected by the garbage man
After between 5 and 50 glyphs, depending on how fast I was playing, the Glyphs stopped responding to the heartbeat because the garbage man took my listener away. Nice. Since my automated tests are mostly CPU-bound, I think they might take even more glyphs to come across that bug. Maybe it explains the very occasional intermittent failure which I’ve encountered.
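I can only guess at the exact shape of the bug, but the classic version of it looks like this: if the heartbeat holds its listeners through WeakReferences (or registers them with something that does), any listener with no other strong reference vanishes whenever the collector gets round to it – sooner in a busy interactive session than in CPU-bound tests. This is a hypothetical reconstruction, not Hellbound’s code:

```java
import java.lang.ref.WeakReference;
import java.util.ArrayList;
import java.util.List;

// Hypothetical sketch of the failure mode: the heartbeat only weakly
// references its listeners, so once the caller drops its own reference,
// a GC run silently unsubscribes the Glyph.
class LeakyHeartbeat {
    private final List<WeakReference<Runnable>> listeners = new ArrayList<>();

    void addListener(Runnable listener) {
        listeners.add(new WeakReference<>(listener));
    }

    void beat() {
        for (WeakReference<Runnable> ref : listeners) {
            Runnable listener = ref.get();
            if (listener != null) {
                listener.run();   // after collection, this simply never fires
            }
        }
    }
}
// The cure: whoever cares about beats keeps a strong reference to its
// listener for as long as it's interested -- e.g. a field on the Glyph.
```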
Apps that keep going without you
I could only see it with the logging turned on, but the game was still running after I’d closed the frame. I could see the logs of every glyph being created, falling down the (now non-existent) pit, with each fall getting shorter and shorter until eventually the game was over. I had to tie the heartbeat thread to the closing of the window. Maybe this comes under scenarios nobody thought of.
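The fix, roughly – my reconstruction, not the actual Hellbound code – is to give the heartbeat thread an explicit shutdown and call it from the frame’s windowClosing handler. The Swing wiring lives in a comment here, since a JFrame won’t run headless:

```java
// Hypothetical sketch: a heartbeat thread that can be stopped when the
// window goes away, instead of beating on into a non-existent pit.
class HeartbeatThread extends Thread {
    private volatile boolean running = true;

    public void run() {
        while (running) {
            // ...fire a beat to the listeners here...
            try {
                Thread.sleep(10);     // stand-in for the real interval
            } catch (InterruptedException e) {
                return;               // interrupted: time to die
            }
        }
    }

    void shutdown() {
        running = false;
        interrupt();                  // wake it if it's mid-pause
    }
}

// Wired to the frame, something like (not compiled here):
//   frame.addWindowListener(new WindowAdapter() {
//       public void windowClosing(WindowEvent e) { heartbeat.shutdown(); }
//   });
```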
Tests that mysteriously disappear from the build
The only way I’ve found to catch this is to occasionally break an obscure test and check that the build is still broken. Fortunately I only have to remember to do this when I’m in particularly fine coding form and not doing it by accident anyway.
Scenarios nobody thought of
My game works! Left, right, rotate, drop, move down, heartbeat. Of course, you can move the glyph left ten times and straight off the edge of the pit, but we didn’t think of that.
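Once found, it’s a scenario worth writing down: the glyph should refuse to move past the pit walls. A minimal sketch, with the column numbering and names invented for illustration:

```java
// Hypothetical sketch: a glyph clamped at the pit walls, so ten
// presses of 'left' can't carry it off the edge.
class Glyph {
    private int column;
    private final int pitWidth;

    Glyph(int startColumn, int pitWidth) {
        this.column = startColumn;
        this.pitWidth = pitWidth;
    }

    void moveLeft()  { if (column > 0) column--; }
    void moveRight() { if (column < pitWidth - 1) column++; }

    int column() { return column; }
}
```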
Manually testing the code you’ve written isn’t just important, it’s also quite fun (especially if you’re writing Tetris). I get a kick out of seeing the value that I’ve been working towards appear; I’m sure you do too.
However, for every bug I’ve mentioned here, there have been 50 that my automated tests have picked up and warned me about; plus they allowed me to eliminate a dozen potential culprits when pinning down the heartbeat bug. Logging did the rest.
Don’t think of automated testing. Think of the value you’re trying to get from your system, and how to find out if you’ve got it. This saves wasting time on untestables, and (for me) creates cleaner, more extensible code, because I’m thinking of the domain while I write it.
Of course, I don’t think of them as tests, but ‘executable scenarios’. JBehave has come a long way in the last year and we’re hoping for a 0.9 pre-release soon (when I’ve worked out how to rebuild the website and added some decent examples), followed by a 1.0 release as soon as feedback suggests it’s ready. I’ve used BDD with JUnit on a few projects now with success, but only at a unit level. I’ve shown a couple of customer-proxies (BAs) the stories we’re running in Hellbound, and they were quite excited at the idea of being able to read the code, and even help to write it. So am I. Watch this space.