Yesterday was the first time I’ve taken part in the Software Testing World Cup (online preliminary competition). This is a global event run by well-known faces such as Matt Heusser and Maik Nogens.
For those not familiar with it, the general idea is that you form a team of around 4 people, then test a web page, mobile app or desktop application for a period of 3 hours. At the end you submit a test report along with the defects you’ve raised. Simple, yet challenging and fun! After a few weeks you find out how you did and whether you’re going to the finals. I’ll let you hit the official website to get details of exactly how it all works. http://www.softwaretestingworldcup.com/
I understand that for this 2015 competition there were 70 teams competing, one of which was our Piccadilly Group team of 4. For posterity, they were Adam Smith, Mark Crowther, Mahamed Ali and Hadleigh Cox. Sadly, Claire Cox was wrapped up on client work, so was ‘replaced’ by Hadleigh. Maybe next year, Claire!
Before the event we spent some time working through what our approach might look like. For example, I searched around for mobile emulator solutions and Adam provided a long list of likely heuristics. We grabbed a copy of Explore It! by Elisabeth Hendrickson and talked about where our own strengths lay in terms of testing and technology.
The application under test was a mobile app available at http://www.getroadmap.com/. We armed ourselves with a couple of Android phones, an iPhone and a selection of web browsers for the web page component. As well as these devices to test on, we were provided access to the https://leantesting.com/ defect management tool. The organisers of the Software Testing World Cup also ran a YouTube live stream that we had on a large TV in the room.
Clearly this situation called for an Exploratory testing approach. After a short look over the application and site, we defined the testing areas and drafted some clear charters. These formed the basis for the test activities and discovery of defects. Mahamed and Hadleigh focused on functional testing on the phones, I concentrated on the web site and writing up the Test Report, while Adam did deeper technical testing, looking at performance and security, then reviewed the report.
There were a few minor glitches with the audio and comments on YouTube, but these were quickly resolved. Eventually 200+ people were watching the live stream. This was useful, as the product owner, Matt and team were on hand to ask questions of. Time flew, as you’d expect; we had raised about 15 defects by the time we had just over an hour left. These varied from high severity security-related defects to lower severity defects related to W3C validation issues or user journey annoyances. Up to the last 30 minutes it was all about raising defects, and thankfully the Lean Testing tool was easy to use. In total we got 23 recorded, of which around 10 were Critical and High severity.
The last push was to write up the Test Report. We kept this brief and to the point, at only about 6 pages long. It was emailed out at the last allowed minute. With a confirmatory email coming back, our work was done. Now we wait for the results!
It was a good experience, and while you could run this type of testing session anyway, the competitive aspect made it much more engaging. At the end we were all mentally exhausted! The main thing is it was a fantastic way to get four testers in a room and hone our skills, not only at fast-paced exploratory testing, but at working with each other in such situations. Now that’s valuable!