
Tuesday, 10 November 2015

Software Testing World Cup 2015 - Report

Yesterday was the first time I've taken part in the Software Testing World Cup (online preliminary competition). This is a global event run by well-known faces such as Matt Heusser and Maik Nogens.

For those not familiar with it, the general idea is that you form a team of around 4 people, then test a web page, mobile app or software application for a period of 3 hours. At the end you submit a test report along with the defects you've raised. Simple, yet challenging and fun! After a few weeks you find out how you did and whether you're going to the finals. I'll let you hit the official website for details of exactly how it all works.

I understand that for this 2015 competition there were 70 teams competing, one of which was our Piccadilly Group team of 4. For posterity they were Adam Smith, Mark Crowther, Mahamed Ali and Hadleigh Cox. Sadly, Claire Cox was wrapped up on client work so was ‘replaced’ by Hadleigh. Maybe next year Claire!

Before the event we spent some time working through what our approach might look like. For example, I searched around for mobile emulator solutions and Adam provided a long list of likely heuristics. We grabbed a copy of Explore It! by Elisabeth Hendrickson and talked about where our own strengths lay in terms of testing and technology.

The application under test was a mobile app. We armed ourselves with a couple of Android phones, an iPhone and a selection of web browsers for the web page component. As well as these devices to test on, we were given access to the defect management tool. The organisers of the Software Testing World Cup also ran a YouTube live stream, which we had on a large TV in the room.

Clearly, the approach this situation needed was exploratory testing. After a short look over the application and site we defined the testing areas and drafted some clear charters. These formed the basis for the test activities and discovery of defects. Mahamed and Hadleigh focused on functional testing on the phones, I concentrated on the website and writing up the test report, and Adam did deeper technical testing, looking at performance and security, then reviewed the report.

There were a few minor glitches with the audio and comments on YouTube, but these were quickly resolved. Eventually 200+ people were watching the live stream. This was useful, as the product owner, Matt and the team were on hand to ask questions of. Time flew as you'd expect; we had raised about 15 defects by the time just over an hour remained. These varied from high-severity security defects to lower-severity defects related to W3C validation issues or user-journey annoyances. Up to the last 30 minutes it was all about raising defects, and thankfully the Lean Testing tool was easy to use. In total we got 23 recorded, of which around 10 were Critical or High severity.

The last push was to write up the Test Report. We kept this brief and to the point, being only about 6 pages long. It was emailed out at the last allowed minute. With a confirmatory email coming back our work was done. Now we wait for the results!

Closing thoughts
It was a good experience and, while you could do this type of testing session anyway, the competitive aspect made it much more engaging. At the end we were all mentally exhausted! The main thing is that it was a fantastic way to get four testers in a room and hone our skills, not only at fast-paced exploratory testing, but at working with each other in such situations. Now that's valuable!


Tuesday, 6 October 2015

London Test Forum - Be there Tue 24th NOVEMBER!

Hey All,
I caught up with Stacey Howard over at the Reco Group the other day. As well as letting me know about the incredible roles and clients they have, she also told me about the awesome free event they're arranging.
Have a look below and be sure to attend. Be sure to tell your friends and colleagues too!
I will definitely be there to see Rob Lambert, Jonathon Wright and Declan O'Riordan.
From the Eventbrite website
The London Test Forum aims to create a platform for test professionals to share ideas and thoughts on a rapidly growing and changing space. Please join us on Tuesday 24th November 2015 at the Leathermarket to discuss the future of testing.
We have industry-recognised speakers: Jonathon Wright talking about TestOps, Declan O'Riordan on how Agile and DevOps have exacerbated the shortage of security test resources, and finally Rob Lambert on the 10 behaviours an effective employee should show.
  • 5:30pm - Doors open: Drink and Networking
  • 6:30pm - First speaker: Jonathon Wright: The Digital Evolution: TestOps Blueprint
  • 6:45pm - Second Speaker: Declan O'Riordan: Application Security in an Agile or DevOps environment
  • 7:00pm - Third Speaker: Robert Lambert: 10 Behaviours of Effective Employees
  • 7:15pm - Networking and Drinks
... read more on the site and get tickets

Wednesday, 29 April 2015

JavaScript Learning - Resource List

I've had this list of sites sat on my system for some time. As I often get asked about good sites for learning Ruby or JavaScript, I thought I'd share the list here. Feel free to suggest additions and I'll update the list. When I get 5 minutes I'll add a permalink to my website too.

A single resource will rarely teach you all you need to know, or explain it in just the right way for your learning style or current understanding. Hack away at one of the sites, then switch to another to both re-learn what you've been covering and learn new things.

Remember: study for 40 minutes per day for 30 days.

Tools to Code with

Online Reference Books

JavaScript Tutorials

JavaScript Components

JavaScript Frameworks / Libraries


JavaScript Browser Test Automation

JavaScript Game Engine / Framework 

Game Making Tutorials

Wednesday, 8 April 2015

Not knowing - in interviews and meetings

You can’t be anywhere else, right?

I may be a little odd (yeh, you knew that already…), but I like interviews. I like them whether it’s an interview for a job, engagement or trying to land a new client. I like them, because they are exercises in thinking how I can solve business problems with technology and realizing what I don’t know.

Some people get worried about this ‘lack’. I understand why and know it’s not often spoken about. So, let’s talk about it.

You can’t know everything
So often, junior members of staff get worried that they won’t know something they’re asked about.

Firstly, there will ALWAYS be something you don't know of or about. Just today a colleague who is a member of the CIPD advised that some interview questions can follow the STAR^ approach. "The what?" was my reply. Yup: 43 years old, nearly 20 years' experience, no idea what she meant. It will always be the way; relax with it. In fact, revel in it. It's what makes our profession so exciting to be in.
^(Situation, Task, Action, Result)

The problem we have is twofold:

1) the field we are in is extensive and covers business and technology, so being expert and encyclopaedically knowledgeable on everything is impossible; and

2) there is just too much depth in these areas, i.e. too many technologies for you to ever learn. Do you know Java, JavaScript, C#, Ruby, C++ in equal measure? Didn’t think so. How about WebSphere, Rails, Databases, ERMs, Performance or Security testing, in equal measure? What about the business side such as UX design, Business Analysis, Project Management, Service Desk? Qualified and experienced in all fields? Exactly.

You can’t know everything about everything and that’s OK.

You only know what you know
You might be a fresher with zero to a few years of experience, or a gnarly consultant like myself whose next milestone is his second decade in testing. You might be somewhere in between. Here's the news: you can't be anywhere else, personally or professionally, than where you are now. You just couldn't have gained more experience, studied more or been taught more. Again, that's OK.

I know you want to be more professional, more technical, more knowledgeable, more known and so on; hey, me too. Don’t get overly stressed about where you are now, here is the only place you can be.

It’s OK, so long as…
However, there are a couple of catches with this "It's OK" business: you need to know how you're going to respond to the shortfall between your situation and where you'd like to be, and whether here is where you should be.

When you don’t know, when you can’t answer, when the way forward isn’t clear to you – what are you going to do? It’s critical that you know how to respond when you get stuck. Stuck by a client’s need, stuck by and interviewer’s question.

The simplest response is to honestly say you don't know, even if you think that's going to hurt their opinion of you or your organization. You have no other choice: you CANNOT lie. However, you can be ambiguous and indirect; that's different. No one expects you to be 'promoting' what you don't know, so there's no need to broadcast it.

Often, we don’t know things at different levels. I covered a model we could use in my other post Why you’re not a Software Testing Expert. The art is in providing context as to why you don’t know something. For example, I’m more than happy loading data into and pulling data from an SQL database. I can even set up basic tables, keys, etc. However, I cannot tune a database, make it resilient or secure. If asked why, the answer would be; the admin side is something I’ve simply not done or as I focus on testing I work with these systems as provided.

That’s fine. I’ve said I don’t know, but I qualified it with what I do know and why what I don’t know is actually not that important. Hopefully. If it is, then I’m not the right person for the role. If this is a new client I’m trying to land then the answer is different. Perhaps; I’m not 100% clear on that, but I know a couple of people back at base that will be, let me raise it with them and get back to you. Here the point is that as an organization, we know everything so me not knowing is not a problem. Assuming there really is someone back at base, else given we can’t lie we might not be the best company to delivery against the client’s needs.

Do you know how you will respond to not knowing? Can you respond without getting flustered, in a way that makes you feel excited that you've just discovered a new area to study?

Should you be here?
We waste a lot of time and energy not focusing on what's important, then regret it later. A question to ask is whether that's what you do. When the client or interviewer catches you out, should you have known the answer, possessed the knowledge? If so, and it's because you've been intellectually lazy or undisciplined, then shame on you. You've let yourself down, and everyone who loves and possibly relies on you. Seriously. But hey, at least you know it now.

How you develop yourself, and in what areas, is a topic for another post and really something for you to decide. Just realise that one day you may be in front of a client or potential employer and get asked something you know you could know, if only you would make the effort. Given that is so, go put in the effort! On the main website, check out the Professional Development Plan template under Team Pack.

Closing Thoughts
You can only be where you are now, so don’t beat yourself up but excitedly look forward to who you will be months or years from now. Get on with clarifying who that future you is and working backwards, decide how you’re going to transform yourself.

Where do you want to be? How are you going to get there?

Not knowing is OK, just decide how you’ll respond and what the path to ‘there’ is, then get moving.


Enjoyed this post?
Say thanks by sharing, clicking an advert or checking out the links above!
Costs you nothing, means a lot.

Tuesday, 7 April 2015

A Brave New (sub) Reddit (Software Testing Views)

There are a number of subreddits on Reddit already, but none that I felt quite hit the nail on the head. I wanted a place to go where a curated set of links was provided, one that had a focus.

Go to: Software Testing Views.

Hit the link and save it down as a Favorite: [ctrl] + [d] generally does it.

Why a new sub Reddit?
Software Testing Views is about just that, not a shopping list / garage sale / stack of links, but a set focused on views. As stated on the page:

Yes to:
Blog Posts, Forum Discussions (opinions), White Papers, Case Studies, OpEds, Discussion Documents, Conference Reports, Research Findings, Debates, Arguments and Opinions.

No to:
Questions, Adverts, List Links, Non-direct links, Walled content, Content that requires sign-up / log-in

The idea is that this is a collection of links to learning resources and strong arguments. In this way, you can visit any of them and LEARN and be CHALLENGED.

Have a look at the Wiki page for more info.

Be sure to add your links too and drop by often!
Software Testing Views



Wednesday, 1 April 2015

EU-US agreement BANS the job title 'software tester'

It's now law - being called a Software Tester is illegal

In a shock joint move, the US Department of Commerce and the European Union Department for Trade have BANNED the use of Software Tester as an official job title.

Negotiations have been going on for some time into the harmonisation of job descriptions, titles and pay across the technology field. In what appeared to be no more than a Think Tank coming up with yet more bureaucratic nonsense, it follows hot on the heels of ISO 29119.

“We should have seen this coming. Unlike 29119, that we can ignore, this is law” said a source in the IT recruitment industry that wished to remain anonymous.

I called Johan Steiggler, Head of Employment Harmonisation at the EU and asked him why ‘software tester’ was not on the list, “I recall we spoke to a big consultancy about this. Our research told us there were many names in use and the most harmonised approach would be to use the title ‘QA Test Engineer’, so that’s what we’ve added.”

When challenged about using ‘software tester’ his response was terse, “you could do, but that would breach employment law”.

So there you have it, software tester is dead.

Read the announcement on the EU website.



Tuesday, 31 March 2015

Peak-End, or how to deliver in, then leave, a testing role.

The curious nature of our monkey brains never ceases to amaze me. One model for our thinking that caught my attention a few years ago is an idea called Peak-End. It's the simple idea that we remember only the peak of any experience, and the end. There is a complication here: what we remember is the memory of the event, not the experience itself. There's been a lot said about the remembering self and the experiencing self; I'll let you Google it.

The significance here is that your employer is also an owner of a monkey brain which possesses a remembering and experiencing self. You might want to keep that concept to yourself though.

We’d like to think that when we leave a role, the employer or client will remember the experience of us working for them. We hope they’ll be mindful of the many days we delivered consistently, provided all those reports on time, finished the testing on time every time, worked late, etc. I’m here to tell you they won’t.

What they will remember is the big event, the high point or possibly the low point. Think back to your prior employments and contract engagements: what do you remember? I remember big launches, major defects, moments when unexpected change happened. We forget everything else because all those day-to-day, getting-the-job-done things were just like any other day, nothing significant enough to remember.

That will happen; still, the core of a good delivery is the consistent and steady achievement of the task you've been set. If we do nothing else, we need to achieve what we were asked to do.

However, we’re looking to be remembered, right? To do that we need to work the Peak-End rule a little.

Do one thing amazing, then celebrate success
During a project or contract engagement work hard to do at least one thing really, really, really well. Do it so well that it re-shapes your thinking about how that one thing is done. Push yourself, put the hours in, do the research, deliver something no one will forget. Do this and you’ll create a peak in the experience that will form part of the ‘remembering’ later on.
This could be anything: finding a show-stopping defect, finishing all the tests when it looked impossible, fixing a deployment issue and safeguarding the release. The best tactic is just to keep doing the best you can; keep pushing for excellence and you will hit on that one amazing thing.

Next, celebrate success. It's no good doing awesome work if no one recognises it for what it is. SPINACH – Self Promotion Is Not A Crime Here :) If you're a manager, call out the successes of your team; if you're a team member, make your great work known. Maybe that's in stand-ups, weekly reports or conversations with your manager.

End on a Crescendo
Eventually, we all leave jobs or engagements. Whether you're getting pushed off a role or leaving on your own, treat it the same and push yourself even harder to do good work.

As we know from Peak-End, it's critical to end well. You must end on a high, even if that's just a really solid wrap-up and handover. How many people have left a role with a half-arsed handover? What do you think of that, looking back? Nothing positive, I guess, and it likely clouds your opinion of that person.

Now if you can, think of someone that left on a high, doing great work, still putting in (even more) energy. What do you think of them? Someone you’d like to hire again or work with in the future? That’s the value of the End being strong and not just fizzling weakly out.

Hey, all’s well, that ends well, as they say.




Monday, 30 March 2015

Actually, there ARE Best Practices.

Being the contrarian that I am, I of course think there ARE best practices. However, as is often the case, it comes down to what's meant by the phrase and the position you take on using it. I'll admit that before writing the below I've stated there are no best practices, but I was lying slightly.

As I recall it, about 5 or 6 years ago, when the re-thinking around Best Practice kicked-off in the test community, it was timely. We were definitely in an era when the Big Consultancies had to be given a firm ‘No’ to the restrictive tools and process models that were becoming the norm. I don’t think it’s an overstatement to say if the push-back hadn't happened, we’d be under the heel of consultancies driving the definition of Best Practice and rolling out the tools to carry it out.

It was already happening; both HP and Microsoft had their respective tools, and companies like PwC were adding in processes they would impose on engagements. Best Practice had become defined by whatever those with the strongest grip on senior management said it was. What they said it was: a way to ensure success. What it actually was: whatever increased their footprint and made money, of course. It won't take you long to recall from memory the glorious failures of the big consultancies. A quick Google search will turn up more.

Sadly, it looked like that was happening within the test community itself too. ISTQB, splitting out into a set of exams, was starting to position itself as the de facto exam set that defined the testing profession's education. Some luminaries of the testing profession helped put the original Foundation exam together, then stepped back and started to attack it when they saw what the real agenda might be.

The backlash and war cries were understandable. Every revolution starts with trashing the old ways and thinking and attempts to reshape or reclaim certain ideas and words. 

However, oftentimes the things being fought against are still there. You can shout "There's no such thing as best practice!" all you like, but there still will be, and the world will still roll on stating it's so, even though I recall blogging the opposite about 9 months ago. There's a caveat to this argument, as always.

Intellectual Vs Pragmatic
You have a choice as a consultant and a testing professional. You can fight and resist, or you can go with the flow and subvert the thinking to achieve your aims. I think that's in The Art of War or something. Feel free to hit the forums and blogs and engage in the robust intellectual arguments that go on about best practice, testing vs checking, waterfall over scrum over agile over whatever this week. In fact, please do. Don't be intellectually lazy.

You HAVE TO understand the various viewpoints to get good at what we do.

As a test professional, you have an obligation to yourself and the community to understand as thoroughly as possible, the arguments and thinking, so you can apply them in practice. You have to gain as much understanding as possible and then subvert it for your own aims.

You also have an obligation to be pragmatic.

Taking a rigid, positional stance that there is no best practice is where this line of argument comes off the rails for me. When I go to meet a potential client for the first time (assuming I'm not hearing otherwise from them), you had better believe I talk about best practice; I say testing when I mean checking, heavyweight agile and more. If I didn't, I'd be out of the room before the conversation got started, never mind getting engaged to deliver. Want to open client discussions, or berate an existing client, by correcting terms we know are inaccurate? Bye-bye job prospects.

You had better get good at hearing what people think they mean when they say best practice, testing, performance, security, etc. Find that out by asking probing questions. You might have to talk in terms of best practice; perhaps you can avoid doing so. Later on, when you have their confidence, then is the time to bust out the real truth of matters and reveal the rich depth of nuance. Be pragmatic: you can win the argument and change people's minds only when you've secured a relationship with them first.

So, what’s Best Practice?
Best Practice is anything that’s been tried and proven to work. That’s it. You know about writing a test plan, test schedule, test cases, checklists, raising defects, tracking progress, regression testing, issuing a test completion report, managing a backlog, creating a burn-down, etc? You know how these things are pretty much approached in the same way, give or take a few details? Best Practice.

Best Practice is a model, an approach to a problem that is known to make it more likely you’ll resolve the problem or complete the task.

Did you notice what I omitted there? What you shouldn't be looking for is prescriptive detail on top of this: the exact content of a status report, the precise steps to perform analysis, the specific choice of colours for reporting. It's nonsense to think it could be any other way; not even IEEE 829 or the new ISO/IEC/IEEE 29119 goes that far, so stop looking.

Best Practice is what is in your box of tricks, that you've learned to apply over the years and that you know how to modify to fit the client’s needs.

You are building that personal tool kit, right? It's best practice, don't you know.



Sunday, 29 March 2015

Understand the context of the Testing Problem

In various other posts and papers, I've said that testing is testing: that if we're suitably skilled and experienced in our profession, then we can test in almost any environment and industry. I believe this remains as true as when I first said it.

To quote directly from the FX Kickstart for Testers paper;

As a competent test professional you'll know that fundamentally you can test in any domain. That domain could be travel, healthcare, engineering, retail, etc. and of course finance. Ultimately we can say with sincerity, it's all software testing, and the skills and knowledge of our profession that we possess apply no matter what the domain.

However, there are two very real caveats with this perspective that will limit our effectiveness in a given domain until overcome:

  • We will initially lack understanding of the testing challenges in the domain, thereby limiting our application of contextually relevant testing techniques and approaches
  • The domain will have unique vocabulary and concepts that must be understood if we are to translate its meaning and communicate effectively

Once we address these caveats, we will understand the unique testing needs of the domain in a contextually relevant way and be able to communicate effectively with our colleagues.

The point is, to deliver effective and pragmatic testing we need to understand the client’s business. Let’s look more closely at the caveats above.

A lack of understanding of the testing challenges in the domain
A mistake inexperienced consultants make is to roll into a client's site with a preconceived idea of what a testing solution will look like, yet with scant knowledge of the problems the client is facing.

We see this with many of the larger consultancies. Their team arrives with all the answers already cooked up: the direction a solution will move in, the shape it will take, the tools to be used, etc. are all mostly known in advance. That's not always a bad thing. If it's known the testing problem is the integration of new development with existing systems, then certain approaches will be known. Maybe the problem is specific, a lack of security testing for example, in which case there are some industry-standard, yes even best practice, approaches to take.

On a new engagement it's important to take the time to put off solutionising and ensure we're clear on such things as:
  • What the nature of the problem is
  • What problems it is causing to current business operations, and what limitations it imposes
  • What is preventing the client from solving the problem themselves
  • What attempts have been made to solve the problem, and what the outcomes were
  • The timescales for solving the problem and why, and whether anything depends on the test work

Unique Vocabulary and Concepts
I stated above that this applies to our ability to understand the language of the business, so we can communicate clearly in a dialect the business will understand. It's more than that, however. Just saying an environment is System Integration instead of Development Integration, or the other way around, is not the big issue, though it certainly helps to get that right too.

The big issue is with the concepts, and I would expand that to be quite inclusive: the ideas, ways of thinking, ingrained culture, internal politics, external pressures, etc. Understanding these is just as important to the success of your consulting engagement as knowing what the testing problem actually is and how they talk testing in the business.

We should also check whether we need to understand other aspects, such as:
  • What kind of environment the client has in place culturally: open, communicative, blaming
  • How they work technically: adaptive / descriptive, predictive / prescriptive, hybrid
  • Whether they are subject to professional or legal compliance and auditing
  • Any issues with public perception, or on a branch / regional basis

Have a look at the free preview pages for Flawless Consulting; scroll down to where it says 'Consulting Skills Preview'. This is valuable additional reading.

Testing is just that – software testing. We can be confident in that.

What changes is the setting in which the testing problem we need to solve exists. To ensure our testing is relevant and effective, we serve the client best when we understand the context of their testing problem.

Enjoyed this post?
Say thanks by sharing, clicking an advert or checking out the links above!
Costs you nothing, means a lot.

Replacing QA with 100% automation

A question came up on one of the forums recently that echoes a question I've been asked at some point, and I'm sure you have been too. That question being along the lines of reaching 100% automation. Groan… yes, that old chestnut.

As experienced testers we already know the answer, for the avoidance of doubt the answer is: No.

Thanks for reading, now on with my day!


OK, OK… we wish the ill-informed question could be dismissed so easily.

However, though we know it's asked by the ill-informed, we can't blame people for asking. To senior management and non-technical, non-testing staff, it's a reasonable question. It's our job to respond in a way that ensures we leave them informed.

To wit… if asked, here are a few thoughts on why we can't achieve 100% coverage via automated testing. (We'll skip the QA vs Testing perspective for now; hey, they're ill-informed, just roll with it.)

You could also send them this link of Case Studies on Automation, by Dorothy Graham (RTFM…? :)

It isn’t possible to achieve 100% test coverage of the application/system with manual testing, so strictly it can’t be possible to do the same with automation. Though interestingly we might expand coverage…

Technical Limitations
There are aspects of the system you can check and test manually that can't technically be automated. Examples include accessibility and usability; others include complex end-to-end scenarios that might require exploratory or dynamic testing dependent on variable results.
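As a sketch of why that is: an automated check can only confirm conditions we anticipated and encoded in advance. Everything in this Ruby example is hypothetical, the HTML is stubbed and the expected title is invented; in a real suite the page would come from a driver such as Selenium WebDriver.

```ruby
# Minimal sketch: an automated check verifies exactly what it was told
# to verify, and nothing more.
def page_title(html)
  html[%r{<title>(.*?)</title>}m, 1]
end

html = "<html><head><title>Checkout</title></head>" \
       "<body><p style='color:#eee'>Pay now</p></body></html>"

expected = "Checkout"
actual   = page_title(html)
puts(actual == expected ? "PASS" : "FAIL: got #{actual.inspect}")
# => PASS

# The check passes, yet it says nothing about the near-invisible
# light-grey "Pay now" text, whether the page works with a screen
# reader, or whether the flow makes sense to a person. Those findings
# still need a human exploring the product.
```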

Automation of what?
The different levels of testing (unit, integration, system, etc.) mean we'd need suites of different automated tests, focusing on different aspects of the system or application.

Tools and Team Skills
If we wanted to 'automate everything' we'd need a host of tools: web front end, application interface, API testing, web server / DB / CDN testing. Maybe 100% automation means covering just one of these?

Assuming we define what 100% means in this context, do we even have the tools in place or the staff who know how to use them? If not, there will be some manual testing going on.

Don’t have time / money
The return on investment from automating everything just won't be there. So, even if we had the tools, time, skills and a limited automation remit, it's highly unlikely the time and money needed would be worth it. Automating all of a payment system that'll be around for 10 years is more plausible than automating a website that'll be trashed in 6 months.
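A rough breakeven calculation is one way to make that ROI argument concrete to management. All the effort figures in this Ruby sketch are invented for illustration:

```ruby
# Hypothetical breakeven: when does automating a single check pay off?
build_cost_hours = 8.0   # one-off effort to automate the check
maintain_per_run = 0.05  # upkeep effort each time it runs
manual_per_run   = 0.5   # effort to run the same check by hand

# Automation pays off once cumulative manual effort exceeds the build
# cost plus cumulative maintenance:
#   manual_per_run * n > build_cost_hours + maintain_per_run * n
breakeven_runs = (build_cost_hours / (manual_per_run - maintain_per_run)).ceil
puts "Breakeven after #{breakeven_runs} runs"
# => Breakeven after 18 runs
```

With these made-up numbers, a payment system regression-tested weekly for 10 years clears the breakeven many times over; a site that's thrown away after 6 months may never reach it.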

What other reasons can you think of?

