Archive for December, 2010

2010: A Pedant’s Review

29 December, 2010

I have been reminiscing this evening about the passing of another year.  It has been a year during which I have learned a great deal about myself and my chosen craft.  Many of my experiences have strengthened my understanding of why certain things work well for me in my situation and why other things don’t work so well.

I have been blogging and tweeting more and more during the year, having realised that the thought processes I am going through seem to be of interest to others, which has both surprised me and inspired me further.

One of the subjects which has been giving me a great deal of food for thought is how we come to understand requirements, and a lot of this boils down to the way in which we communicate with each other (as members of development teams), with our customers, and with other stakeholders in the business.  I gained some insight into how to go about uncovering some of these hidden requirements from outside the software testing field: I went to the Old Bailey (the Central Criminal Court in London is in a building called the Old Bailey, for anyone unfamiliar with the term) and listened to some cases being heard.  I followed this up with a visit to the Royal Courts of Justice, one of the higher law courts in the English and Welsh legal system.  It was fascinating to listen to the proceedings and observe the way questioning was pursued.

The testing community throughout the world has been a tremendous source of encouragement and it has been great to read about the proceedings of the various conferences that have gone on during the year.  I have mainly centred my attendance on the Software Testing Special Interest Group (SIGiST) conferences arranged through the BCS (formerly known as the British Computer Society) and the UK Test Management Forums.  These events have all been very valuable to me in my learning and I am grateful to the SIGiST organising committee and Paul Gerrard of Gerrard Consulting respectively for continuing to arrange these events.

Besides the formally arranged conferences, a big thank you must go to Tony Bruce for his sterling work organising the London Tester Gatherings.  What tremendous events these are!  It is great to be able to meet up with fellow testing professionals to discuss our craft in an informal setting.

It has been a privilege during the year to help out with proof-reading and reviewing articles for The Testing Planet, the newspaper produced by the good folks at the Software Testing Club.  Again, this is another vibrant community of testers from all over the world and it has been great to be associated with this.

European Weekend Testing, organised by Anna Baik and Markus Gärtner, has provided a safe place in which to practise the software testing craft.  The weekly missions have always been challenging and a great way to hone existing skills and learn new ones.  Unfortunately I have not had the time to attend every week, but I hope to get along to future sessions whenever I can.  The other weekend I attended a Weekend Testing Americas session hosted by Michael Larsen.  It is a great way to learn from others and become better craftspeople while we are at it.

It was from one of the European Weekend Testing sessions that I realised I needed more help with understanding a technique called ‘Transpection’ which I had read about on James Bach’s website.  I contacted James on Skype to ask for his help and I was able to add that to my armoury for further use.

Through all of these events and activities I have met and chatted online to some amazing people – you all know who you are – thank you for all your support over the past year.  I would like to take this opportunity to wish you all a very happy and prosperous 2011.

Happy testing!


The Testing Planet – Annual Subscriptions Available Now

21 December, 2010

A great and fun way to advance your education in the software testing craft is to read well written articles by well-respected testers.

The Software Testing Club has introduced an annual subscription priced at £15 for UK residents; £21 for residents in the rest of Europe; and £25 for the rest of the world.

For more information go to or e-mail

Digital copies continue to be freely available for download, but there is something nice about a printed copy!

User Experience Testing: Communicating Through the User Interface

20 December, 2010

One of my many interests is how the individual parts of systems – whether they be software-driven or not – communicate with each other.  A lot of time and energy is spent on making sure that the software components work together in different situations but how much time do we devote to making sure that the systems all work together cohesively to form an entire process?  How much time is spent making sure that the process itself can work correctly?

One of the things that I think we need to plan more time for is testing the way humans interact with systems.  I know that this is not easy because time is short and there is a lot of pressure to make sure the computer software side of the system is working correctly – the rest can be handled with training so the argument goes – but I think as testers we should keep bringing the human side of systems to the table in meetings and discussions about the projects we are involved in.

The human side of systems is something that ‘just happens’ when everything is going well, but when it all goes wrong the results can be spectacular.  My favourite example of this is the London Heathrow Terminal 5 opening debacle.  Staff and passengers were unfamiliar with the car park locations, so people were not in place at the right times to move bags around the baggage system and bags began to build up.  This in turn caused a heavy load on the baggage belts, leading to a failure of the automated baggage delivery system, and so on…  Testers, as the eyes and ears of a project, should be vigilant for situations that no-one else has thought of and raise them.  Of course it is possible that the testers on this project had asked these questions and nothing was done about mitigating the risks, but everybody did seem to be taken by surprise at the turn of events on T5’s opening day…

Let’s move on now to another aspect of human-computer interaction: messages and warnings.  I am sure we have all been bemused by the sight of an error message that just says: “An error occurred.” However, put yourself in a user’s shoes for a moment and think about how you would react to seeing the following (I have pulled this from my Application Event Log but the text is pretty much as I remember it appearing on screen in the form of an error message):

Faulting application name: OUTLOOK.EXE, version: 12.0.6539.5000, time stamp: 0x4c12486d

Faulting module name: olmapi32.dll, version: 12.0.6538.5000, time stamp: 0x4bfc6ad9

Exception code: 0xc0000005

Fault offset: 0x00051c7c

Faulting process id: 0x1a18

Faulting application start time: 0x01cb9c9e50018c07

Faulting application path: C:\Program Files\Microsoft Office\Office12\OUTLOOK.EXE

Faulting module path: c:\progra~1\micros~2\office12\olmapi32.dll

Report Id: 0c733b39-0896-11e0-b5bf-00197ed8b39d

The practice of delivering such ‘techie’ messages to end users is commonplace but in my opinion it is a bad approach.  Receiving messages like this is completely bewildering for novice computer users, who are likely to panic and do something that really messes things up.  In my case I knew that an add-in I had installed was incompatible with Outlook 2007, so I did not panic – I understood what I had to do and got on with it – but it made me think of my less experienced friends and colleagues and how disconcerting such a message would be for them.

Let me give you another example from my Application Event Log:

The application (Microsoft SQL Server 2005, from vendor Microsoft) has the following problem: After SQL Server Setup completes, you must apply SQL Server 2005 Service Pack 3 (SP3) or a later service pack before you run SQL Server 2005 on this version of Windows.

I would like to encourage you all to think carefully about the wording of error messages and warnings that are displayed to users.  The above is not a ‘problem’ at all; I simply need to do something else before I can run SQL Server 2005 and there is no cause for alarm.  It might be argued that someone seeking to use SQL Server 2005 is bound to be a competent computer user and therefore does not need much help, but I beg to differ.  I might have been given a task for which I am completely out of my depth, and I do not need to be panicked further.

There is a fine balance to be reached between being able to give enough information so support professionals and developers can debug and understand how to fix a problem (which Microsoft may have done with their message from Outlook above – assuming they are all well versed in hexadecimal) and being informative to users.
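One common pattern for striking that balance is to log the full technical detail for support staff while showing the user a plain-language message that carries a reference id linking the two.  Here is a minimal sketch in Python (the wording, logger setup and function name are illustrative assumptions of mine, not taken from any of the products mentioned above):

```python
import logging
import uuid

# Full technical detail goes to a log that support staff can read.
logging.basicConfig(filename="app.log", level=logging.ERROR)

def report_error(exc: Exception) -> str:
    """Log the 'techie' detail; return a plain-language message for the user."""
    report_id = uuid.uuid4()
    # Exception type, message and traceback are preserved for debugging.
    logging.error("Report %s: unhandled error", report_id, exc_info=exc)
    # The user sees what happened, what to do next, and a reference
    # they can quote to support -- no hexadecimal required.
    return (
        "Sorry, something went wrong and your last action was not completed. "
        "Please try again; if the problem persists, contact support and "
        f"quote reference {report_id}."
    )
```

The reference id plays the same role as the ‘Report Id’ in the Outlook event above: it lets support staff find the full record without the user ever seeing a fault offset.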

If we get the user experience right we stand a much better chance of designing and implementing a system which really does work efficiently because people will not be wasting countless hours trying to understand cryptic messages coming back from the system; they will be less frustrated; and everyone will have a better perception of the system and the organisation that is using it.

I was in a supermarket a few weeks ago and heard a fellow customer remark that “there are always problems at the tills here – nobody seems able to work them”.  Standing in the queue I could see where that perception would come from: two till operators and a supervisor were needed to make sense of a message that had come up on the screen.  For the future good of our craft, testers should be making more noise about the human-computer interaction and user experience problems they can foresee.

This is an area that I am striving to get better at and I hope there are other testers out there who give serious consideration to the user experience they are giving in their systems.

SIGiST 8 December 2010

9 December, 2010

The final SIGiST (Special Interest Group in Software Testing) conference of the year took place yesterday in London and, as usual, was well worth attending.  The theme for the day was “Keynotes – Six of the best” and consisted of talks only on this occasion: six keynotes and one short talk after lunch.  Unlike other SIGiST conferences I have been to there were no workshops, which I always enjoy, but I still found the day inspiring.

Four talks stood out for me as being excellent:

Les Hatton from Kingston University gave a brilliant talk in which he cited the systems controlling space shuttles as an example of excellently engineered systems and then went on to talk about systems which “should never have been allowed to see the light of day”.

One of the ‘highlights’ (that should probably read ‘lowlights’) included the story of his passage through Heathrow Airport earlier this year.  He had printed his boarding card at self-check-in but the systems at security could not read the card; SAS (the airline he travelled with) could not issue a new boarding card because he already had one unless he gave them the assurance he was who he said he was (!); he was then unable to get through security because he had two boarding cards…  As if it could not get any worse the public information displays in the departure lounge had crashed.  As a keen traveller I could really identify with Les’ frustrations here!

The final part of Les’ talk encouraged us to focus on systems thinking and take some of our cues from the laws of physics.  Once you find a bug in a particular area of the system you are likely to find more bugs in that same area.  Don’t give up was the message.

Gojko Adzic gave his excellent talk on Specification by Example.  Once again he made very good use of Prezi and encouraged us to use clear and concise language that our colleagues and customers will understand.   Too much time is wasted by misused terminology.  In a later talk mention was made of test cases and test conditions – actually they could both have been referring to the same thing – so why distinguish between them?

As usual Gojko had lots of examples to illustrate the success of the technique.  I find the concept of ‘living documentation’ particularly valuable and I liked the example of customer service staff referring to the tests that had been run to help answer customer queries.  It makes the tests very powerful because each test is directly addressing a particular problem being faced.
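To make the ‘living documentation’ idea concrete, here is a small sketch in plain Python rather than a dedicated tool: each row is a concrete example agreed with the customer, and the same table serves as both the specification and the automated test.  (The discount rule and the figures are invented for illustration; this is not one of Gojko’s examples.)

```python
# Each row is a concrete business example agreed with the customer.
# The table doubles as living documentation of the (hypothetical) rule:
# orders of 500 GBP or more earn 10%, loyalty members earn a further 5%.
EXAMPLES = [
    # (order total in GBP, loyalty member?, expected discount %)
    (100.00, False, 0),
    (100.00, True, 5),
    (500.00, False, 10),
    (500.00, True, 15),
]

def discount(total: float, loyalty_member: bool) -> int:
    """Hypothetical implementation under test."""
    percent = 10 if total >= 500 else 0
    if loyalty_member:
        percent += 5
    return percent

def check_examples() -> None:
    # Run every agreed example against the implementation.
    for total, member, expected in EXAMPLES:
        assert discount(total, member) == expected, (total, member)
```

Because the examples are executable, a customer service agent (or anyone else) can consult the table knowing it has actually been run against the system, which is exactly what makes the tests powerful as documentation.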

In the afternoon Fran O’Hara from Sogeti Ireland gave a talk on Scrum.  Included in the delegate pack for the conference was a Scrum cheat sheet illustrating the different components of a Scrum project and showing how they fit together.  I took it into work today and our Project Manager has found it very helpful. I thoroughly enjoyed Fran’s talk and particularly liked the idea of having two definitions of ‘done’: one definition describes what it means to be ‘potentially shippable’ and the other defines what ‘done’ means in terms of the current sprint.  There are many projects where it is not feasible to produce a potentially shippable product after one or two three-week cycles and this helps to deal with that.

Susan Windsor from Gerrard Consulting finished the day talking about how we develop ourselves and what it means to be a really good tester.  Susan challenged each of us to become testing gurus, super-testers in our organisations.  This will pay dividends because of the tremendous knowledge that we can bring to the table of how our projects are really going whether we are working in a traditional or more agile context.

Susan discussed the certification issue briefly and reminded us that, although we can get a sense of achievement out of having a certificate, one of the biggest problems with certification is that it is used as a screening mechanism when hiring staff by people who really know very little (if anything) about testing.  Personally, I would add that the syllabus is too restricted in its scope and is based on very traditional testing processes which have been shown to be less efficient than the agile methods being adopted more and more.

Other talks included a career progression report from Erkki Poyhonen where he experienced a paradigm shift without a clutch (cue a Dilbert cartoon), a report of an entity-relationship modelling exercise for testing effectiveness from John Kent and a short talk from Geoff Thompson on the things that have influenced him in testing.

As always the day ended in the Volunteer, where I enjoyed chatting further with fellow testers about our exciting craft.  The best thing for me about SIGiST is the networking and getting to know other testers.  As a result of attending these and other conferences I have built up a network of people with whom I communicate regularly and it has really expanded my knowledge of my chosen craft.  I would encourage everyone to get involved with testing conferences in their various locations because together we can learn a tremendous amount.

Transpection Explored

6 December, 2010



I had a great learning experience last night.  Those of you who were at the European Weekend Testers session or who have read my write-up of the session (here) will know that I attempted a technique called Transpection which I had read about previously at

I was not very happy with my attempt: it just did not feel right.  One of the great things about Weekend Testing sessions is that you can try new things in an environment where it does not really matter and everything becomes part of the learning process.

I decided to solicit the help and advice of James Bach to see where I went wrong and understand what I should have been doing, so I contacted him on Skype.  He readily agreed to help me and to demonstrate Transpection for me.  I am publishing the full transcript of that Skype session (see link above) at James’ suggestion because we think it will be of help to other testers.

I have only lightly edited the transcript, putting some of the statements into paragraphs to aid readability, but the content is all there for you to see.  You will see my own learning process through it and, for those who want to know more about the technique, it should help you understand this really useful aid better.  You will even see where I mistakenly thought James was trying to bring proceedings to a close!

I would like to thank James for his time yesterday evening and for supporting me in this quest.

Feel free to make comments or ask me any questions…

European Weekend Testing – 4 December 2010

4 December, 2010

Due to various commitments over recent months I have not been as regular an attendee at the Weekend Testing sessions as I would have liked.  However the session this afternoon was a great one to come back on.

Our mission was to devise a ‘cheat sheet’ to be used by Helpdesk staff to help them improve the quality of their defect reports.  A lot of questions were asked to clarify what the problems were at the moment, what sort of environments the Helpdesk staff had available to them, whether there were any language issues to be considered, etc.

Ajay Balamurugadas suggested working on this in teams and I had the pleasure of working with him during the course of the afternoon.  We quickly got down to drafting our cheat sheet, starting off by each typing our ideas for what should go into the sheet using a brilliant tool which I had not seen before, which allowed us both to work on the same document and see what each of us was doing in real time.

Ajay asked me to note down the sort of information that I would ask for if he called me for technical support.  My answers spurred us both on and we became much more productive in thinking up the things that would need to go onto the cheat sheet.  Similarly I asked Ajay about how he deals with severity.  In some ways our conversation reminded me of James Bach and Michael Bolton’s work on Transpection, which is a technique I am trying to master.  I really need more practice…

We categorised and smartened up the cheat sheet and Ajay prepared it all as a PDF that we could share with the other weekend testers.

Following this we had the de-briefing session which, as always, was as informative as – if not more so than – the actual mission itself.  I find I learn so much from hearing about how others have tackled problems and finding out how they have put their knowledge of the testing craft to good use.  We had all taken slightly different approaches depending on how we each viewed the audience and what they were trying to do.  Ajay and I had focussed on the Helpdesk staff writing bug reports, but others had concentrated on helping the Helpdesk staff get the right information out of the customer in the first place.

The whole session was really enjoyable.  “Thank you” to Anna Baik for facilitating the session and to all the contributors for their help during the afternoon.  I look forward to joining future sessions as time and circumstances permit.