User Experience Testing: Communicating Through the User Interface

20 December, 2010

One of my many interests is how the individual parts of systems – whether they are software-driven or not – communicate with each other. A lot of time and energy is spent on making sure that the software components work together in different situations, but how much time do we devote to making sure that the systems all work together cohesively to form an entire process? How much time is spent making sure that the process itself works correctly?

One of the things I think we need to plan more time for is testing the way humans interact with systems. I know this is not easy: time is short and there is a lot of pressure to make sure the software side of the system is working correctly – the rest, so the argument goes, can be handled with training – but I think as testers we should keep bringing the human side of systems to the table in meetings and discussions about the projects we are involved in.

The human side of systems is something that ‘just happens’ when everything is going well, but when it all goes wrong the results can be spectacular. My favourite example of this is the London Heathrow Terminal 5 opening debacle. A lack of familiarity among staff and passengers with car park locations led to a baggage build-up, because people were not in place at the right times to move bags around the baggage system. This in turn put a heavy load on the baggage belts, leading to a failure of the automated baggage delivery system, and so on… Testers, as the eyes and ears of a project, should be vigilant for situations that no-one else has thought of and raise them. Of course it is possible that the testers on this project had asked these questions and nothing was done to mitigate the risks, but everybody did seem to be taken by surprise by the turn of events on T5’s opening day…

Let’s move on now to another aspect of human-computer interaction: messages and warnings. I am sure we have all been bemused by the sight of an error message that just says: “An error occurred.” However, put yourself in a user’s shoes for a moment and think about how you would react to seeing the following (I have pulled this from my Application Event Log, but the text is pretty much as I remember it appearing on screen as an error message):

Faulting application name: OUTLOOK.EXE, version: 12.0.6539.5000, time stamp: 0x4c12486d
Faulting module name: olmapi32.dll, version: 12.0.6538.5000, time stamp: 0x4bfc6ad9
Exception code: 0xc0000005
Fault offset: 0x00051c7c
Faulting process id: 0x1a18
Faulting application start time: 0x01cb9c9e50018c07
Faulting application path: C:\Program Files\Microsoft Office\Office12\OUTLOOK.EXE
Faulting module path: c:\progra~1\micros~2\office12\olmapi32.dll
Report Id: 0c733b39-0896-11e0-b5bf-00197ed8b39d

The practice of delivering such ‘techie’ messages to end users is commonplace, but in my opinion it is a bad approach. Receiving messages like this is completely bewildering for novice computer users, who are likely to panic and do something that really messes things up. In my case I knew it was an add-in I had installed that was incompatible with Outlook 2007, so I did not panic – I understood what I had to do and got on with it – but it made me think of my less experienced friends and colleagues and how disconcerting such a message would be for them.

Let me give you another example from my Application Event Log:

The application (Microsoft SQL Server 2005, from vendor Microsoft) has the following problem: After SQL Server Setup completes, you must apply SQL Server 2005 Service Pack 3 (SP3) or a later service pack before you run SQL Server 2005 on this version of Windows.

I would like to encourage you all to think carefully about the wording of error messages and warnings that are displayed to users. The above is not a ‘problem’ at all; I need to do something else before I can run SQL Server 2005, and there is no cause for alarm. It might be argued that anyone seeking to use SQL Server 2005 is bound to be a competent computer user and therefore does not need much help, but I beg to differ. I might have been given a task for which I am completely out of my depth, and I do not need to be panicked further.
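To make the point concrete, here is a minimal sketch (in Python) of how a prerequisite check could phrase its outcome as an instruction rather than a ‘problem’. The names (SetupMessage, check_service_pack) and the structure are my own invention for illustration, not anything taken from Microsoft’s setup code:

from dataclasses import dataclass
from typing import Optional

@dataclass
class SetupMessage:
    severity: str    # "action required", "warning" or "error"
    summary: str     # short, plain-language statement of the situation
    next_step: str   # what the user should actually do now

def check_service_pack(installed_sp: int, required_sp: int = 3) -> Optional[SetupMessage]:
    """Return guidance if a newer service pack is needed, or None if all is well."""
    if installed_sp >= required_sp:
        return None
    return SetupMessage(
        severity="action required",  # nothing is broken, so it is not an error
        summary=(f"SQL Server 2005 needs Service Pack {required_sp} or later "
                 "before it can run on this version of Windows."),
        next_step=f"Install Service Pack {required_sp}, then start SQL Server.",
    )

msg = check_service_pack(installed_sp=2)
if msg:
    print(f"[{msg.severity}] {msg.summary}")
    print(f"What to do next: {msg.next_step}")

Separating the severity, the summary and the next step forces the wording to answer the question the user actually has: what do I do now?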

There is a fine balance to be struck between giving enough information for support professionals and developers to debug and fix a problem (which Microsoft may have achieved with the Outlook message above – assuming they are all well versed in hexadecimal) and being informative to users.
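One way of striking that balance, sketched below with invented names rather than any particular vendor’s API, is to write the full technical detail to a log keyed by a report ID (much like the Report Id in the Outlook entry above) and show the user only a calm, plain-language message quoting that same ID:

import logging
import traceback
import uuid

# Full detail goes to a log that support staff and developers can read.
logging.basicConfig(filename="app_support.log", level=logging.ERROR)

def report_failure() -> str:
    """Log the developer-facing detail and return a user-facing message."""
    report_id = uuid.uuid4()
    # Everything a support professional might need, keyed by the report ID...
    logging.error("Report %s\n%s", report_id, traceback.format_exc())
    # ...while the user sees a calm explanation and a reference to quote.
    return ("Sorry, something went wrong and the operation could not be "
            "completed. If you contact support, please quote reference "
            f"{report_id}.")

try:
    1 / 0  # stand-in for whatever actually failed
except Exception:
    print(report_failure())

Support staff get everything they need from the log; the user gets an apology and a reference number instead of a hexadecimal dump.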

If we get the user experience right we stand a much better chance of designing and implementing a system that really does work efficiently: people will not waste countless hours trying to understand cryptic messages coming back from the system; they will be less frustrated; and everyone will have a better perception of the system and the organisation that is using it.

I was in a supermarket a few weeks ago and heard a fellow customer remark that “there are always problems at the tills here – nobody seems able to work them”. Standing in the queue I could see where that perception came from: two till operators and a supervisor were needed to make sense of a message that had come up on the screen. For the future good of our craft, testers should be making more noise about the human-computer interaction and user experience problems they can foresee.

This is an area I am striving to get better at, and I hope there are other testers out there who give serious consideration to the user experience their systems deliver.