
Dutch Exploratory Workshop in Testing (DEWT) 3

22 April, 2013

This is an expanded version of my post published on the Allies Computing Ltd blog: http://www.alliescomputing.com/blog/general-systems-thinking-dutch-exploratory-workshop-testing/

The third Dutch Exploratory Workshop in Testing took place over the weekend 20 – 21 April 2013 after an informal gathering the previous evening for drinks and testing chat.

The theme for the weekend was Systems Thinking and I was glad I had taken the time to read Gerald Weinberg’s book “An Introduction to General Systems Thinking” to prepare myself. I also had the opportunity to discuss General Systems Thinking with James Bach on Friday evening before the conference and to reflect overnight on our conversation. This proved very useful mental preparation for the day ahead so thank you James!

The Saturday started with a check-in where we introduced ourselves and explained any impediments there might be to our ability to contribute or any concerns we had about the weekend. James Bach also gave everyone a primer on what General Systems Thinking is.

Having established ground rules for the weekend, appointed Joris Meerts as programme chair/content owner, and agreed facilitation points of order, Rik Marselis kicked off the main part of the conference with a discussion of what he had learned about General Systems Thinking and some examples of situations he had witnessed in different organisations, which led to quite a lot of discussion.

After lunch Ruud Cox showed us an excellent model of the stakeholders in a couple of projects he has been involved with, their interactions and their areas of interest. Ruud explained how the model helps him establish test ideas and shows him areas of the system where problems might be less tolerable (in someone’s opinion).

We also had an experience report from Derk-Jan de Grood on a force field model he used to help him visualise the stakeholders in a project he was involved with and to remind himself whom it is important to maintain contact with.

James Bach followed this up with a further experience report showing us how he and Anne-Marie Charrett have applied Systems Thinking to coaching testers. It was fascinating to see how quickly a model of something ‘simple’ could expose so many surfaces and interfaces of the system that could easily pass you by. One that struck me particularly was a feedback loop, labelled ‘Self Evaluation’, that applied to both the coach and the student. It is something that could easily be overlooked, but it is happening subconsciously all the time and is critical, in my view, to how well such a coaching system evolves.

After voting on topics to discuss in presentations on Sunday we broke off for the day finishing with dinner, some testing challenges and more drinks.

Sunday started off with an experience report from Michael Phillips on some of the dogmas he has seen arising in companies he has recent experience with. The attitudes he gave as examples were twofold:

  • Testers are seen as not being able to keep up with the pace of development; and
  • Testers are seen as a danger to a deployment because they might disrupt a continuous integration cycle.

James Bach suggested that the first could be countered strongly by turning the argument round and saying that the developers were not keeping up with the risks being introduced. The other important thing testers can do in this and many other situations is work hard on their credibility and build their reputation.

Joris Meerts gave an excellent presentation on what it means to be a good tester and ways we can know we are a good tester. Much of this focussed again on reputation, skill and influencing and impressing the right people.

This tied in very nicely to James Bach’s talk after lunch on how he built credibility by gaining a reputation as someone technical and also by upholding professionalism.

Next we had a report from Huib Schoots on the recruitment of testers and the things he looks for when he is hiring. For example, what are they doing in the wider testing community? Are they tweeting, blogging or writing articles relating to testing? It was suggested that interesting insights might be gained by asking candidates what is on their minds at the moment.

All in all, these are the lessons I have learned from the weekend:

  • The ability to do Systems Thinking is one of the most important skills for a tester to master;
  • Do not just strive to be good at something – go for excellence;
  • Think about the people involved in designing, building and using the systems we are working on;
  • Discussing testing with passionate people and getting to know them over a weekend is very valuable and rewarding for me personally; and
  • I need to spend more time reading and thinking about General Systems Thinking.

In conclusion I would like to thank the Dutch and Belgian testers – particularly the founding DEWTs – for inviting me to Holland to join their discussions. It was a privilege to get to know you all and gain some credibility amongst such a group. I hope you will consider inviting me again in the future!

SIGiST Conference 13/03/2013

14 March, 2013

Good session at the Special Interest Group in Software Testing (SIGiST) conference yesterday, run as usual by the BCS (British Computer Society), with a good representation of testers from across the different project lifecycles.

Matt Robson from Mastek opened with a keynote under the title “Be Agile or Do Agile” and gave some salutary warnings on the dangers of testing becoming an ‘ology’. It is very easy to become set in our ways and dogmatic about our approaches to testing, and that is harmful. To be ‘agile’ does not just mean that we adopt the Agile Manifesto (http://agilemanifesto.org/) and follow an agile approach to project management; it means we think and act in a way that embraces change and adapts to the situations we are in.

Very often we forget the ‘people’ side of software development. The example was given of a company where senior management turned round one day and said ‘we’re going to go agile and this is how you’re going to work in future’, but didn’t get the staff on board with them. The consequences for staff morale were horrendous and, as a result, software quality dipped.

One of the ways we can keep people on board is to think in terms of business goals and outcomes, because those have meaning for people. For instance, instead of saying ‘the registration widget looks broken; I advise against going live’, approach it more as ‘we have found instances where sales staff might not be able to register new customers on our system; I advise against going live’.

What was particularly good was that the talk was delivered with no PowerPoint slides, which concentrated the mind far more. I think presenting like this is something testers really have to get good at, but it is also something that can easily go horribly wrong.

Next we had a talk from George Wallace on the challenges of taking an R & D product straight into production. The project was for a very large and complex system being developed in a very traditional manner, with testing entering the fray late in the product’s lifecycle. Suffice it to say that testing was supposed to take 3 months and they are already 9.5 months in and still going!

Sakis Ladopoulos from Intrasoft International talked to us about what he termed project-agnostic test metrics for an independent test team. Essentially this was an attempt to measure the performance of testers working on projects, but to do so independently of how the whole project is performing. The approach was to normalise scores for whatever was being counted across all the projects the team was involved with. The ‘best tester’ was most often the one who had found the highest number of bugs relative to the time they had taken.
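The exact scoring scheme was not described in detail, so the following is only a minimal sketch of how such a normalisation might work, using invented data: work out bugs found per hour for each tester, then normalise within each project so that a tester on a bug-rich project is not automatically ranked above one on a stable project.

```python
# Hypothetical illustration of a "project-agnostic" tester metric, not the
# scheme presented: bugs per hour, normalised within each project.
from collections import defaultdict

# (project, tester, bugs_found, hours_spent) -- invented example data
records = [
    ("ProjectA", "Alice", 40, 80),
    ("ProjectA", "Bob",   25, 80),
    ("ProjectB", "Carol", 10, 40),
    ("ProjectB", "Dave",   6, 40),
]

# Raw rate: bugs found per hour of testing
rates = {(p, t): bugs / hours for p, t, bugs, hours in records}

# Normalise within each project (divide by the best rate on that project),
# so the comparison is not dominated by how buggy each project happens to be.
best_per_project = defaultdict(float)
for (project, tester), rate in rates.items():
    best_per_project[project] = max(best_per_project[project], rate)

scores = {(p, t): rate / best_per_project[p] for (p, t), rate in rates.items()}

for (project, tester), score in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(f"{tester} ({project}): normalised score {score:.2f}")
```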

I was quite uncomfortable with the idea of detaching testing from the rest of the project team but I am just not used to working like that.

After lunch Balaji Iyer from Mindtree gave a talk on website testing and the challenges faced by modern websites. In particular there was discussion around scripting challenges and how performance can be impacted by technologies such as Ajax (used extensively by Google), Flash (for instance YouTube) and JavaScript, which is often used for making sites look ‘pretty’.

Mindtree have a module currently in development that works with JMeter (a popular open source load and performance testing tool maintained under the Apache banner) to help testers parameterize their requests and correlate them.
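For anyone unfamiliar with the terms, the sketch below shows roughly what parameterising and correlating requests means in a load-test script. It is not the Mindtree module or JMeter itself, just a plain Python illustration against a hypothetical application with made-up endpoints and fields.

```python
import csv
import requests

BASE_URL = "http://example.test"  # hypothetical system under test

# Parameterisation: drive each virtual user with different input data,
# here read from a CSV file with 'username' and 'password' columns.
with open("users.csv", newline="") as f:
    users = list(csv.DictReader(f))

for user in users:
    session = requests.Session()

    # Log in with parameterised credentials rather than one hard-coded user.
    login = session.post(f"{BASE_URL}/login",
                         data={"username": user["username"],
                               "password": user["password"]})

    # Correlation: pull a dynamic value (a session token in this made-up API)
    # out of the response and feed it into the next request, instead of
    # replaying a stale value captured when the script was recorded.
    token = login.json()["token"]

    session.get(f"{BASE_URL}/account",
                headers={"Authorization": f"Bearer {token}"})
```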

Chris Comey and Davidson Devadoss from TSG (Testing Solutions Group) then gave a fascinating talk on test strategies and how we can write a strategy that looks great on paper but does not work at all in practice, because we have written it looking inward at testing and have not thought about dependencies on other parts of the business or how to deal with the issues that arise as a result.

It was a great talk because both test strategies used as examples were good strategies; they just weren’t the right ones for the job in hand. There is also little point in doing a Post Project Review if, as in the case of one of the projects, you are just going to type it up, stick it on a shelf somewhere and not learn the lessons. All the failings in one of the projects examined had been raised in a previous ‘lessons learned’ document. Perhaps it would have been better to call it a ‘Lessons NOT Learned’ document!

The closing keynote from Martin Mudge from Bugfinders.com was great, covering crowdsourcing services as a way to get testing done quicker and perhaps with greater coverage. In a piece of audience participation, three people each chose a different path they would take through a functionality diagram.

Testers from all over the world register with the service and are selected for projects based on their skills, experience and ability. Defects that are raised are all re-tested to verify that they are repeatable as recorded and are genuine issues. Testers receive training materials if there are problems with their testing.

This seems to be a particularly good way for small teams to quickly catch user interface issues, but it would certainly be dangerous to rely on crowdsourcing for deeper-level testing (and I can just see some companies getting the impression that that is the way to go)!

Overall a good conference and plenty to take away and think about.