mheusser: “So @chrisg0911 want to attend WHOSE?” That’s how it all started: a tweet from Matt Heusser that led to one of the most intense, challenging, yet fun and enjoyable events of my professional life so far.
WHOSE: WorksHop On Self Education in software testing. This was an Association for Software Testing (AST) workshop organised by Matt Heusser. We were charged with the task of creating a ‘skills inventory’ for the AST, to be used by testers for their personal development. Matt likened the inventory to a homeschool curriculum: a set of topics with resources from which learners can pick and choose.
I had never attended a peer workshop before, and I really did not know what to expect. Frankly, the prospect of being in a room with a group of super-good independent testers was somewhat intimidating.
So on Thursday 7th December, 14 participants congregated at Hyland Software in Westlake, Ohio. This was to be our place of work for the next 2½ days.
Meeting, greeting and reassurance
We started the day with round-the-room introductions, and I was pleasantly surprised that there were only 4 independents. Most of the others were permanent testers in regular companies. A couple of the guys were in similar situations to me with regard to the problems they faced at work, the number of testers, and so on. This was all very comforting. The rest of the team were Jon Hagar, Peter Walen, Erik Davis, Nick Stefanski, Simon Peter Schrijver, Robert Sabourin, Justin Rohrman, Jeremy Carey Dressler, Alessandra Moreira, Jess Lancaster, David Hoppe and Doug Hoffman.
After the introductions, Jon Hagar gave a presentation covering existing inventories such as various ISO and IEEE standards, SWEBOK (the Software Engineering Body of Knowledge) and the ISTQB syllabus. Jon pointed out several similarities between them, including that they all cover some form of fundamental skills. This information was useful in a few ways, but mostly it served as an example of what we did not want this inventory to be: static, theoretical, process-driven and out of date.
What’s in a skill?
Following this, Matt drove a discussion around what we defined as a skill. We came to a consensus that a skill (for our purposes) must be defined, specific, isolated, demonstrable and something that can be improved upon. This definition was to play a large part throughout the workshop.
At this point it was time to do some work. As a group we brainstormed a bunch of skills on index cards, then peer reviewed them, explaining our rationale behind choosing them.
All of the cards were then laid out, and there were quite a lot. We swarmed on the next activity, which was to try to group similar skills. This took a considerable length of time to complete. It was not always obvious what some of the skills were, and some of the cards weren’t strictly skills either.
The task for the remainder of the afternoon was to divide into groups, take a set of skills, and start writing definitions for them, with links to reference material, in our wiki. This was an interesting experience as it was the first time many of us had worked together. Some groups functioned better at this than others.
By the end of the day, we had generated several skill entries in the wiki, at which point we presented what we had created. The quality of the entries was mixed, but generally it seemed to go well.
Putting it into context
I woke up early on Friday morning, and just wasn’t feeling it. After spending a lot of the previous night thinking about where we were heading with this repository of skills, it just didn’t float my boat. I didn’t feel that I could go back to work in the UK and tell my boss that I was proud of the work we did. This continued over breakfast, where I found that I was not the only one with this feeling.
Thankfully, this seemed to be a universal feeling, and Matt confronted it once we got to Hyland. It turned out the thing we were missing was something we hold very dear to how we test: context!
This really struck a chord. I’d been of the opinion from the start that this set of skills needed to be usable by real people in real jobs in real companies with real time pressures. Adding ‘stories’ to the skills puts them into context for the reader. It gives an example of why we used that skill, how, and what the outcomes were. This was real.
So, in true lean style, we failed fast and started again. We spent the rest of the day in small groups, pairs or individually, writing skills entries where we had a story to tell. This worked really well, and having the entries peer reviewed proved particularly valuable, especially for those of us who are not as proficient at writing.
By the end of the day, everybody (I think without exception) was mentally spent!
*The WHOSE team hard at work*
We started Saturday where we had left off, with more skills work, but we quickly decided that the remaining half day we had together was better spent deciding where to go from here. We pretty much all agreed to commit to working on this moving forward, which was really encouraging.
It certainly sounded like everyone was really happy with the work we’d done, and for me personally I felt I could return to work proud of what we’d achieved. On another personal note, I met some great people at the workshop, and had some really interesting conversations. I’m sure I’ll keep in touch with a few of them. My first experience of a peer workshop was really positive and I hope it is the first of many!
It was clear that this was a mammoth task, and one we wouldn’t be able to finish in its entirety in 2½ days. However, what we did achieve was to try something, evaluate it, pivot to something better, and settle on a format that really worked for us.
Context is key to everything, and it is what will set this repository of skills apart from the others. To make it useful to regular testers on the ground, it has to speak to their current situation: if they can relate to one of the stories, that in turn encourages them to learn the skill.
Work hasn’t stopped since the workshop, and there is plenty still to do. We have yet to decide on the final delivery mechanism for the repository, but the intention is to make it available some time in early to mid 2014. Look out for updates from the AST.