The nine skills of exploratory testing

Exploratory testing is a learned skill, as I claimed in my previous post “Being intentional about exploratory testing”. In that post I mentioned the importance of two skills: noticing what there is to notice and deciding what to do next. Turns out it’s not the first time I mentioned that pair of skills. In a post about how to teach Agile, I quoted John Mason’s “Researching Your Own Practice, The Discipline of Noticing”:

“All professional development could be described as changes in sensitivity to notice and accumulation of alternative actions to initiate.” (p. 147)

That raises the question of whether the skills of exploratory testing can be made a little more specific. After giving it some thought, I came up with seven additional skills, making a total of nine. For some reason they ended up as questions rather than nouns. I like how that makes this post less of a checklist and more of a tool for self-reflection. Each skill could be its own blog post, so I’m going to focus on one key element of each skill.

Read more…

Being intentional about exploratory testing

This is the second post in a (to be) three-part series about my statement “The difference between a test case and a requirement is the moment of discovery.”

In the previous post I distinguished test cases that are translated requirements from ones that aren’t. This is something I learned from James Lyndsay. As he describes in “Why Exploration has a Place in any Strategy”:

Some tests are designed to find risks. They’re made on-the-fly and run once. Some are designed to tell us about retained value. They’re made once, and run forever after. You need both: they tell you different things.

The tests with a focus on value are based on requirements, on things we know we want; they are prescribed (as in: written before). The tests with a focus on risks are exploratory; they are based on our decisions in the moment, as we look for surprises and decide how we feel about those surprises.

One thing I’ve noticed through the years is that a lot more exploratory testing is happening than we give credit for. It’s hidden, a required but implicit part of the work. We do it, but we’re not intentional about it.

Today I want to argue that it pays to be more intentional about exploratory testing. Before I get there, however, I want to explain what exploratory testing is, because there are still plenty of misconceptions going around.

Read more…

The Fluxx ensemble exercise

Earlier this week I ran a full-day workshop at the excellent HUSTEF conference on working in an ensemble (aka mob programming/testing or software teaming). As part of the workshop I tried out a new exercise, in which participants were allowed to change the rules of the ensemble. The goal was to experience why the basic rules of ensembling are the way they are and what happens if they are different.

Since the participants really liked the exercise, I figured I’d write about it and name it: the Fluxx ensemble exercise. For those not familiar with Fluxx: it is a card game in which changing the rules is a key part of the game. It’s one of my favourite games.

Before I go into the exercise, though, I’ll first need to explain the basic rules of ensembling.

Read more…

What do you fix when you fix a test?

You ran the tests1 - or a pipeline did it for you - and some of them failed. Time to fix the tests! But what is it exactly that needs fixing?

There are quite a few things that might make a test fail:

  1. an issue with the build
  2. an issue with the pipeline (if that’s where the test runs)
  3. an issue in the environment the code under test is running on
  4. an issue in the environment the test code is running on
  5. a bug in the code under test
  6. a mistake in the test code
  7. a mistake in what the test should test

Arguably, only the last three describe a test that fails. The test did its job by detecting a problem. With the first four we didn’t even get that far: the issues prevented the test from doing its job. So in those cases, it’s not the test(s) as such that need fixing.

Read more…

The difference between a test case and a requirement is the moment of discovery

There are several straightforward ways to distinguish a test case from a requirement. A test case tells you how to check some kind of thing about the application, a requirement tells you that the application should do some kind of thing. A test case is written by a tester, a requirement by a business analyst. A test case takes the shape of an action and an evaluation of the result, a requirement takes the form of a sentence like “product ABC shall do XYZ.”1

A less straightforward, but more interesting way to distinguish a test case and a requirement, is this:

The difference between a test case and a requirement is the moment of discovery.2

In this post I want to explore the meaning of that statement. In the next post I’ll explore how looking at requirements and test cases in this way can help us do better testing. So this post will be a bit more philosophical, the next one more practical.

Read more…

Two short checklists for Scrum

checklist no.1

  • Do you add acceptance criteria and story points to each ticket before planning?
  • Do you have daily team meetings where people provide updates on their progress?
  • After each iteration, do you report to stakeholders what work was done and what will be planned next?

checklist no.2

  • Is the team protected during the sprint from stakeholders trying to interfere?
  • Is a sprint focused on achieving a goal, while how that goal is achieved is left sufficiently open?
  • Does the team address impediments as soon as they are discovered?

The difference between a dead and an alive Agile Manifesto

One of my favorite books on leadership is “Extreme Ownership” by Jocko Willink and Leif Babin. I can imagine some people bouncing off of the book because of the Navy SEAL angle, but to be honest I’m a bit of a sucker for the whole military leadership genre.

The second part of “Extreme Ownership” covers four critical leadership concepts, the “Laws of Combat”. Curiously enough, you can map these to the four values in the Agile Manifesto. These four concepts do come in a specific order, so you have to shuffle the Agile values around a little bit:

  • Cover and Move maps to customer collaboration over contract negotiation.
  • Simple maps to working software over comprehensive documentation.
  • Prioritize and Execute maps to responding to change over following a plan.
  • Decentralized Command maps to individuals and interactions over processes and tools.

To me this mapping is interesting in two ways. It sheds a different light on the four Agile values. And it’s an example of how I think we should be engaging with the Agile Manifesto, in a way that keeps it alive.

Read more…

Notes from the March ‘24 FroGS conf open space

Yesterday Elizabeth Zagroba, Huib Schoots, Sanne Visser and I ran another FroGS conf online open space. There were plenty of great sessions, below are some notes from the five sessions I participated in. Thank you to everyone who was there, I had a great time!

If you want to join one of our next FroGS conf events, head over to our site and subscribe to our newsletter.

From notes to shared documentation culture

  • Co-creation works for code. In what ways is co-creation for documentation different?
  • Why do we talk about “an audience” for documentation, instead of about “the contributors”?
  • The purpose of documentation flips from “what we build” to “what we built”.
  • Old research paper on documentation: only documentation with lasting usefulness is architecture and test cases. Everything else is just notes.
  • Documentation heuristic: Is it easier/faster to reverse engineer it instead?

Read more…

So you want to become a test engineer?

Becoming a test engineer these days is probably harder than it was for me back in 2006. Back then, there was no test automation, we worked in the slow rhythm of waterfall, and for years I was in a team with other testers or at least had a test manager to bounce ideas off. These days, there’s a good chance none of these are true as you start as a test engineer.

While most of these changes are good ones (please don’t take test automation or agile away), it does make me empathize with anyone who starts their career as a test engineer today. The pace is higher and the skill set is broader. More importantly, you need to navigate your career while no one is really sure where to position testers in their organization. That’s not a straightforward environment to start a career in.

So here are four pieces of advice I’d give myself if I were starting my career in testing today:

  • testing can be many different things
  • you’re a software engineer that specializes in testing
  • the end-game is leadership skills
  • shape your career in a way that suits you

Read more…

Tackling test automation in a new language

While there’s value in learning all the ins-and-outs of one particular language, its ecosystem and its testing libraries, I think there’s also a lot of value in having experience in several. Or at least, in two. If you only know one, you don’t really know what’s essential and what’s incidental to the one set of tools you know. You don’t know from experience in what ways things could be different.

Picking up a new language is not trivial though, especially if it’s your second one. There will be a lot to learn. You will notice similarities between the new language and the one(s) you already know. Sometimes those similarities will help you, sometimes they will mislead you.

Also, it’s more than picking up a new language. There are also the testing libraries you will use and the language’s ecosystem (e.g. how to install those libraries1 or how to set up a pre-commit hook with a linter). That’s quite a package.
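As one concrete instance of that “package”, here is a minimal sketch of a pre-commit hook with a linter, assuming a Python ecosystem and the ruff linter (both the tool choice and the version pin are illustrative, not something the post prescribes):

```yaml
# .pre-commit-config.yaml - run a linter on staged files before every commit
repos:
  - repo: https://github.com/astral-sh/ruff-pre-commit
    rev: v0.4.4        # illustrative version pin; pick a current release
    hooks:
      - id: ruff       # lints the staged Python files
```

With this file in the repository root, `pip install pre-commit` followed by `pre-commit install` wires the hook into git; in another language’s ecosystem both the install step and the config format would look different, which is exactly the kind of incidental knowledge the post is talking about.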

Read more…