
The Definition of Done

Scrum doesn’t officially recognise different roles within a team; in theory everyone’s working together to complete each story. In reality though, most teams I work with do assign particular roles: developer, tester, UX designer and so on. With different members doing work that is ‘finished’ at different times, how can the team, and the Product Owner, know when a story has truly been completed?

The answer is in two parts: the acceptance criteria for the individual story and the team’s Definition of Done.

For a Scrum team the aim of each sprint is to produce a potentially releasable increment of the product. So it’s important to know at the end of the sprint which features can actually be included in a release and which can’t. A shared understanding of which criteria a feature must satisfy to be releasable is essential if the team is going to work together towards a sprint goal effectively.

How to come up with a definition of done

‘That’s all very well’ you say, ‘but how are we supposed to come up with a single definition of done when our user stories are all so different?’

Which is a fair question. Going back to our criteria of what makes a good user story should give us some clues. A good user story, you’ll remember, is Independent, Negotiable, Valuable, Estimable, Small and Testable.

And those criteria hint at two key properties which should be embodied by ‘Done’ stories – they have to provide value and they have to meet a certain quality standard.

While the value that a particular product feature provides is more a call for the Product Owner to make, quality can certainly be enhanced by applying certain standards across all stories and making sure they meet these standards before they’re considered ‘Done’.

Let’s take the following as an example of a set of criteria for the Definition of Done:

  1. Code is peer-reviewed
  2. Code is deployed to test environment
  3. Feature is tested against acceptance criteria
  4. Feature passes regression testing
  5. Feature passes smoke test
  6. Feature is documented
  7. Feature ok-ed by UX designer
  8. Feature ok-ed by Product Owner

 

Each of these criteria is about ensuring quality in the development process and minimising the amount of work the team has to do going back and fixing things they didn’t get right the first time round.

The value part of the equation is really encapsulated by criteria 3 and 8. The acceptance criteria will vary for every story and, if well written, will ensure that the story delivers value. The Product Owner, acting as the value gatekeeper for the team, has the final say on whether a feature is sufficiently valuable to be considered ‘Done’.

As Mike Cohn says, you can think of the Definition of Done as an extra set of acceptance criteria that are rubber stamped onto each and every user story.
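
To make that ‘rubber stamp’ idea concrete, here’s a minimal sketch in Python (the story, its acceptance criteria and the helper names are all hypothetical, and the DoD criteria are simply the example list above) of how a story’s own acceptance criteria and the team-wide DoD combine into a single ‘Done’ check:

```python
from dataclasses import dataclass, field

# Team-wide Definition of Done: the same checklist is applied to every story,
# on top of that story's own acceptance criteria. (Taken from the example list above.)
DEFINITION_OF_DONE = [
    "Code is peer-reviewed",
    "Code is deployed to test environment",
    "Feature is tested against acceptance criteria",
    "Feature passes regression testing",
    "Feature passes smoke test",
    "Feature is documented",
    "Feature ok-ed by UX designer",
    "Feature ok-ed by Product Owner",
]

@dataclass
class UserStory:
    title: str
    acceptance_criteria: list[str]                    # story-specific, vary per story
    completed: set[str] = field(default_factory=set)  # criteria ticked off so far

    def outstanding(self) -> list[str]:
        """All criteria (story-specific plus team DoD) not yet satisfied."""
        required = self.acceptance_criteria + DEFINITION_OF_DONE
        return [criterion for criterion in required if criterion not in self.completed]

    def is_done(self) -> bool:
        """A story is 'Done' only when every criterion on both lists is met."""
        return not self.outstanding()

# Usage: a story can meet all of its own acceptance criteria and still not be Done,
# because the shared DoD checklist is still outstanding.
story = UserStory(
    title="User can reset their password",
    acceptance_criteria=["Reset email is sent within 60 seconds"],
)
story.completed.add("Reset email is sent within 60 seconds")
print(story.is_done())      # False
print(story.outstanding())  # ['Code is peer-reviewed', ...]
```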

The evolving Definition of Done

A team’s Definition of Done won’t remain the same throughout the lifetime of the project, nor should it. As a team becomes more effective and productive, and as they learn to work better together, they will naturally enhance and refine their Definition of Done to produce more valuable, better quality features. They will recognise patterns in the processes and procedures that are required to produce high quality features and start adding these to the DoD.

It’s therefore important that the team gets regular opportunities to revisit the definition of done and the natural place to do this is in the sprint retrospective meeting (facilitated by the ScrumMaster).

[Image: a mature Definition of Done, as described in the Scrum Guide]

The aim should be to stretch ‘Done’ over time – so that eventually your DoD takes you confidently all the way to releasing into production.

Potential DoD sticking points

A Definition of Done that no-one knows about is next to useless. It should be easy for all team members to refer to, so I’d recommend placing it on or near the team’s task board.

As noted above, the DoD should also be regularly reviewed and discussed. If a team that works well together isn’t getting many stories ‘Done’ in their sprints, it could be due to poor story writing, external dependencies or overly large stories – or it could be due to a defective Definition of Done.

I’ve seen definitions of done that include ‘Sign off’ steps from stakeholders. This shouldn’t happen. The DoD is primarily about product quality, not approval from external parties.

Stories that do not meet the DoD should not be shown in the sprint demo meeting. This helps reinforce the team’s commitment to getting stories done.

On a project where multiple Scrum teams are working on different features for a single product or platform, you may want a shared Definition of Done, since they’re all working to get features into the same release. However, it’s important that all of the teams sign up to the DoD if you take this approach.

The teams need to work out amongst themselves how to refine and develop the definition. This leads to greater consistency but it can slow down new teams that are set up within an organisation. It’s important to revisit the DoD as teams change so that everyone understands and agrees with it.

In the next post I’ll talk about another useful tool for quality control, the Definition of Ready.

Comments


  1. Mike says:

    You state “I’ve seen definitions of done that include ‘Sign off’ steps from stakeholders. This shouldn’t happen.”

    Yet a little later you include these steps in your sample DoD:

    – Feature ok-ed by UX designer
    – Feature ok-ed by Product Owner

    Do these not conflict with one another?

    • Jim Bowes says:

      Hey Mike, it’s a good point. I wasn’t seeing the UXer as a far-off stakeholder in this example. I was referring to times I’ve worked with teams where senior stakeholders, quite far from the project and process, were causing gridlock with stories not becoming ready or done. So I think it comes down to the nature of the OK – it depends on the circumstance and the reason for the review. If there’s a UXer in the team then they can contribute to the user story throughout the sprint, and the OK might just be a final check (perhaps against a global UX framework). For the tech team you might have ‘a pull request has been reviewed’, for example. Ultimately the PO’s view is the source of truth for the team – if the story meets the agreed acceptance criteria but it’s not quite what they meant, this either needs to be resolved through in-sprint conversation or by adding a new story to the backlog. Pragmatically, though, we know that in an organisational context there may be brand or UX guidelines, so generally teams and the PO need to work together to find a practical way of doing this without ruining the cadence of the project.

    • Tim says:

      I agree with Mike.
      UX and PO OK shouldn’t be part of the DoD.
      That’s just sign-off by another name. Whatever the PO or UX designer has to say about the story should be documented in either the story or the DoD from the outset.



