PM Methodologies

Definition of Done: Setting Quality Standards for Agile Teams

By Vact

The Definition of Done (DoD) is a shared agreement within a Scrum team that specifies the quality criteria an increment must meet before it is considered complete. Without a clear DoD, “done” becomes subjective — one developer’s “done” means coded and unit tested, while another’s means coded but untested. This ambiguity creates technical debt, surprises during sprint reviews, and unpredictable release quality.
Why the Definition of Done Matters

The DoD serves multiple purposes in Scrum:

Transparency. Everyone, including stakeholders, knows exactly what “done” means. This eliminates the 90%-done phenomenon where work appears complete but requires significant additional effort.

Quality consistency. Every increment meets the same standard regardless of which team member worked on it. This consistency enables predictable releases and sustainable velocity.

Velocity accuracy. Story points only count when items meet the DoD. If the team counts partially complete work as done, velocity becomes an unreliable metric and sprint planning commitments lose meaning.

Technical debt prevention. A rigorous DoD that includes testing, documentation, and code review prevents the accumulation of shortcuts that slow future development.

Creating a Definition of Done

Step 1: Gather Input

The DoD should be created collaboratively by the development team, Product Owner, and Scrum Master. Each perspective brings different concerns:

  • Developers focus on code quality, testing, and technical standards
  • The Product Owner focuses on acceptance criteria and user experience
  • The Scrum Master focuses on process compliance and sustainability

Step 2: Define Criteria

A typical DoD for a software team includes:

Code Standards

  • Code follows team style guide and passes linting
  • Code has been peer reviewed by at least one other developer
  • No known critical or high-severity bugs remain

Testing

  • Unit tests written and passing with minimum 80% coverage for new code
  • Integration tests written for API changes
  • Acceptance criteria verified manually or with automated tests
  • No regressions in existing test suite
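Criteria like the coverage threshold above are most effective when checked automatically rather than by hand. A minimal sketch in Python, assuming coverage.py's JSON report format (the `coverage json` command writes a `totals.percent_covered` field); the function name and the 80% default mirror this article's example and are not a standard API:

```python
def meets_coverage_gate(report: dict, threshold: float = 80.0) -> bool:
    """Check the DoD coverage criterion against a parsed coverage.py JSON report.

    `report` is the parsed output of `coverage json`; its
    `totals.percent_covered` field holds overall line coverage as a percentage.
    """
    return report["totals"]["percent_covered"] >= threshold

# A CI step could load coverage.json and fail the build when the gate is unmet.
sample_report = {"totals": {"percent_covered": 83.4}}
print(meets_coverage_gate(sample_report))  # True: 83.4% clears the 80% bar
```

Wiring a check like this into the build pipeline turns a DoD criterion from a promise into an enforced gate.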

Documentation

  • API documentation updated for public-facing changes
  • README or internal docs updated if setup steps change
  • Release notes drafted for user-facing changes

Deployment

  • Code merged to the main branch
  • Feature deployed to staging environment
  • Smoke tests passing on staging
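Because the DoD applies uniformly to every work item, some teams encode it as data so that unmet criteria can be listed automatically in their tooling. A minimal sketch, assuming a hypothetical checklist representation (the criterion names paraphrase the lists above; nothing here is a standard library or tool API):

```python
# Team-wide DoD: every work item is checked against the same list.
DEFINITION_OF_DONE = [
    "peer reviewed",
    "unit tests passing (>= 80% coverage)",
    "acceptance criteria verified",
    "docs updated",
    "merged to main",
    "deployed to staging",
    "smoke tests passing",
]

def unmet_criteria(checked: set) -> list:
    """Return the DoD criteria a work item has not yet satisfied."""
    return [c for c in DEFINITION_OF_DONE if c not in checked]

# A story that is coded and reviewed but nothing more is not "done":
story_state = {"peer reviewed", "unit tests passing (>= 80% coverage)"}
print(unmet_criteria(story_state))
```

Representing the DoD as a single shared list makes the "one team, one DoD" principle concrete: there is exactly one place to look, and every item is measured against it.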

Step 3: Validate Feasibility

Every criterion in the DoD must be achievable within a single sprint. If the team cannot both meet the DoD and deliver work each sprint, either the DoD is too ambitious or the stories are too large. Adjust iteratively: start with a DoD the team can consistently meet and raise the bar as practices mature.

DoD vs. Acceptance Criteria

These are related but different concepts:

Aspect      | Definition of Done        | Acceptance Criteria
----------- | ------------------------- | --------------------------
Scope       | Applies to all work items | Specific to one story
Who defines | Development team          | Product Owner
Focus       | Quality and process       | Functionality and behavior
Changes     | Evolves slowly            | Different per story

Acceptance criteria define what a story does. The DoD defines how it is delivered. A story is complete when it meets both its acceptance criteria and the team’s DoD.
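The relationship can be stated precisely: completeness is the conjunction of the story's own acceptance criteria and the team-wide DoD. A trivial sketch (the function name is illustrative, not from any framework):

```python
def is_complete(acceptance_criteria_met: bool, dod_met: bool) -> bool:
    """A story is done only when it satisfies BOTH its own acceptance
    criteria (what it does) and the team's DoD (how it is delivered)."""
    return acceptance_criteria_met and dod_met

# A feature that behaves as specified but is untested and undeployed is not done:
print(is_complete(True, False))  # False
```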

Evolving the Definition of Done

The DoD should evolve over time as the team matures. Common evolution patterns:

Initial DoD (new team):

  • Code reviewed
  • Unit tests passing
  • Acceptance criteria met
  • Deployed to staging

Mature DoD (established team):

  • Code reviewed by two developers
  • Unit and integration tests passing
  • Performance benchmarks met
  • Security scan clean
  • Accessibility standards met
  • Deployed to production with feature flag
  • Monitoring and alerts configured

Use retrospectives to discuss whether the DoD should be updated. If recurring issues suggest a gap in quality standards, add a criterion. If a criterion is consistently met without effort, it may no longer need explicit inclusion.

Common Pitfalls

No DoD at all. Teams without a written DoD have an implicit one, and it usually means “works on my machine.” Write it down, post it visibly, and reference it in sprint reviews.

DoD too lenient. A DoD that only requires “code compiled” sets a low bar that accumulates technical debt. The DoD should represent truly shippable quality.

DoD too strict. A DoD that requires full documentation, 100% test coverage, and production deployment for every story may be aspirational but impractical. If the team consistently cannot meet the DoD, they either need more capacity or a more realistic DoD.

Inconsistent enforcement. The DoD only works if it is applied consistently. If the team makes exceptions under deadline pressure, the DoD loses credibility. When deadlines are tight, the correct response is to reduce scope, not reduce quality.

Different DoDs for different stories. Some teams create a lighter DoD for bug fixes or technical tasks. This creates confusion about what “done” means and incentivizes categorizing work to avoid rigorous standards. One team, one DoD.

Making the DoD Visible

Post the DoD where the team can see it daily — on the physical Kanban board, in the project management tool, or on the team’s wiki. Reference it during sprint reviews when demonstrating completed work. When a stakeholder asks “is this done?”, the team should be able to point to the DoD and verify each criterion.