Anyone familiar with the Scrum methodology will know the value of a good Definition of Done (DoD), but there is often an opinion amongst developers that their jobs would be a lot simpler if only all those other people did theirs well. That perception is common in any situation where tribes of people are involved, but I'll leave that for the sociologists to debate.
This article's focus is on an equivalent to the DoD that applies before the development sprint begins.
Definition of Done
The Definition of Done is a commonly understood checklist of the conditions under which a story is considered finished, from a development perspective. The same list applies to any story. It does not imply that the story has reached the production environment and is live, only that it is expected to be, subject to further higher-environment testing.
A story must satisfy all of the items below to be considered Done:
- The story was understood by the affected teams;
- Unit tests were written, completed and executed successfully;
- All coding activities are complete;
- All analytics changes or additions were included;
- All acceptance criteria were met;
- Zero code smells exist;
- Continuous Integration test execution revealed no errors;
- A peer review of a pull request revealed no issues;
- Functional tests were written and passed without error;
- Non-functional requirements were met;
- OWASP checking revealed no issues;
- Any necessary mid-sprint design changes were included;
- The relevant feature branch was closed;
- The feature was included in a release package; and
- The Product Owner accepted the user story.
Of course, some of the above may differ in your organisation - I have tried to present a typical set for the web and mobile projects I've run.
Definition of Ready
A good Definition of Ready would be:
- Everyone involved understands what the story is and why it is needed;
- The story was written as a user story;
- Acceptance criteria exist;
- Behaviour Driven Development scenarios exist that reflect the acceptance criteria;
- Where there are any UI elements included in the story, designs are provided;
- Designs of all related architectural elements are complete;
- The team understands how to demonstrate the feature; and
- The story was estimated by the team.
Let's address each element in turn.
Everyone involved understands what the story is and why it is needed
One of the main points of story definition is to define a feature or component in a clear, unambiguous way. Whilst Agile's short iteration cycle reduces the impact of the awkward "that isn't what I wanted" delivery, it is still possible to spend two weeks working on the wrong thing if the definition is unclear.

In my projects we have a number of review points before a sprint begins that try to reduce this risk, including three amigos sessions, feature briefings and look-ahead meetings. The nature and scope of these meetings depends on the circumstances and the complexity of the work, but I would advise at least holding the three amigos meeting, so that each story receives enough scrutiny to avoid ambiguity as far as is practical. Note that the "three" can sometimes be more, if the team spans a range of delivery platforms (e.g. web and mobile).
Include the story's context. It should be atomic, yes, but it doesn't exist in a vacuum. Where does it sit? Of what does it form part? What does it enable? What does it rely on?
The story was written as a user story
While it is tempting to skip the "benefits" part of the "As a <user> I want <feature> so that <benefit(s)>" pattern, it is that part that justifies the story's inclusion. For example: "As a returning customer, I want to save my delivery address so that I can check out more quickly."

Always address the justification for every story. If you can't, question whether the story is valuable. Justification should cover the scope, the impact on the user base, the demand for the feature and the financial implications of doing the work and of not doing it. The last point is critical. I'm sure you have your own examples of a determined Product Manager pushing a story that they think will be of benefit, without doing their homework to prove it. Development time costs money, so spend it wisely.
Acceptance criteria exist
Business analysts and product owners who come from a waterfall background are used to writing extensive functional specifications. Stories are much more atomic than the long documents of old, but should include acceptance criteria where relevant. Whether these are formal or simply in note form to supplement the story is determined by the needs of the project, but they must be light. They do not replace the story; they are not the "description" of the story (i.e. what the badly-worded story title "really meant"); they are not the "part that you really need to read". If any of those are true, rewrite your story title.

I prefer to use acceptance criteria in note form to annotate the story, with real-world examples where I can, to aid understanding. You shouldn't need to go into a lot of detail if your BDD section is extensive.
Behaviour Driven Development scenarios exist
Behaviour driven development (BDD) scenarios are effectively real-world test cases designed to detail and prove the acceptance criteria.
REF <incremental reference number within story>
TITLE <Why do we need this case? What are we testing?>
GIVEN <pre-condition 1 exists>
[AND <pre-condition x exists>]
WHEN <action 1 happens>
[AND <action y happens>]
THEN <result 1 must happen>
[AND <result z must happen>]
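As a sketch of how the template above translates into executable checks, here is a minimal pair of scenarios expressed as plain Python tests. The login domain, function names and error message are illustrative assumptions of mine, not part of the article:

```python
# A hypothetical story: "As a registered user, I want to log in so that
# I can access my account." Two BDD scenarios, one positive, one negative.

class Account:
    """GIVEN: a registered user exists with these credentials."""
    def __init__(self, username, password):
        self.username = username
        self.password = password

def attempt_login(account, username, password):
    """WHEN: the user submits credentials.
    THEN: return (success, error_message)."""
    if username == account.username and password == account.password:
        return (True, None)
    return (False, "Invalid username or password")

# REF 1 - positive scenario: valid credentials log the user in.
def test_valid_login():
    account = Account("alice", "s3cret")                    # GIVEN
    ok, error = attempt_login(account, "alice", "s3cret")   # WHEN
    assert ok and error is None                             # THEN

# REF 2 - negative scenario: a wrong password produces an error message.
def test_wrong_password():
    account = Account("alice", "s3cret")                    # GIVEN
    ok, error = attempt_login(account, "alice", "wrong")    # WHEN
    assert not ok and error == "Invalid username or password"  # THEN

test_valid_login()
test_wrong_password()
```

The point is that each GIVEN maps to test setup, each WHEN to an action, and each THEN to an assertion, whichever BDD tooling you actually use.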
Always include both positive and negative scenarios (what should happen and what error messages appear when it doesn't). In addition, consider including examples to illustrate more complex cases. That will give your developers and testers the best chance of meeting the intended acceptance criteria.
If you're constrained for analysis resource or time, consider having the testers write the BDD scenarios, but always make sure that the product owners or business analysts check and confirm their definition before proceeding. That also confirms that the testers have understood the story correctly.
Don't be afraid to add more scenarios later as people identify them, but try to be as comprehensive as possible in the initial definition. These cases will help the team to estimate the work, and a long list of BDD scenarios suggests that the story is too complex and needs to be split into smaller stories.
Where there are any UI elements included in the story, designs are provided
I'm not suggesting that every UI component needs to be signed off before any development work can start. If the project was started well, there is already a component pattern library or design guide for the various types of UI element (and if there isn't, create one now!). UX testing of those elements should be done early in the project, to check that they work in practice.

Beyond that point, individual usage should be little more than the type of element and its configuration for that specific case. For example, yes, it's another drop-down list, but what values does it contain? How are they sorted? Where does it appear on the page? Does it have any unusual properties (e.g. does it only appear if the user selected value 1 in the previous field)? In particular, complex UI designs must be included, to avoid ambiguity.
Some of you will argue that UI elements can be determined as the sprint progresses, and you're right, but if they are known up front, why not include them in the story immediately? Many people think visually, and having an image to see and discuss is a powerful way to get the meaning of a story across and to assist with estimation.
You don't need to go into too much detail to get a feel for the UX. I've worked with customers who use a quick paper sketch and others who expect a fully rendered final look and feel. If possible, go for the former. Information over art.
Designs of all related architectural elements are complete
The stage your development has reached will determine the volume of work to be done here, but the developers will need to know what architecture the story's solution must sit on. For established products this will be stable, but if the story forms the basis of a new feature it may still need architectural elaboration. For brand new products, the architectural design is a phase in itself and should be sized separately.

Even projects that use a "create the architecture as we go along" approach need some principles to be established early on, so that the team knows the frame within which it can operate. For example: what tech stack? When and how do we create a new service? What are the non-functional requirements?
The team understands how to demonstrate the feature
At the end of a development sprint, there should be a demo to show that the work meets the specification. That is intuitive when there is a visual component, but how will the team demo a story that has no visual element (for example, a system-to-system interface)? In that case I'd use a test harness, with plenty of verbal explanation.

Even when there is a visual element, what examples will be used? Who will demo the work? How comprehensive should the demo be? Discuss these questions with the product owner beforehand, so that the necessary planning work can be included in the sprint. Too often, the demo is treated as an afterthought.
The story was estimated by the team
Armed with all of the information described above, the team should be able to estimate the story. The method and unit of estimation should be chosen by each team and applied consistently throughout the project. Having used story points with the best of intentions for a number of years, I now favour time-based estimation in hours. Story points work well in theory, but as a concept they are hard to grasp, and relative sizing is implicitly pointless (pardon the pun) when most people think of points in terms of how much time something will take anyway.
Of course, story points represent much more than just the working hours required to fulfil a task, but using an hour-based time estimate approach incorporates a lot of the same implicit elements. Avoid simply having the person with the lowest estimate do the work, however. Always think in terms of the team as a whole.
One issue I've found with story points is the use of the Fibonacci series for estimation. "A little bit bigger than an 8" becomes a 13 - a potential increase of 62.5% - when the work might only have grown by an hour. Hourly estimation avoids this inflation.
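A quick back-of-the-envelope check makes the inflation concrete: stepping up one Fibonacci bucket always forces a large percentage jump in the estimate, however small the extra work. The bucket values below are the usual estimation sequence:

```python
# Percentage jump implied by moving up one Fibonacci estimation bucket.
fib = [1, 2, 3, 5, 8, 13, 21]

for small, big in zip(fib, fib[1:]):
    jump = (big - small) / small * 100
    print(f"{small} -> {big}: +{jump:.1f}%")
# Every step after 3 -> 5 is a jump of roughly 60% or more;
# the 8 -> 13 step in particular is a 62.5% increase.
```

Nothing in between is expressible, which is exactly why "a little bit bigger than an 8" becomes a 13.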
If you're worried about estimating before all the facts are known, either push back or estimate in coarser units. An example might be to use days for the three amigos initial "finger in the air" but hours in the sprint planning meeting.
I don't have time for all that!
Stable product teams that have worked together for years gain an intuitive ability to understand the nuances expressed by team members. If your project lacks that stability, or if the team changes from time to time, you'll struggle to produce good-quality output quickly unless it is clear to the team what it is working towards. While it might be optimistic to think that people will "just get it", or that "failing fast" is a good thing, years of analysis practice suggest that it is more complex than that, and any time spent thinking before doing is valuable, if only to reduce the chance of failing at all.

You'll need to find a balance, and I find that applying at least the ideals of the above can be done quickly and effectively. One quick method is to create and use a pro-forma template. If all the sections contain at least some content, they enable discussion towards a common understanding. As with any approach, the more you put in, the more you'll get out, but that needs to be balanced against time pressures, and a form works well to set expectations and remind the author of the areas they need to consider. Overall, though, keep it light.