The KPIs of Agile:

Call out what is done:

  • Definition of Done and conformance to it – How strict is the team’s Definition of Done, and what is the trend: does it become stricter over time, stabilize, or even loosen? As a guide, I recommend crafting a Definition of Done according to this blog post by Jeff Sutherland.

  1. Technical Debt Management – the known problems and issues still present in what is delivered at the end of the sprint. It is usually measured by the number of open bugs, but may also cover missing deliverables such as training material, user documentation and delivery media.
  2. Team Velocity – the consistency of the team’s estimates from sprint to sprint. Calculate it by comparing the story points completed in the current sprint with the points completed in the previous sprint; aim for a variance within +/- 10 percent.
  3. The predictability (velocity average deviation) of each team: This metric shows how stable or steady your team’s velocity is. Remember that two teams can both have a velocity of 20 points while one consistently delivers 20 points per sprint and the other delivers 10 points in some sprints and 30 in others. (A minimal calculation for this and the previous metric is sketched in code after this list.)
  4. The user stories’ average lead time (in days or hours) for each team – how long it takes a user story to move from “in progress” to its final status (usually “accepted”) on your team’s physical or virtual board (see the timing sketch after this list).
  5. Quality Delivered to Customers – Are we building the product the customer needs? Does every sprint provide value to customers and become a potentially releasable piece of the product? It’s not necessarily a product ready to release but rather a work in progress, designed to solicit customer comments, opinions and suggestions. This is best measured by surveying customers and stakeholders. Do defects go up as velocity increases?
  6. Team Enthusiasm – a major component of a successful scrum team. If teammates aren’t enthusiastic, no process or methodology will help. Enthusiasm can be measured by observing the various sprint meetings or, most straightforwardly, by simply asking team members “Do you feel happy?” and “How motivated do you feel?”
  7. Retrospective Process Improvement – the scrum team’s ability to revise its development process to make it more effective and enjoyable for the next sprint. This can be measured by counting the retrospective items identified, the items the team committed to addressing and the items resolved by the end of the sprint.
  8. Risk of finishing late – do stories close steadily during the sprint, or does the burndown stay flat (or even increase) until the very end?
  9. Communication – how well the team, product owner, scrum master, customers and stakeholders are conducting open and honest communications. Through observing and listening you will get indications and clues about how well everyone is communicating.
  10. Team’s Adherence to Scrum Rules and Engineering Practices – Although scrum doesn’t prescribe engineering practices (unlike XP), most companies define several of their own for their projects. You want to ensure that the scrum team follows the rules your company defines. This can be measured by counting the infractions that occur during each sprint.
  11. Team’s Understanding of Sprint Scope and Goal – a subjective measure of how well the customer, product team and development team understand and focus on the sprint stories and goal. The goal is usually aligned with the intended customer value to be delivered and is defined in the acceptance criteria of the stories. This is best determined through day-to-day contact and interaction with the team and customer feedback.
  12. Story cycle time – the average number of days user stories spend between “committed” and “done,” tracked sprint by sprint. Benefit: Shows how quickly the team completes user stories (see the timing sketch after this list).
  13. Estimation accuracy: Aggregate story cycle time into story point buckets based on each completed user story’s estimate, shown over time (monthly). Benefit: Shows how long it really takes to complete a user story versus what was estimated.
  14. Defects per story point: The ratio of defects to velocity per sprint. Benefit: Shows the team’s average quality per story point and can also help gauge risk (this and the previous metric are sketched in code after this list).
  15. Test cases run: The number of test cases executed per sprint, broken down by test type (manual/automated). This can be shown against the sprint’s total number of defects to gauge the test cases’ effectiveness. Benefit: Shows how much testing is being performed during each sprint.
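
The two velocity metrics (items 2 and 3) reduce to simple arithmetic once you have each sprint’s completed story points. Here is a minimal Python sketch; the sprint numbers and the +/- 10 percent threshold are illustrative, not pulled from any particular tool:

```python
from statistics import mean, stdev

def velocity_consistency(previous_points: float, current_points: float) -> float:
    """Item 2: percent change in completed story points versus the previous sprint.
    A result within +/- 10 percent suggests consistent estimates."""
    return (current_points - previous_points) / previous_points * 100

def velocity_predictability(points_per_sprint: list) -> float:
    """Item 3: coefficient of variation of velocity (standard deviation / mean).
    Lower is steadier; two teams can both average 20 points yet differ widely here."""
    return stdev(points_per_sprint) / mean(points_per_sprint)

# Illustrative sprint histories (story points completed per sprint).
steady_team = [20, 19, 21, 20, 20]
swingy_team = [10, 30, 12, 28, 20]

print(velocity_consistency(steady_team[-2], steady_team[-1]))  # 0.0 -> within +/- 10%
print(round(velocity_predictability(steady_team), 2))          # ~0.04
print(round(velocity_predictability(swingy_team), 2))          # ~0.45
```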
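
Lead time (item 4) and story cycle time (item 12) are both averages of the time a story spends between two board states. A rough sketch, assuming you can export per-story timestamps from your physical or virtual board; the field names and dates below are invented for illustration:

```python
from datetime import datetime
from statistics import mean

# Hypothetical export from the board: when each story entered "in progress",
# when it was committed, and when it reached its final "accepted" state.
stories = [
    {"in_progress": "2024-03-04", "committed": "2024-03-04", "accepted": "2024-03-08"},
    {"in_progress": "2024-03-05", "committed": "2024-03-06", "accepted": "2024-03-12"},
    {"in_progress": "2024-03-06", "committed": "2024-03-07", "accepted": "2024-03-11"},
]

def days_between(start: str, end: str) -> float:
    return (datetime.fromisoformat(end) - datetime.fromisoformat(start)).days

# Item 4: average lead time from "in progress" to "accepted".
lead_time = mean(days_between(s["in_progress"], s["accepted"]) for s in stories)

# Item 12: average cycle time from "committed" to "accepted" (done).
cycle_time = mean(days_between(s["committed"], s["accepted"]) for s in stories)

print(f"average lead time:  {lead_time:.1f} days")   # 5.3 days
print(f"average cycle time: {cycle_time:.1f} days")  # 4.7 days
```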
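
Estimation accuracy (item 13) and defects per story point (item 14) can likewise be derived from basic sprint data. A small sketch with invented sample numbers:

```python
from collections import defaultdict
from statistics import mean

# Hypothetical completed stories: estimate in points plus measured cycle time in days.
completed = [
    {"points": 1, "cycle_days": 1.0},
    {"points": 2, "cycle_days": 2.5},
    {"points": 2, "cycle_days": 1.5},
    {"points": 5, "cycle_days": 6.0},
    {"points": 5, "cycle_days": 9.0},
]

# Item 13: estimation accuracy -- average cycle time per story point bucket.
buckets = defaultdict(list)
for story in completed:
    buckets[story["points"]].append(story["cycle_days"])
for points, days in sorted(buckets.items()):
    print(f"{points}-point stories: {mean(days):.1f} days on average")

# Item 14: defects per story point for the sprint.
defects_found = 6
velocity = sum(story["points"] for story in completed)  # 15 points completed
print(f"defects per story point: {defects_found / velocity:.2f}")  # 0.40
```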
