How to Verify the Results of an Agile Sprint

By Mark C. Layton

The goal of every agile sprint is to produce a usable product. Because agile was developed as a model for software development, that usable product is most often working code. Verifying the work done in a sprint has three parts: automated testing, peer review, and product owner review.

Using automated testing to verify agile sprint results

Automated testing means using a computer program to do the majority of your code testing for you. Automated tests let the development team develop and test code quickly, which is a big benefit on agile projects.

Often, agile project teams code during the day and let the tests run overnight. In the morning, the project team can review the bug report the testing program generated, report on any problems during the daily scrum, and correct those issues immediately during the day.
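As a minimal sketch of that overnight workflow (assuming a Python project whose tests use the standard `unittest` module; `apply_discount`, `run_nightly`, and the report filename are illustrative names, not from the book), a scheduled job might run the suite and save a report for the team to review in the morning:

```python
import io
import unittest

# Example component under test; on a real project the suite would be
# discovered from the team's test directory instead of defined inline.
def apply_discount(price, percent):
    """Return price reduced by the given percentage, rounded to cents."""
    return round(price * (1 - percent / 100), 2)

class DiscountTests(unittest.TestCase):
    def test_ten_percent_off(self):
        self.assertEqual(apply_discount(100.0, 10), 90.0)

    def test_zero_discount(self):
        self.assertEqual(apply_discount(19.99, 0), 19.99)

def run_nightly(report_path="nightly_report.txt"):
    """Run the suite and write the results where the team can read them."""
    buffer = io.StringIO()
    suite = unittest.defaultTestLoader.loadTestsFromTestCase(DiscountTests)
    result = unittest.TextTestRunner(stream=buffer, verbosity=2).run(suite)
    with open(report_path, "w") as report:
        report.write(buffer.getvalue())
    return result.wasSuccessful()

if __name__ == "__main__":
    run_nightly()
```

A scheduler (cron, a CI server, and so on) would invoke a script like this each night; the morning's daily scrum then starts from the saved report rather than from a manual test pass.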

Automated testing can include

  • Unit testing: Testing source code in its smallest parts, at the component level.

  • System testing: Testing the code together with the rest of the system.

  • Static testing: Checking, without executing the code, that it meets the standards, rules, and best practices the development team has agreed upon.
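To make the static-testing idea concrete, here is a toy static check (an illustrative sketch, not a real linter; the docstring rule and function names are assumptions): it parses source code without running it and flags functions that break an agreed-upon standard.

```python
import ast

# Sample source to inspect; in practice this would be read from the
# team's source files.
SOURCE = '''
def add(a, b):
    """Return the sum of a and b."""
    return a + b

def subtract(a, b):
    return a - b
'''

def functions_missing_docstrings(source):
    """Parse the code without executing it and report offending functions."""
    tree = ast.parse(source)
    return [
        node.name
        for node in ast.walk(tree)
        if isinstance(node, ast.FunctionDef) and ast.get_docstring(node) is None
    ]

print(functions_missing_docstrings(SOURCE))  # prints ['subtract']
```

Real teams would use an established tool (a linter or static analyzer) for this; the point is that the check inspects the code's structure rather than its runtime behavior.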

Using peer review to verify agile sprint results

Peer review simply means that development team members review one another’s code. If Samuel writes program A and Joan writes program B, Samuel reviews Joan’s code, and vice versa. Objective peer review helps ensure code quality.

The development team can conduct peer reviews during development. Collocation helps make this easy — you can turn to the person next to you and ask him or her to take a quick look at your work. The development team can also set aside time during the day specifically for reviewing code. Self-managing teams should decide what works best for their specific team.

Using product owner review to verify agile sprint results

When a user story has been developed and tested, the product owner reviews the functionality and verifies that it meets the goals of the user story. The product owner verifies user stories throughout each day as they are completed.

Finally, the product owner should run through some checks to verify that the user story in question meets the definition of done. When a user story meets the definition of done, the product owner updates the task board by moving the user story from the Accept column to the Done column.