

Agile Metrics - Running Tested Features

In my previous blog, I spoke about the need for organizations to have an Agile Vision. In this blog, I will talk about the importance of metrics. Metrics mean different things to different people, organizations and cultures. However, the underlying focus of measurement is to verify the existence of completely working software that serves its intended purpose. Measurement follows the same empirical approach as Agile itself: demonstration of working software at the end of every iteration and Potentially Shippable Increment (PSI).

Metrics are powerful tools that help plan, inspect, adapt, and understand the progress of an organization over time. Several metrics contribute to the success rate of agile projects. I will not discuss each one of them, but I will identify the one metric that not only gives a detailed picture of project health, but also encourages higher productivity within the team. This metric is none other than Running Tested Features (RTF).

Running Tested Features

In layman's terms, RTF measures how many high-risk, high-business-value working features have been delivered and are ready for deployment. In other words, it measures software that works and that delivers the maximum number of features for every dollar invested. Let's do a small comparison: Waterfall versus Agile.

In projects running traditional waterfall, the RTF value would be zero for the first several months due to planning and analysis. This would be followed by work on infrastructure, framework and architecture; RTF would still be zero. Agile projects, however, are time-boxed to, say, 2 weeks. The output at the end of those 2 weeks is a set of one or more running tested features, delivered in descending order of customer priority. Analysis, design, development, testing and documentation (if necessary) are all activities performed within the iteration.

While both projects might finish at the same time, the agile project will have delivered more value much earlier than the waterfall project. Additionally, the agile project will have identified and mitigated project risks much earlier in the cycle, thereby keeping technical debt at a manageable level.

In terms of productivity, measuring RTF is a quick way to analyze the state of the team. Good agile teams should be able to deliver stories consistently over a period of time while absorbing unexpected challenges or risks.

So, what is RTF? 
  • The desired software is broken down into features/stories 
  • Each feature/story has one or more automated acceptance tests
  • When all of its acceptance tests pass, the feature/story is considered implemented as desired
  • Measure, at every moment in the project, how many features/stories pass all their acceptance tests
  • Measure how many customer-defined features/stories are known to be working (a minimal sketch of this count follows the list)
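
To make the measurement concrete, here is a minimal sketch, in Python, of how a team might count RTF at any given moment. The Feature structure and the example data are hypothetical; the only assumption is that each feature carries the pass/fail results of its automated acceptance tests.

    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class Feature:
        """A customer-defined feature/story with its automated acceptance test results."""
        name: str
        acceptance_tests_passed: List[bool] = field(default_factory=list)

        def is_running_tested(self) -> bool:
            # A feature counts toward RTF only if it has at least one
            # acceptance test and every one of them currently passes.
            return bool(self.acceptance_tests_passed) and all(self.acceptance_tests_passed)

    def running_tested_features(features: List[Feature]) -> int:
        """Count the features whose acceptance tests all pass right now."""
        return sum(1 for f in features if f.is_running_tested())

    # Hypothetical snapshot taken at the end of an iteration
    snapshot = [
        Feature("Search by order number", [True, True, True]),
        Feature("Export invoices to PDF", [True, False]),  # one test is failing
        Feature("Customer self-registration", []),          # no tests written yet
    ]
    print(running_tested_features(snapshot))  # -> 1

Running the same count at the end of every week yields the RTF series that the rest of this post plots and interprets.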
Components

It is not easy to measure RTF. There are many smaller components that impact RTF: defects, tests, cycle times, code coverage, etc. For success, some of these numbers need to go up, while others need to go down.
[Figure: components that influence RTF]
Running Tested Features should show steady, near-linear growth over the complete lifecycle of the project. The team needs to deliver and measure RTF every week.

Running: Features are packed into one integrated product 
Tested: Features are continuously passing all tests
Features: We mean real end-user features that are built based on customer inputs, not technical features 

RTF Curve
What should the RTF curve look like?
[Graph 1: steady linear RTF growth]
RTF Growth - Graph 1 shows linear growth of Running Tested Features in the project. The x-axis represents the days of the sprint; the y-axis represents the features completed and ready for deployment. The graph shows the project to be in a healthy state.
[Graph 2: RTF growth with mid-project dips]
RTF Growth - Graph 2 shows a project with unhealthy signs that is most likely in trouble. Early on, there is a sudden drop in RTF. This is the first indication that something has changed and broken the tests, or that requirements have changed and the tests have been updated to reflect them before the software has caught up. Later in the project there is another dip in RTF, for the same reasons.

RTF Growth - Graph 3 is typical of a project following a traditional waterfall process. There is no growth for the first several months.
[Graph 3: flat RTF in a waterfall project]
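
For readers who want to draw this curve for their own project, here is a minimal plotting sketch using matplotlib. The weekly numbers are made up purely to mimic the shapes of Graph 1 and Graph 2; substitute your own weekly RTF counts.

    import matplotlib.pyplot as plt

    # Hypothetical weekly RTF counts; the values are illustrative only
    weeks = list(range(1, 13))
    healthy = [2, 4, 6, 8, 10, 12, 14, 16, 18, 20, 22, 24]   # steady growth, as in Graph 1
    troubled = [2, 4, 6, 3, 5, 7, 9, 11, 6, 8, 10, 12]       # dips where changes broke tests, as in Graph 2

    plt.plot(weeks, healthy, marker="o", label="Healthy project")
    plt.plot(weeks, troubled, marker="x", label="Project with RTF dips")
    plt.xlabel("Week of the project")
    plt.ylabel("Running Tested Features")
    plt.title("RTF over time")
    plt.legend()
    plt.show()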
Ensuring steady growth of RTF 

In the words of Ron Jeffries, "To keep a single metric (RTF) looking good demands that a team become both agile and productive."

Factors that contribute to linear RTF growth include the following: 
  • Test Automation
  • Build Automation and Frequent Builds
  • Acceptance Test Automation
  • Avoid relying on manual "monkey clicking" in place of automated tests
  • Enforce Continuous Integration
  • Run tests on every commit to the source code management system
Some additional pointers that must be kept in mind (a brief acceptance-test sketch follows this list):
  • Run acceptance tests on implemented features immediately
  • Developers must write the unit tests
  • Feature-level tests are written by the testers
  • Testers may help developers evaluate their unit-test coverage
  • Developers may help testers write the difficult Feature Tests 
  • Metrics should not be used to compare teams or individuals
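
To make the split between unit tests and feature-level tests concrete, here is a minimal sketch of an automated acceptance test written with pytest. The order-placement feature and the tiny in-file place_order() implementation are hypothetical stand-ins, included only so the example runs on its own; the point is that each customer-defined story carries automated tests that can run on every commit.

    # test_place_order_feature.py -- a feature-level acceptance test (tester-owned)
    from dataclasses import dataclass

    import pytest

    @dataclass
    class Confirmation:
        order_id: str
        status: str

    def place_order(customer_id: int, items: list) -> Confirmation:
        # Stand-in for the real production code, kept in-file so the example is self-contained.
        if not items:
            raise ValueError("an order must contain at least one item")
        return Confirmation(order_id=f"ORD-{customer_id}-1", status="CONFIRMED")

    def test_customer_can_place_an_order_and_receive_a_confirmation():
        # Acceptance criterion agreed with the customer:
        # placing a valid order returns a confirmation with an order id.
        confirmation = place_order(customer_id=42, items=["SKU-123"])
        assert confirmation.order_id
        assert confirmation.status == "CONFIRMED"

    def test_an_order_with_no_items_is_rejected():
        # Acceptance criterion: an empty order must not be accepted.
        with pytest.raises(ValueError):
            place_order(customer_id=42, items=[])

Wired into the continuous integration build and run on every commit, tests like these are what keep a feature counted in RTF; the moment one fails, the feature drops out of the metric and the dip shows up on the chart.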

Conclusion
For organizations wanting to measure the progress of their agile projects accurately and regularly, there is hardly anything more comprehensive than the Running Tested Features (RTF) metric. It will not only help measure project success but also help create self-contained teams fully capable of organizing themselves for higher productivity.
