Acceptance Test-Driven Development and Test-Driven Development – How Are They the Same and How Are They Different?

April 5, 2012 — Posted by Ken Pugh

 

Although Acceptance Test-Driven Development (ATDD) and Test-Driven Development (TDD) both have the word "Test" in their names, the primary purpose of neither is testing. ATDD revolves around the customer, developer, and tester creating detailed examples of requirements in order to understand and clarify those requirements. TDD involves analyzing the requirements of a portion of an implementation. The examples and the analysis evolve into what are commonly called tests, hence the common part of the name. Let's start with a simple example:

Acceptance Test-Driven Development

Suppose there was a user story that went, "As a user, I want to convert a temperature in Fahrenheit into Celsius." Suppose a Subject Matter Expert (SME) named Sam gave the formula to Dan, the developer. In order to ensure that Dan understood Sam correctly, Dan might ask Sam for some examples. Sam might give the common examples of:

Temperature Conversion Examples

| Fahrenheit | Celsius? | Notes                    |
| ---------- | -------- | ------------------------ |
| 32         | 0        | Freezing point for water |
| 212        | 100      | Boiling point for water  |

 

If Tom, someone with a testing focus, is sitting in on this conversation, he would be looking for other examples. He might ask Sam what should occur if the input temperature is less than absolute zero (-459.69 Fahrenheit). Sam could respond that the system should indicate an error. That yields two more examples:

Temperature Conversion Examples

| Fahrenheit | Celsius? | Notes               |
| ---------- | -------- | ------------------- |
| -459.69    | -273.16  | Absolute zero       |
| -459.70    | Error    | Below absolute zero |

 

Tom or Dan might come up with a few more examples that help clarify the requirements. They might ask Sam whether the following examples show the desired output for some other temperatures.

Temperature Conversion Examples

| Fahrenheit | Celsius?               | Notes |
| ---------- | ---------------------- | ----- |
| 33         | 0.5555555555555555555  |       |
| 31         | -0.5555555555555555555 |       |

Sam might respond that only two decimal digits should be shown in the output, with the values rounded up (and down for negative numbers). The output should therefore be as follows:

Temperature Conversion Examples

| Fahrenheit | Celsius? | Notes      |
| ---------- | -------- | ---------- |
| 33         | 0.56     | Round up   |
| 31         | -0.56    | Round down |

 

Now the full set of examples looks like the following. (Some readers may come up with others.)

Temperature Conversion Examples

| Fahrenheit | Celsius? | Notes                    |
| ---------- | -------- | ------------------------ |
| 32         | 0        | Freezing point for water |
| 212        | 100      | Boiling point for water  |
| -459.69    | -273.16  | Absolute zero            |
| -459.70    | Error    | Below absolute zero      |
| 33         | 0.56     | Round up                 |
| 31         | -0.56    | Round down               |

 

The details of the requirements have been explored by coming up with examples. That is the purpose of ATDD. As a byproduct, the examples can be used as tests against the implementation to ensure that it meets the stated requirements. Thus the examples become acceptance tests.
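One detail the examples pinned down is Sam's rounding rule: round up, and down for negative numbers, which is "ties away from zero". As a sketch, that rule maps directly onto the ROUND_HALF_UP mode of Python's decimal module (the helper name below is an illustrative assumption, not something from the story):

```python
from decimal import Decimal, ROUND_HALF_UP

def round_celsius(value):
    # ROUND_HALF_UP sends ties away from zero, matching Sam's rule:
    # positive values round up, negative values round down.
    return Decimal(value).quantize(Decimal("0.01"), rounding=ROUND_HALF_UP)

print(round_celsius("0.5555555555555555555"))   # 0.56
print(round_celsius("-0.5555555555555555555"))  # -0.56
```

Using decimal here avoids any binary floating-point surprises at exact halves, which is where rounding rules actually differ.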

Acceptance tests do not have to be automated. These six tests could be checked manually. However, another benefit of having acceptance tests is that they can be used as regression tests, in which case they should be automated.
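As a sketch of what such automation could look like, the six examples can be encoded as a table-driven check against a hypothetical implementation. The function and constant names here are assumptions for illustration, not from the article:

```python
ABSOLUTE_ZERO_F = -459.69  # value used in the examples above

def convert_f_to_c(fahrenheit):
    """Convert Fahrenheit to Celsius, rounded to two decimal places."""
    if fahrenheit < ABSOLUTE_ZERO_F:
        raise ValueError("temperature below absolute zero")
    # Python's round() sends exact ties to the even digit; none of the
    # examples hit an exact tie, so it matches Sam's rule for them.
    return round((fahrenheit - 32) * 5 / 9, 2)

# The acceptance examples, straight from the table.
EXAMPLES = [
    (32, 0.0),           # freezing point for water
    (212, 100.0),        # boiling point for water
    (-459.69, -273.16),  # absolute zero
    (33, 0.56),          # round up
    (31, -0.56),         # round down
]

for fahrenheit, expected in EXAMPLES:
    assert convert_f_to_c(fahrenheit) == expected, (fahrenheit, expected)

# The sixth example: below absolute zero must signal an error.
try:
    convert_f_to_c(-459.70)
    raise AssertionError("expected an error for -459.70")
except ValueError:
    pass
```

Run as a script, this either passes silently or fails loudly, which is exactly what a regression suite needs.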

Ideally all acceptance tests for a requirement should be created before implementation commences. That way, the entire requirement is clarified. Pragmatically that is often not possible. However, before being "done", an implementation should pass all the tests that are created at that point. Any acceptance tests that are added after the "done" point actually define new requirement details.

How Acceptance Tests and Unit Tests Stack Up

 

Gerard Meszaros has a diagram of types of tests. The left-hand side of the diagram looks like this:

Business Facing

Acceptance Tests – created by customer/developer/tester

Component Tests – created by architect

Unit Tests – created by developer

Technology Facing

Unit tests are created by the developer. The values in them are often derived from the acceptance tests, since they test portions of a system that implement the requirements. But they are usually much finer grained.

Test-Driven Development

In TDD, unit tests are created concurrently with the implementation. Each unit test specifies a portion of the behavior required by a module. In the temperature example, some of the tests might be named:

TestConvertFahrenheitToCelsiusAtFreezingPoint

TestConvertFahrenheitToCelsiusAtBoilingPoint

TestConversionThrowsExceptionWhenFahrenheitIsBelowAbsoluteZero

TestConversionRoundsUpWhenCelsiusIsAboveZero

TestConversionRoundsDownWhenCelsiusIsBelowZero

In creating the implementation, the developer Dan starts by creating one of the tests. Dan sees that it fails (since the underlying code has not yet been written), and then writes code to make it pass. Dan then repeats these steps with the next test.
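A minimal sketch of that first cycle, using Python's unittest module (the function name convert_f_to_c is an illustrative assumption):

```python
import unittest

# Step 1: the test is written first.  Run it before convert_f_to_c
# below exists and it fails -- the "red" step.
class TestTemperatureConversion(unittest.TestCase):
    def test_convert_fahrenheit_to_celsius_at_freezing_point(self):
        self.assertEqual(convert_f_to_c(32), 0)

# Step 2: write just enough code to make the test pass -- "green".
def convert_f_to_c(fahrenheit):
    return round((fahrenheit - 32) * 5 / 9, 2)

# Run the test to confirm the cycle closed.
suite = unittest.defaultTestLoader.loadTestsFromTestCase(TestTemperatureConversion)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```

Dan would then pick the next test from the list above and repeat.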

As Dan goes through this cycle, he may recognize that the design of the implementation could be improved. The tests are an indication of the quality of the code. If there are many tests against a single module, the module may be doing too much and therefore be hard to maintain. In this example, Dan might code a single method called ConvertFahrenheitToCelsius() and put all the logic to check for absolute zero and round the result in it. That might make the method too complicated, so he could break it into three methods:

ConvertFahrenheitToCelsius()

CheckFahrenheitForBelowAbsoluteZero()

RoundCelsius()

Now only a few tests are run against each of these methods. The tests have indicated the complexity of the code. Unit tests can suggest other coding issues, such as highly coupled objects that require many auxiliary objects to be created in order to execute the tests.
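Translated into Python for illustration (the article's method names are PascalCase; snake_case equivalents are used here), the refactored design might look like:

```python
ABSOLUTE_ZERO_F = -459.69  # value used in the article's examples

def check_fahrenheit_for_below_absolute_zero(fahrenheit):
    # Single responsibility: validate the input.
    if fahrenheit < ABSOLUTE_ZERO_F:
        raise ValueError("temperature below absolute zero")

def round_celsius(celsius):
    # Single responsibility: the two-decimal output rule.
    return round(celsius, 2)

def convert_fahrenheit_to_celsius(fahrenheit):
    # The conversion now reads as three small steps, and each helper
    # attracts only the handful of tests that concern it.
    check_fahrenheit_for_below_absolute_zero(fahrenheit)
    return round_celsius((fahrenheit - 32) * 5 / 9)
```

Each helper now needs only a few unit tests of its own, which is the cohesion signal the article describes.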

Unit tests are always automated (there may be some exceptions out there), so they can easily be run as regression tests to ensure that changes do not break anything.

Testing

Acceptance tests and unit tests are not the only functional tests that should be run against an implementation. Someone with testing focus might come up with additional tests, such as:

Temperature Conversion Tests

| Fahrenheit | Celsius? | Notes                       |
| ---------- | -------- | --------------------------- |
| -40        | -40      | The two scales cross at -40 |

 

 

Whether additional tests are necessary depends on the cost-benefit tradeoff of creating and executing additional functional tests.

To be complete, the right-hand side of the diagram previously referred to looks like this:

Usability Testing

Exploratory Testing

Property Testing (stress, performance)

 

These tests revolve around the quality of the implementation, rather than the functionality. They also need to be performed to ensure an implementation is of high quality.

 

Summary

Acceptance tests are customer-facing. The ATDD process creates them in order to clarify and detail the requirements. The tests may be run manually or be automated.

Unit tests are developer-facing. The TDD process creates them to detail the behavior of modules. Unit tests are automated. The tests help in evaluating the quality of the code.

 

 


About the author | Ken Pugh

Ken Pugh is a fellow consultant with Net Objectives (www.netobjectives.com). He helps companies transform to lean-agility through training and coaching. His particular interests are in communication (particularly effectively communicating requirements), delivering business value, and using lean principles to deliver high quality quickly. He also trains, mentors, and testifies on technology topics ranging from object-oriented design to Linux/Unix. He has written several programming books, including the 2006 Jolt Award winner, Prefactoring. His latest books are Lean-Agile Acceptance Test Driven Development: Better Software Through Collaboration and Essential Skills for the Agile Developer. He has helped clients from London to Boston to Sydney to Beijing to Hyderabad. When not computing, he enjoys snowboarding, windsurfing, biking, and hiking the Appalachian Trail.


