The Bonanza Syndrome

May 20, 2011 — Posted by Scott Bain

Years ago there was a very popular television program called "Bonanza". It was about a father and his three sons living on a ranch in the Lake Tahoe, Nevada area in the 1860s. It ran on NBC from September 12, 1959 to January 16, 1973, fourteen seasons in all, which made it the second longest-running western (just after "Gunsmoke"). Everyone I knew watched it and knew all about the characters and plots.

But no one asked a very simple question: Why is this show called "Bonanza"?

The family was called Cartwright, they lived on the Ponderosa, the nearest town was Virginia City… there is no "Bonanza" anywhere. When I brought this up a couple of years ago a friend tried to explain that there was probably a gold or silver rush in the area at that time, a real bonanza for the area. But the show was not about mining, the characters were ranchers, not miners, and nobody ever mentioned a gold or silver rush even once.

It makes no sense. The point I am focused on, however, is not just that it makes no sense but that the show played for fourteen years and nobody I know of ever questioned its name. Not once. I finally asked the question thirty years after the show was canceled.

Another example: I've been hearing this song on the radio and in advertising and so forth for a long, long time. I'm not fond of it, but I cannot help but be very familiar with its lyrics. It's the Scorpions' "Rock you like a hurricane."

You know it, it's the one that goes "Here I am/Rock you like a hurricane" over and over.  And over.

For years (this song came out in 1984) I heard this without once noting… rock you like a hurricane? Hurricanes blow, they don't rock. That's like saying "I'm as hungry as a lemon", or "my head is spinning like a horse", or "he arrived as quickly as a bar of soap."

It makes no sense. But I heard this probably 100 times (or more, unfortunately) before it even occurred to me that it makes no sense.

I do not think I am unusual in this sense. I think people tend to accept what they hear without much critical thought. Most of the time this is probably not all that important, but it really can be when we accept requirements for a system.

For example, in our Sustainable Test-Driven Development class we conduct an exercise where students develop a device driver for a paint-mixing system, and there are a number of business rules they have to write tests about. We roll these rules out, in an agile fashion, as development stories. They write the tests and then write the code to satisfy them; a very typical test-first approach to TDD.

One requirement that comes along is this: There are currently two kinds of paint in this system, glossy paint and flat paint. But there is a very important rule about mixing, namely that you cannot mix glossy paint and flat paint together. The customer warns that doing so would not work, chemically, that the result would be a gloppy mess of useless goo.

Sounds reasonable. Until you try to write a test for it.

First of all, what does it mean to say "you can't"? Of course I can. Watch! (imagine Scott pouring two cans of paint into a mixer). See? The customer means "you mustn't", rather than "you can't".

And, of course, the real question is… what should the system do if someone tries? This is missing. The requirement should be "if someone tries to mix gloss and flat, do the following." It could be shutting the system down, throwing an exception; you could probably imagine other possible actions. But the point is that this is a customer decision, and therefore is a missing requirement.
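Once the customer makes that decision, it finally becomes something a test can pin down. Here is a minimal sketch, assuming (purely for illustration; `Mixer`, `Finish`, and `BadPaintException` are hypothetical names, not the actual class exercise code) that the customer chose "reject the operation with an exception":

```java
// A sketch only -- the names here are hypothetical, and "throw an
// exception" is just one of the responses the customer might choose.
enum Finish { GLOSS, FLAT }

class BadPaintException extends RuntimeException {}

class Mixer {
    // The customer's decision made explicit, so a test can be
    // written against observable behavior rather than against "you can't".
    static void mix(Finish a, Finish b) {
        if (a != b) {
            throw new BadPaintException();
        }
        // ...otherwise, proceed with mixing...
    }
}
```

The point is not the exception itself; had the customer chosen "shut the system down" instead, the test would look different, which is exactly why the decision must come from the customer.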

But even if we get the answer to "what to do if", the requirement is still not clear.

You must not mix gloss and flat. Does that mean precisely what it says? Or does it really mean "the two cans of paint you mix must have the same finish"? Those may be equivalent requirements now, but they will not be when a third finish, say semi-gloss or matte, is added to the system. Will the rule still apply to them, or is this a special case only involving gloss and flat?

Not only would this influence how we tested the system, but also how we implemented the rule to pass the test:

if (finish1 != finish2) throw new BadPaintException();

vs

if ((finish1==Finish.Gloss && finish2==Finish.Flat) ||
    (finish2==Finish.Gloss && finish1==Finish.Flat))
    throw new BadPaintException();

Again, not at all the same thing, but the requirement does not make it clear which is right… and the maintenance path for the code would be quite different if we chose the wrong one.
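To see that divergence concretely, here is a hedged sketch (hypothetical names throughout; the third finish is assumed for illustration) of both readings side by side once semi-gloss enters the system:

```java
// Hypothetical sketch: the two readings of the rule agree while only
// gloss and flat exist, but disagree about a newly added semi-gloss.
enum Finish { GLOSS, FLAT, SEMI_GLOSS }

class BadPaintException extends RuntimeException {}

class Rules {
    // Reading 1: the two finishes must match exactly.
    static void requireSameFinish(Finish a, Finish b) {
        if (a != b) throw new BadPaintException();
    }

    // Reading 2: only the specific gloss/flat pairing is forbidden.
    static void forbidGlossWithFlat(Finish a, Finish b) {
        if ((a == Finish.GLOSS && b == Finish.FLAT) ||
            (b == Finish.GLOSS && a == Finish.FLAT)) {
            throw new BadPaintException();
        }
    }
}
```

Mixing semi-gloss with flat is rejected under the first reading but sails through under the second; which of those the customer actually meant is precisely the missing requirement.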

So I call this "The Bonanza Syndrome". Not that customers give us requirements in this vague and misleading state, but that we accept them as reasonable and complete when they are not. One of the powerful things about techniques like Test-Driven Development (both unit testing and acceptance testing) and Commonality-Variability Analysis is that they force us to apply critical thinking to requirements, and probe them for their actual meaning.

Which, of course, is one reason we teach them. :)

-Scott Bain-


About the author | Scott Bain

Scott Bain is a consultant, trainer, and author who specializes in Test-Driven Development, Design Patterns, and Emergent Design.


Comments

Hear, hear.

I would, however, suggest that in the example you gave, the unit tests would look different to represent our different understandings of the requirement.

{
    // Mustn't mix paint if the finish styles are not the same
    Paint paint = Any.OneOf<Paint.FinishStyle>();
    Paint differentFinishPaint = Any.OneOfExcept<Paint.FinishStyle>(paint);
    AssertExceptionThrown(Paint.Mix(paint, differentFinishPaint));
}

vs.

{
    // Mustn't mix Flat and Gloss
    Paint glossPaint = Paint.GetInstance(FinishStyle.Gloss);
    Paint flatPaint = Paint.GetInstance(FinishStyle.Flat);
    AssertExceptionThrown(Paint.Mix(glossPaint, flatPaint));
}

vs.

{
    // Mustn't mix Gloss with anything else
    Paint glossPaint = Paint.GetInstance(FinishStyle.Gloss);
    Paint nonGlossPaint = Any.OneOfExcept<Paint.FinishStyle>(glossPaint);
    AssertExceptionThrown(Paint.Mix(glossPaint, nonGlossPaint));
}

I would also say the Bonanza Syndrome applies even further in our professional lives.

  Why do we call methods 'methods'? Methods of what?

  Why do we call Test Driven Development 'Test Driven Development'? It's not really about testing.

I can give more examples, but I have to get to AgilePalooza and talk about Short User Stories, and I gotta get there quicker than a bar of soap!
