
Problem? What Problem?

by Brent Wahba
February 16, 2016




Every February, the CEO of a corporation I once knew sent out a similar voicemail message to the whole organization. His point was clear: spend less! Sure, he got those results, but unfortunately, he also created many more unintended consequences. For example, Engineering cut back their development work, including validation and pre-launch plant visits, and as a result, products were just not ready for production on time. Fixing those start-up quality issues then required more travel, more testing, and more expediting of parts. Altogether, these problems (as you’ve probably guessed) killed mid-year budget performance and resulted in another scathing voicemail with even more draconian cuts.

Towards the end of the year, however, departments were actually in danger of underspending their budgets (and thereby losing them the following year), so everyone scrambled to spend as much as they could. But due to the nature of invoicing, many bills arrived and were paid the following January – hence the poor first month performance. Worse yet, customers felt ignored (“How come nobody ever visits us?”) and wondered why a company that promoted its adoption of Lean so much just couldn’t get its act together. The company eventually went bankrupt.

A budgeting process that didn’t align with the realities of the business cycle was just one of their many problems. If you started asking “Why?” about their organizational performance, however, you would see that their leaders were actually horrible problem solvers – despite all their training in Lean, Six Sigma, and other techniques. Having spent three decades in formalized problem solving myself, I have come to realize that there is an underlying pattern to successful problem resolution that isn’t as blatantly obvious as it needs to be in many methodologies. That pattern is treating each problem-solving step and its related assumptions as their own hypotheses.

When we use the scientific method to learn about or fix problems, we create and prove a hypothesis or plausible relationship. Ice cream sales and murder rates are correlated, so maybe our hypothesis is that sugar highs cause overly aggressive behavior. If we dig a little deeper, though, we find the erroneous “correlation = causality” trap and re-hypothesize that both ice cream sales and homicides increase together in warmer weather. We can prove that relationship with data, and while we cannot control the climate, we may be able to experiment with means of changing the interaction. For more complicated / complex problems, we may need to create and prove several separate but related hypotheses throughout our entire problem solving cycle, like this simplified example:

  1. What problem (or gap) are we solving, or is there even a real problem to begin with? Politicians are great at creating problems that don’t exist, but let’s stick with our aforementioned friend, the CEO. He obviously had a fiduciary responsibility to shareholders and a great concern about budget performance, but he didn’t understand that spending in his company shouldn’t be flat (poor assumption), that underspending could actually increase total cost (poor correlation), and that certain investments (like visiting customers) could have positive Net Present Values (poor understanding of value). His first hypothesis of “spending is too high in January and that will just continue throughout the year” was not just wrong, but detrimental to his overall goal of improving total profit. And from what he knew at the time, he couldn’t have been sure he had a real problem to begin with. A “problem statement” is just a hypothesis until it can be scientifically proven.
  2. What is the root cause or causes of this problem? Everybody likes simple, single root-cause explanations, like “The web server was down because Bob tripped on the power cord and knocked it out.” Unfortunately, life is rarely that simple. For example, it was recently reported that both a technical malfunction and the pilots’ response caused the deadly crash of AirAsia 8501 in 2014. And problems get even trickier when we try to change organizational behavior, or deal with variation problems that can all have multiple, interacting and/or constantly changing factors. While we often strive to simplify problems, we also need to make sure we don’t oversimplify them to our detriment – a common problem I often run into coaching executives. Based on little data and a cursory, biased analysis, the CEO incorrectly hypothesized that the root cause of his problem was that “all of his people were simply irresponsible.” A “root cause” is just a hypothesis until it can be scientifically proven. 
  3. Will our proposed solution(s) improve the problem(s) to an acceptable degree? Complex problems won’t always have 100% effective solutions, but for most problems we should be able to test for efficacy through a well-constructed experiment like a test market, pilot team, or DOE (Design of Experiments). We can also improve our confidence of success if we can turn a problem on or off at will and measure the results – like removing ice cream from bad neighborhoods and observing the outcome. After repeated years of unchanged spending patterns, the CEO should have realized that his hypothesized solutions (including threats) were not just ineffective, but compounded the problems. “Solutions” are just hypotheses until they can be scientifically proven.
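The on/off experiment in step 3 can be sketched as a simple permutation test on a hypothetical pilot: did a trial policy change actually lower monthly spend, or could the difference have arisen by chance? (A minimal sketch with the Python standard library; all spending figures are invented for illustration.)

```python
import random
import statistics

random.seed(7)

# Hypothetical experiment: monthly spend (in $k) for 12 baseline months
# versus 12 months after a trial policy change.
control = [102, 98, 105, 110, 97, 103, 99, 108, 101, 104, 100, 106]
pilot   = [ 93, 90,  96,  99, 88,  94, 91,  97,  92,  95, 89,  98]

observed = statistics.mean(control) - statistics.mean(pilot)

# Permutation test: if the policy had no effect, reshuffling the labels
# should produce a drop as large as the observed one fairly often.
pooled = control + pilot
trials = 10_000
count = 0
for _ in range(trials):
    random.shuffle(pooled)
    diff = statistics.mean(pooled[:12]) - statistics.mean(pooled[12:])
    if diff >= observed:
        count += 1

p_value = count / trials
print(f"observed drop: {observed:.2f}k per month, p ≈ {p_value:.4f}")
```

A small p-value here would support the solution hypothesis; a large one would say the “improvement” is indistinguishable from noise – which is precisely the proof step the CEO skipped.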

Whether using A3s, the 5 Whys, DMAIC, Value Stream Mapping, or any other problem-solving methodology, many organizations don’t spend the effort necessary to prove all their hypotheses, and either under-solve the real problem or erroneously solve a non-existent issue. Along the way they are too quick to define their problems, rely on too many unsupported assumptions, or get so impatient and excited to jump to the solution phase that they skip the necessary steps in between. The CEO could have easily created a very good-looking and convincing A3 around his budget issue but still never achieved a resolution. Why? Because he never addressed all his underlying hypotheses.

Don’t become an “anonymous former CEO” – use the scientific method to its fullest and prove ALL your critical hypotheses along the way.

The views expressed in this post do not necessarily represent the views or policies of The Lean Enterprise Institute.
Keywords: culture, leadership, problem solving
3 Comments
John Shook February 16, 2016

Thanks, Brent, for another useful post about an important topic. Problem solving comes in many forms while at the end of the day settling into a few key pieces. Some random thoughts below...

Regarding the various problem solving methods you mention, surely you are correct that it is helpful to keep them in perspective. A related thought: most of those methods are less about “problem solving” than they are about “problem analysis”. DMAIC or 5 Whys don’t solve problems so much as aid in diving into analysis and diagnosis. They can and should also be used to frame experiments to actually solve the problem, but most folks stop short, and once they think they’ve identified a cause that feels comfortable they then quickly apply the countermeasure that is most to their liking. A value stream map, for example, doesn’t solve any problems; it highlights problems in the current state and gaps between the current and target state so that experiments can be run, countermeasures applied, using the various lean tools.

A few years ago, I stopped referring so much to the “scientific method” – with its familiar focus on developing a "hypothesis" (which ordinarily is meant to be disproven) – referring instead simply to “scientific thinking” or, better yet, simply “science”. You know, John Dewey, in his foundational articulation of the scientific method one century ago, referred to three types of scientific experiments, only one of which entailed a “hypothesis” (the other two were experiments of comparison – “how does a compare with b” – and simple trial and error – “what would happen if…?”). The list of discoveries that came about in the absence of a hypothesis is endless. Ah, before I get attacked for heresy, I should add that, yes, in the great, great majority of instances, we are best served to have a hypothesis. If nothing else, practicing with hypotheses helps us practice and learn to be better learners.

Your discussion of the use of the various tools reminds me of Jon Miller’s insightful comment in response to Norbert Majerus’s recent post about the A3 – what’s important is to conduct some form of P-D-C-A; the tools (A3 or whatever) are frameworks for applying PDCA under different conditions. That’s all. And in regard to solving problems as science, the “P” of PDCA is critical! Whether formulated with a clear hypothesis or not, the Plan frames the experiment for effectiveness (we have a problem to solve here!) and for learning (learning that is truly validated for future use in the form of what Al Ward and Durward Sobek call usable and reusable knowledge).

By the way, I’m betting I know the former CEO you quoted. Wonder if the tactic won a fat bonus for him that quarter? - john


Brent Wahba February 17, 2016

Thank you for your insights, John.  Agreed - there is much more to applying science within organizations than proving or disproving hypotheses.  Unfortunately, as you point out, there are often huge gaps between familiarization with the appropriate analysis tools, understanding how to apply them, and then actually solving real & important problems within a larger framework. 

The company I referenced thought they were doing so well with Lean, but in actuality couldn't see how to apply "problem solving" beyond crisis-level technical issues (much less learn how to prevent future problems).  Sadly, I think this is more often the norm.


PS I think you are right about the CEO's fat bonus.  Maybe the board should have used PDCA for executive compensation?       


