The Lean Post
Sharing how the world is making things better through lean.

Technology-Driven Improvement Initiatives (and Other Amazing Magic Tricks)

by Tim Kane
April 26, 2017


"All my tubes and wires
And careful notes
And antiquated notions…"

-Thomas Dolby, “She Blinded Me With Science”

Over the years I have noticed that successful improvement projects tend to share some key traits. Projects that employ an evidence-based methodology fare far better than those that do not. Some of these projects self-identify as “lean” and some do not, but all are rooted in the same fundamental concepts. PDCA, DMAIC/DMADV, Requirements-Design-Development-Testing-Maintenance (Agile/Scrum), and Empathize-Define-Ideate-Prototype-Test (the “flexible flyer” of recent PDCA spins?) are all variations on the same scientific-method theme.
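The skeleton these methods share can be sketched as a simple iterate-until-the-evidence-supports-it loop. This is only an illustration of the common structure — the function and step names below are my own, not part of any of the named frameworks:

```python
def pdca(current_state, hypothesize, run_experiment, target, max_cycles=10):
    """A toy Plan-Do-Check-Act loop: iterate until the metric meets the target."""
    state = current_state
    for _ in range(max_cycles):
        countermeasure = hypothesize(state)               # Plan: propose a change
        result = run_experiment(state, countermeasure)    # Do: pilot it and measure
        if result["metric"] <= target:                    # Check: does the evidence confirm?
            return result                                 # Act: adopt the change
        state = result                                    # Act: adjust and iterate
    return state

# Toy usage: a lead-time metric that halves with each experiment.
result = pdca({"metric": 28},
              hypothesize=lambda s: "reduce handoffs",
              run_experiment=lambda s, c: {"metric": s["metric"] / 2},
              target=7)
print(result["metric"])  # 7.0
```

The point is that every framework above, whatever its branding, cycles through hypothesis, experiment, and evidence before committing to a solution.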

In my experience, projects based on the scientific method are more likely to succeed than projects focused primarily on Information Technology (IT) improvements. I am grateful to the smart folks at Gartner for branding their technology-market analysis work “Magic Quadrants,” because when we rely too heavily on market-leading software solutions to solve our bigger problems, we tend to abandon strategy-driven problem solving disciplines, relying instead upon technology-focused, solution-presuming, pseudo-scientific IT magic. I’ll take Hoshin over Houdini any day.

Culturally, perhaps the most fundamental impediment to achieving the ever-elusive results-based accountability ideal is widespread belief in IT magic or, put another way, the lack of strategic, evidence-based rigor in tech-driven problem-solving practices. Kentaro Toyama, author of “Geek Heresy: Rescuing Social Change from the Cult of Technology,” makes a compelling case in an MIT Tata Center interview, as does lean IT guru Steve Bell in a YouTube video.

I suggest there are two main reasons why technology-focused initiatives so often over-promise and ultimately under-deliver. For each reason I offer a remedy, rationale and real-life example.

Reason 1: Underestimating Cost and Complexity.

We love fixing big problems with big, comprehensive, out-with-the-old-in-with-the-new solutions. But while the efficiencies to be realized through an IT solution may seem self-evident, implementation invariably proves far more costly and complicated than estimated.


The Remedy: Pursue cheaper, simpler non-IT opportunities for improvement FIRST and save the big tech for later.

The Rationale:
  • IT problems are never the sole cause of a performance gap or broken process and rarely are the primary cause.
  • The bigger the IT solution, the more dubious the ROI (see: most enterprise application implementations since the dawn of time).
  • Applying a technology solution to a broken process without first pursuing non-technology opportunities for improvement virtually ensures that:
    • Process problems persist, in spite of shiny new technology;
    • Technology solution implementation will take longer and cost more, because customer value, underlying processes and real business needs are not adequately understood.
  • Without adequate current state process analysis, external consultants (or well-meaning internal IT professionals) end up making faulty assumptions to fill in critical gaps of understanding. Subsequent project time, cost and quality issues can often be traced back to those uninformed assumptions.

A Real-Life Example:
Our client, a health insurer, had a problem: their new-group enrollment process had a lead time of 28 days, against an industry average of under 10. On our first day onsite, the COO gave us a directive that seemed unduly restrictive to some of us, but that we all eventually came to understand as a key enabler of our success: TECHNOLOGY IS OFF THE TABLE! With tech off the table, all we could do was map the process, sort the value from the waste, maximize the former, minimize the latter, pilot-test and confirm our hypotheses, restructure and retrain the enrollment team accordingly, and track the metrics. The new process was implemented in less than three months and, running on legacy technology and paper forms, delivered a sparkling new lead time of 5-7 days, best-in-class!
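Tracking the metric was as simple as it sounds: measure the lead time per enrollment and compare the distributions before and after the change. A minimal sketch — the dates below are invented for illustration, not the client's actual data:

```python
from datetime import date
from statistics import median

def lead_time_days(received, completed):
    """Calendar days from application receipt to completed enrollment."""
    return (completed - received).days

# Hypothetical (received, completed) pairs, before and after the redesign.
before = [(date(2016, 3, 1), date(2016, 3, 29)),   # 28 days
          (date(2016, 3, 7), date(2016, 4, 3)),    # 27 days
          (date(2016, 3, 14), date(2016, 4, 12))]  # 29 days
after = [(date(2016, 9, 5), date(2016, 9, 11)),    # 6 days
         (date(2016, 9, 12), date(2016, 9, 17)),   # 5 days
         (date(2016, 9, 19), date(2016, 9, 26))]   # 7 days

print(median(lead_time_days(r, c) for r, c in before))  # 28
print(median(lead_time_days(r, c) for r, c in after))   # 6
```

No new technology required: the same arithmetic works on dates pulled from paper logs or a legacy system.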

Most importantly, that quick win served as the “ah-HA!” moment that led leadership to fix their long-suffering platform implementation project. Process analysis would now inform requirements gathering in a meaningful way, with internal process champions, not outside software consultants, leading the charge.

Reason 2: Techies with the best intentions can sometimes be “blinded with [information] science” (aka the Thomas Dolby Effect). 

For some IT leaders and analysts, a predisposition toward technological solutions can cloud (pun intended) the ability to acknowledge larger strategic considerations and/or recognize non-technology opportunities for improvement. To these well-meaning folks, software solutions are simply “poetry in motion,” and the biggest problems are best “solved” by big technology solutions.  In other words, the most fervent believers in IT magic can be the IT magicians themselves.


The Remedy: Reject the notion that technology expertise is the primary skill required to lead or facilitate an improvement project – even (or perhaps especially) for enterprise application implementation projects (ERP, CRM, etc.).

The Rationale:
  • Acknowledge that deep IT expertise can be accompanied by a deep bias toward technology-focused “pre-solutioning”.
  • Non-technologist (or less technically-savvy) project leaders and facilitators may be more likely to pursue a tech-neutral, evidence-based, strategy-driven approach to problem solving.  Favor lean (or other quality discipline) expertise and business acumen over IT expertise for project leadership and facilitation.
  • When software solutions drive analysis, the focus is often on “users” and “fields”.  By employing strategy-driven, evidence-based analysis, even in software implementation projects, the focus stays where it needs to be -- on “stakeholders” and “customer value”.
  • “Requirements gathering” does not equal adequate current state analysis – an important distinction not always recognized by IT leaders and analysts.  Before asking what business requirements a software solution needs to address, business analysts should be asking these questions (among others):
    • What is the real problem here?  What are its root causes?
    • Can the problem be broken down into smaller (and more readily understandable and solvable) chunks?
    • Who is the customer?  What does the customer value in this process?
    • What are all the viable opportunities for improvement, and which ones should we pursue first?

A Real-Life Example:
On another health insurance engagement, the client was in the market for some slick new “provider credentialing” software.  I was brought in to research the leading solutions (Magic Quadrants, Shazam!) and manage the implementation process.  Luckily, the client didn’t get a burr in their saddle when I advised them to hold their horses on the software project.

First we assessed how providers generally were “managed” and found three distinct departments maintaining provider data in separate, home-grown Excel tools. Turf disputes were common, “siloed” processes persisted, and cross-functional data sharing was anything but de rigueur. Instead of focusing on software, our initial kaizens focused on team building and on replacing arbitrary functional segregation with customer-value-maximizing, flow-enhancing work cells. Addressing root-cause conditions took some time, but it would enable the later successful selection and implementation of a more comprehensive software solution.

Unfortunately, I do not possess an accurate Return-on-Investment (ROI) divining rod that indicates whether any improvement project will deliver the goods as estimated. Without a doubt, technology-driven improvement initiatives fail for many reasons beyond the two I describe. But I have observed that the average post-mortem analysis of a disappointing IT project often does not proceed beyond one or two levels of “why” inquiry and fails to reveal true root-cause conditions. At its roots, I see a problem of technology-based biases and beliefs undermining the reliable, non-blinding science of evidence-based improvement methods.

The views expressed in this post do not necessarily represent the views or policies of The Lean Enterprise Institute.
6 Comments
Gary Mahood April 26, 2017

Hi Tim,

Excellent article!

As a long-time technology leader who has embraced lean over the last number of years, I find that the great majority of the views expressed in your piece mirror my own thinking and experience very closely.

In particular, I think your point that process improvement initiatives should generally precede “computerizing” a process is excellent advice that is not widely followed… So many times the current as-is process is mapped as the user requirement… and guess what? Nothing much improves!

(A small caveat here is that this approach is not appropriate where the initiative in question is designed to introduce some element of digital disruption, or is of a more experimental nature)

In relation to large-scale implementations (e.g. monolithic ERP such as SAP / Oracle), I think your point around dubious ROI is well made. The real difficulty is in quantifying the incremental benefit of going with a large integrated technology solution vs. spending the same amount of time, effort, and money on alternative approaches, e.g. process improvement coupled with smaller “point” solutions.


Tim Kane April 26, 2017

Thanks Gary -- I like your digital disruption caveat.


Ralf Lippold April 26, 2017

Great article, Tim.

It reflects my own experience working at a car OEM for many years. Questioning the current processes, and even more so the observed behaviors of the people in them, was always more valuable than trusting the “experts” that new IT solutions would bring the help needed.

As a passionate lean thinker myself, I have learned and lived since then that jargon and technology don't bring lasting positive impact if the underlying organizational structures and behaviors are not questioned. In the very end it was purely “going to the Gemba” and applying “Humble Inquiry” that let the processes run smoothly again (and even scale further).

When we started to build the plant from scratch, the most profound impact came merely from paper and pencil, bringing everybody onto the same page in almost no time at all.


Tim Kane April 26, 2017

Thanks Ralf.  Nothing quite like the old paper-and-pencil approach, especially for A3s.  I've seen too many teams get hung up on doing A3s in Excel and getting turned off by the formatting constraints.  Smartphone photos of whiteboard A3s might be my favorite facilitation hack of this century (so far).


Venanzio Figliolino April 27, 2017

Absolutely a good article. During my workshops I usually tell the participants that in the first step of improvement, IT solutions are not accepted :-)


Tim Kane April 27, 2017

Sage advice, Venanzio.  Thanks for sharing.

