

[Image: a goal post with the netting removed at the back, suggesting the goal has moved]

Lean AI Journal | The Boundary Is Moving: AI Changes What We Expect from Team Leaders, Managers, and Executives   

By Tyson Heaton

October 22, 2025

AI isn't replacing expertise—it's lowering the threshold where expertise becomes irreplaceable, forcing organizations to rethink leadership at every level.


About a year and a half ago at O.C. Tanner, I was doing a shop-floor visit when a night shift team leader showed me something that changed how I think about AI and manufacturing. He’d taught his iPhone to spot defects in aluminum finish. 

Paint defects were a recurring problem; issues regularly made it past team members. We’d discussed vision systems but always dismissed them as too expensive or requiring specialized integration support. 

This team leader had been in the role for maybe six months and was relatively new to the company. He had moved through the ranks quickly but was still learning the operation. Yet in weeks — not months, not years — he’d built a proof of concept for technology I’d written off as not applicable. He demonstrated what might take experienced operators 12 months of observation to develop. Technology held the complexity that used to live in human expertise. 
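
The article doesn't say how the team leader built his detector (he used an iPhone). As a hint of how low the barrier has become, here is a minimal sketch of the same idea, assuming synthetic data and a crude texture feature in place of a real vision model:

```python
# Illustrative sketch only: a toy surface-defect classifier. It separates
# "good" from "defective" finish images with a learned threshold on a
# texture feature. All data here is synthetic, not from a real line.
import numpy as np

rng = np.random.default_rng(0)

def texture_energy(img: np.ndarray) -> float:
    """Mean squared pixel-to-pixel difference: scratches and blemishes
    raise local contrast, so defective finishes score higher."""
    dx = np.diff(img, axis=0)
    dy = np.diff(img, axis=1)
    return float((dx ** 2).mean() + (dy ** 2).mean())

def make_sample(defect: bool) -> np.ndarray:
    """Synthetic 32x32 grayscale 'photo': smooth finish, optional scratch."""
    img = rng.normal(0.5, 0.02, (32, 32))   # uniform aluminum finish
    if defect:
        r = int(rng.integers(4, 28))
        img[r, :] += 0.4                    # a bright scratch line
    return img

good = [texture_energy(make_sample(False)) for _ in range(50)]
bad = [texture_energy(make_sample(True)) for _ in range(50)]

# "Training": pick the midpoint between the two class means as the threshold.
threshold = (np.mean(good) + np.mean(bad)) / 2

def is_defective(img: np.ndarray) -> bool:
    return bool(texture_energy(img) > threshold)

print(is_defective(make_sample(True)), is_defective(make_sample(False)))
# → True False
```

A modern phone app wraps a far more capable model behind the same interface: show it examples, get a go/no-go signal back.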

Since then the tools have only gotten better. The boundary is moving. Not someday. Now. Having tested similar boundaries for over a decade, I can say the implications for organizational structure are more profound than most people expect. 

Testing the Boundaries: The Rotation Experiment 

At O.C. Tanner, an employee recognition software and custom awards company, we spent a decade systematically rotating manufacturing leaders across operations. Distribution to assembly to CNC milling to casting. Deliberately crossing boundaries most manufacturers treat as uncrossable. 

The goals: Develop leaders who could improve any operation, not just the one they’d spent 20 years mastering. Build lean capability that transferred across contexts. 

O.C. Tanner gave us a natural laboratory for this. We ran one of the most diverse manufacturing operations you’ll find in one company: specialized CNC mills, electrocoating with precise chemistry control, metal alloy refining, cast trophy production with complex heat-treat requirements, high-volume distribution, laser engraving, and complex personalization. Simple to complex. Standard to custom. Forgiving to unforgiving. 

Here’s what we learned about leader rotations: 

  • 18 months minimum: Rotate someone faster, and they couldn’t get significant improvements over the finish line. Teams experienced it as churn — new leader, new priorities, abandoned initiatives. 
  • 5 years maximum: Leave someone longer, and they’d stop learning. Performance plateaued. Fresh perspectives disappeared. 

The rule, however, didn’t apply everywhere. In distribution and simpler manufacturing (fast feedback loops, forgiving errors, well-documented standards) rotations worked beautifully. In die and tooling, casting, and e-coat (slow feedback loops, catastrophic error consequences, high tacit knowledge) rotations were much harder. Three variables explained the difference: 

  • Feedback loop speed: Distribution? See results in hours. Casting? See results in days. Fast feedback means rapid learning. Slow feedback means expensive mistakes before you know you made them. 
  • Consequence of error: Assembly? Rework the unit, try again tomorrow. E-coat? Contaminate the chemistry bath, recovery measured in weeks. 
  • Knowledge type: Documented standards vs. pattern recognition built from 10,000 observations. Explicit knowledge transfers easily. Tacit knowledge that resists documentation transfers slowly, if at all. 
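
To make the three variables concrete, here is a toy scoring sketch. It is my own illustration, not an O.C. Tanner tool; the cutoffs and labels are invented:

```python
# A toy rating of "rotation difficulty" from the three variables in the
# article. The weights and thresholds are invented for illustration; the
# real assessment described in the text was qualitative.

def rotation_difficulty(feedback_hours: float,
                        error_recovery_days: float,
                        tacit_share: float) -> str:
    """feedback_hours: time until results of a change are visible.
    error_recovery_days: time to recover from a serious mistake.
    tacit_share: fraction of critical knowledge that is undocumented (0-1)."""
    score = 0
    score += 1 if feedback_hours > 24 else 0       # slow feedback loop
    score += 1 if error_recovery_days > 7 else 0   # costly errors
    score += 1 if tacit_share > 0.5 else 0         # knowledge resists transfer
    return ["easy", "moderate", "hard", "very hard"][score]

# Distribution: feedback in hours, quick rework, documented standards.
print(rotation_difficulty(4, 1, 0.2))     # → easy
# E-coat: slow feedback, weeks to recover a chemistry bath, deep tacit skill.
print(rotation_difficulty(72, 21, 0.8))   # → very hard
```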

We had advantages most manufacturers don’t: 15-year average tenure meant the team could carry a new manager through the learning curve. A psychologically safe culture meant leaders could say, “I don’t know this yet, teach me” without losing credibility. 

When rotations broke — when humility wasn’t present, when a manager tried to fake expertise — team morale suffered. Quality slipped. The rotation became a tax everyone else paid. 

These three variables showed us where complexity lived. 

Technology Started Carrying More Load 

During the last five years, we watched the complexity barrier drop. We used Confluence (a collaborative workspace for teams to organize information) to build better knowledge access for team members and team leaders. We invested in RPA (robotic process automation) to offload complexity. The rotation experiment showed us where we were lacking, and we systematically addressed gaps. Technology started holding more of what used to require human memory and expertise. 

And now, with generative AI, RAG databases, and plain-language translators, I’m watching that barrier drop faster than I ever expected. Remember those three variables that predicted rotation success? AI shifts the equation: 

  • Feedback loops get shorter when AI-enabled teams and tools surface patterns faster. 
  • Consequences of error drop when AI flags “this is outside normal range” before catastrophic failure. 
  • Knowledge retrieval shifts when AI makes explicit knowledge more widely available, in plain language, with a built-in patient glossary. 
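
As a minimal sketch of the second point, even a plain 3-sigma check can flag a reading as "outside normal range" before the consequence compounds. The e-coat pH numbers below are invented for illustration:

```python
# A minimal "outside normal range" flag: compare a new reading against a
# baseline window with a 3-sigma rule. Real systems would use proper SPC
# charts or learned models; the principle is the same.
from statistics import mean, stdev

def out_of_range(baseline: list[float], reading: float, k: float = 3.0) -> bool:
    """True if `reading` falls more than k standard deviations from the
    baseline mean -- a cue to stop and check before damage compounds."""
    mu, sigma = mean(baseline), stdev(baseline)
    return abs(reading - mu) > k * sigma

# Example: pH of an e-coat bath, normally stable around 6.0 (invented data).
baseline = [5.9, 6.0, 6.1, 6.0, 5.95, 6.05, 6.0, 6.1, 5.9, 6.0]
print(out_of_range(baseline, 6.05))  # normal drift → False
print(out_of_range(baseline, 6.6))   # flag before the bath is ruined → True
```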

The zones don’t disappear. Casting is still harder than distribution. But the boundary moves. Operations that required five years of technical depth might now work at three years. Tasks that needed 18 months of tacit knowledge and recall might work at six months with AI augmentation. 

This is the pattern: AI isn’t replacing expertise — it’s lowering the threshold of complexity where expertise becomes irreplaceable. 

When technology can hold complexity, the question becomes: what do we need humans to do? 

AI Is Changing How Organizations Work 

If AI lowers complexity barriers, the implications ripple through every level of the organization. 

Team leaders become the linchpin. 

In most organizations, the team leader is often a firefighting first responder, a line worker wearing an extra hat, or an admin clerk. That’s if the organization has invested in the role at all. In a world where organizations that evolve more quickly survive, this has to change. 

A team leader’s ability to observe work firsthand, spot emerging practices, and socialize them across the team becomes the core driver of performance. The focus becomes studying and improving the work itself. 

When AI enables rapid localized change (individual team members building their own workflows, solving problems in their own way) it becomes imperative to have someone studying and socializing those changes. The work increases: more variation to track, more methods to evaluate, more decisions about what to standardize. Their role becomes critical: observing work firsthand, ensuring people use the best methods, socializing standards so team performance improves and methods don’t diverge. 

Middle management shifts to cross-functional facilitation. 

Middle management loses much of its historic role as summarizer and interpreter of technical knowledge. AI can aggregate information, spot patterns across data, and generate technical summaries. 

What AI can’t do, however, is navigate the social systems that enable change, build trust across functions, or accelerate the adoption of effective practices across value streams. 

Middle management’s value shifts entirely to cross-functional facilitation. They become the people who can look end-to-end across value streams, not domain protectors who aggregate reports upward. They work on relationship management, social system stability, and helping practices transfer from one area to another. 

At O.C. Tanner, we’d intentionally added layers of middle management (not to aggregate reports, but to facilitate cross-functional improvement work). After 26 years in a lean journey, we knew local optimization wasn’t enough. We needed people looking end-to-end across value streams. 

The problem we encountered was that operations leaders were often the right people for this work, but they got disqualified by presentation polish, insufficient capacity, or narrow focus on just their area (because collaboration is hard). An operations manager who has been a few places knows the value stream intimately. They know that 30 more minutes on the shop floor — observing work firsthand, connecting with people — is valuable time. But then they got pulled into conference rooms to explain it, and someone with better PowerPoint skills became the voice of the project. The operations leader lost authorship of the improvement. 

AI lowers this barrier significantly. The manager who understands the operation can now generate polished communications without needing a translator; stay connected to the work while meeting documentation requirements; and step into cross-functional facilitation confidently. 

Executives get exposed. 

Technical support and functional departments shift from guarding specialized knowledge to enabling integrated flow of value. AI democratizes technical knowledge. You don’t need the legal expert to tell you what the regulation says, you don’t need the compliance specialist to interpret the rule. The technical experts who built their value on “only I know this” lose their moat. What remains is the work AI can’t do: applying judgment in ambiguous situations, navigating the social systems that enable change, helping the organization learn. 

Executives who sit on top of those technical kingdoms get exposed. The ones who relied on being rulers of specialized domains have nowhere left to hide. Their value has to shift entirely to actual strategic work: setting enterprise-wide direction, building shared purpose, making choices about where to compete and how to win. The executives who relied on “knowing more than you” rather than “seeing farther than you” won’t survive. 

Silos that were originally created to protect technical expertise come under pressure. Social capability — navigating relationships, building shared purpose, leading change through doing — rises in value. 

Governance has to adjust to reward those who drive cross-functional flow, not just those who manage within their domains. 

AI’s Impact on Knowledge Work 

Here’s where it gets harder: in knowledge work, divergence is invisible. 

On the manufacturing shop floor, when you leave an operation and come back three weeks later, you can see what’s drifted just by observing. Without the proper leadership routines, things shift: team members come up with creative “hacks” to keep the line running, adjust equipment to cover the slop that gets passed to them, and new tools and techniques emerge. And you can see it. 

In knowledge work, you can’t. Just this morning, someone asked where our video assets were stored. We discovered seven different storage locations. Files had migrated over the years — partly based on where content was created, partly based on people’s preferences, partly due to simple drift. 

Ask a team what tools they use to complete the same editing function, and you’ll hear: Grammarly, ChatGPT, Claude, a human editor, or combinations of these in different orders. All different. All invisible unless you ask. 

On the sales side, the techniques people use to capture insights from calls are all over the board. Different tools, different methods, different levels of detail. No one knows until you observe the actual work firsthand, and even then you might have to ask about connections and thinking. 

Here’s the problem: in manufacturing, I can walk the floor and spot tool modifications. In knowledge work, I’d have to leverage paired working to extract how they’re actually working. I’d need to dig into the prompts they’re using, the instructions they’ve saved, the workflow automations running in the background. The work methods are hidden. 

And with AI, this divergence accelerates. Everyone now has access to powerful tools that let them build their own workflows, create custom solutions, develop personal systems. The speed at which methods can diverge — and the sophistication of those divergent methods — is unprecedented. 

The acceleration of divergence with knowledge work makes all three organizational levels even more critical: 

  • The team leader has to actively study how work gets done, member by member — what technology they’re using, what frameworks they’re applying, what invisible modifications they’ve made to their workflows. Without this, you get what we’re all seeing: Work slop. Inconsistent outputs. Quality drift. People solving the same problem in seven different ways, none of them talking to each other, all of them slowly degrading from whatever standard existed. 
  • The middle manager as cross-functional facilitator becomes essential. When everyone has different invisible workflows, coordination costs explode. Someone has to look across the value stream and ensure methods align where they need to, diverge intentionally where variation adds value, and don’t just drift randomly. 
  • The executive who was hiding behind domain expertise has nowhere left to hide. AI democratizes technical knowledge. What remains is the work AI can’t do: setting enterprise-wide direction, building shared purpose, making choices about where to compete and how to win. 

What AI Doesn’t Change 

AI can’t solve the social and political problems that made rotations hard. 

We needed team members willing to cover gaps without resentment. Peer managers willing to share resources. Executives patient enough to accept ramp times. A culture where “I don’t know yet” was acceptable. 

AI helps with knowledge transfer. It doesn’t help with the frustrated team member, the protective peer manager, the impatient executive, or the political environment where admitting gaps kills your career. 

And humility becomes more important, not less. 

Here’s the new failure mode: A manager rotates into a complex operation. They’re uncomfortable with gaps. They ask AI for guidance. AI gives a plausible answer — because AI is always plausible. The manager implements it without checking with experienced team members because that would reveal they needed help. 

Result: AI-enabled false confidence. This is worse than manually faking expertise. Because now they have a credible-sounding source: “The AI said to do it this way.” 

Rotations only worked when leaders admitted gaps openly, used the team’s knowledge to validate their thinking, and moved at the pace of genuine learning. AI tempts leaders to skip validation. “I don’t need to ask my team — I asked the AI.” 

We need leaders who can say: “The AI suggested this. Does that make sense given what you’ve seen? What am I missing?” Leaders who treat AI as a bypass for humility will create damage faster than before. 

The Reality Check 

Here’s what will probably happen. Most organizations will set expectations for executives to implement AI quickly. They’ll stand up key performance indicators around AI use and the number of jobs cut. Specialized domains will move quickly to build their own instances and their own data governance tools, specifically sanctioned for their expertise. Social and class dynamics (who gets AI tools and who doesn’t) will play out in predictable ways. 

For many, AI will amplify an already unhealthy culture of politics, competing work systems, and management classism. But a few organizations will emerge as having “magically figured it out.” 

A few organizations with the right mix and the courage to build capability will see the potential in this new technology. They’ll use it to set a new standard — not just for operational excellence, but for human respect. They’ll leverage this potential to carve out new models and ways of working. 

They’ll be the ones where team leaders become the linchpin because observation and practice matter more than credential and tenure. Where middle management focuses on cross-functional value stream work because AI handles the summarizing. Where executives get exposed if they’ve been hiding behind domain knowledge instead of setting real direction. Where technical departments enable flow instead of guarding expertise. 

That’s the bet worth making. Not because AI solves everything. But because it lowers enough barriers that organizations willing to do the hard cultural work — the work of building humility, psychological safety, and genuine collaboration — can finally design the structures their lean principles always implied. 

The boundary is moving. The question isn’t whether AI changes what’s possible. It’s whether you have the courage to build the capability and culture to make those possibilities real. 


What are you seeing in your operations? Where is AI lowering barriers you thought were fixed? And what’s stopping you from testing it? I’m still figuring this out — I’d genuinely value hearing what you’re experiencing: theaton@lean.org 

Get insights into AI applications through the lens of Lean Thinking delivered to your inbox. Subscribe to the Lean AI Journal on LinkedIn.



Written by:

Tyson Heaton

About Tyson Heaton

Tyson Heaton is Executive Director of LeanTech/AI and Senior Coach at the Lean Enterprise Institute, where he leads efforts to bridge lean thinking with technology implementation. His background spans manufacturing operations at JBS, Schreiber Foods, and Greencore, followed by leadership roles at O.C. Tanner addressing scalability, legacy system modernization, and supply…

