
The 20% of Work AI Can't Do Is the Only Work That Matters
Last week, I watched a room full of successful and experienced leaders discover they'd been using AI like a vending machine.
Put in a question, get an answer, move on. Fast, efficient, done.
Except it wasn't done. Not really.
We were halfway through a webinar on prompting when I asked participants to try something different: take the same question they'd asked AI before, the spontaneous, natural-language kind we all default to, and ask it again. This time with structure. Specificity. Clear boundaries. An actual goal beyond "give me something."
The difference in outputs wasn't incremental. It was staggering.
Same tool. Same people. Completely different quality of thinking.
And that's when it hit them. The problem wasn't the AI. It was the abdication. They'd been letting the tool do the thinking instead of making it think with them.
This matters more than most leaders realize.
Because right now, in boardrooms and strategy sessions across every industry, we're watching something dangerous unfold:
A generation of leaders is falling in love with speed and mistaking it for progress. They're using AI to produce faster, not think better. And in doing so, they're handing over the one thing that actually sets them apart: strategic judgment.
We're at a fork in the road. One path leads to efficiency. The other leads to irrelevance. And the difference between them isn't the technology. It's whether we remember how to use our brains.
The Seduction of "Good Enough"
Here's what's happening in organizations right now.
A marketing director needs a campaign brief. She opens ChatGPT, types "create a campaign brief for our new product launch," and thirty seconds later, she's got something. Sections, bullet points, even a timeline. It looks professional. It sounds right. She makes a few tweaks, sends it to her team, and moves to the next task.
Efficient? Absolutely.
Strategic? Not even close.
Because what she got was a template dressed up as thinking. The AI gave her the structure of a campaign brief without understanding her market, her competitors, her brand's actual positioning, or the nuanced challenge she's trying to solve. It couldn't. It doesn't know those things unless she tells it, in detail, with precision.
But she didn't. Because she didn't realize she needed to.
This isn't her fault. The AI is designed
to be helpful,
to fill in gaps,
to anticipate what you might want.
And that helpfulness is the trap.
When you ask a vague question, AI gives you a vague answer wrapped in confident language.
It offers you next steps.
It suggests directions.
It acts like it knows where you're going.
And if you're not careful, you follow along, letting the tool's logic replace your own.
I see this constantly. Brilliant, experienced, sharp leaders handing over their strategic thinking to an algorithm because it feels like collaboration. It's not.
It's outsourcing.
And the cost is invisible until it's too late. Until every company's strategy starts sounding the same. Until your team can't tell your voice from the AI's. Until you realize you've spent six months being productive without actually moving the business forward.
What Actually Happened in That Webinar
Back to that room full of leaders.
We started with a simple exercise. I asked them to use AI the way they normally would. Someone wanted help with a client proposal. Another needed a framework for team restructuring. A third was drafting a keynote.
They typed their prompts, natural and conversational, the way we all do. "Help me write a proposal for a consulting engagement." "Give me ideas for restructuring my team." "Draft an opening for my keynote on leadership."
The AI responded. Competently. Generically. Safely.
Nothing wrong with the outputs, technically. But nothing remarkable either. The kind of content that exists everywhere already, reshuffled and repackaged.
Then we did it again. Same questions. Different approach.
This time, I had them structure their prompts.
Define the goal precisely.
Describe the audience.
Specify the tone.
Provide context the AI couldn't guess.
Set boundaries on what they didn't want.
Give examples of what good looked like.
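To make this concrete, here's a hypothetical sketch of what a structured version of the proposal prompt might look like (the client details are illustrative, not what any participant actually wrote):
"Write a first-draft proposal for a six-week consulting engagement with a mid-sized logistics company whose leadership team is split on growth priorities. Audience: a CEO and CFO who are skeptical of outside consultants. Tone: direct and practical, no jargon. Anchor everything to three outcomes tied to their stalled expansion plan. Do not include a generic list of services or boilerplate about methodology. A good proposal reads as if we already understand their problem."
Every element is there: a goal, an audience, a tone, context the AI couldn't guess, boundaries on what to leave out, and a definition of what good looks like.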
The difference was immediate.
The consulting proposal went from a list of services to a tailored pitch that spoke directly to the client's actual pain points. The restructuring framework moved from org-chart theory to a plan that accounted for the team's real dynamics and constraints. The keynote opening went from motivational fluff to a story that landed with purpose.
Same people. Same tool. Radically different quality.
And here's what they realized: the AI didn't get smarter. They did.
They stopped asking the AI to do their thinking and started using it to extend their thinking.
They brought
clarity,
judgment,
and direction.
The AI brought
speed,
structure,
and processing power.
Together, they produced something neither could do alone.
But here's the part that surprised them most.
I asked them to take the AI's improved output and refine it. Not accept it as-is, even though it was already better.
Push it further.
Add the nuance only they knew.
Cut the parts that sounded right but weren't quite true.
Inject the human context, the lived experience, the strategic intuition that no prompt could ever capture.
That last 20 to 30 percent? That's where the magic happened.
That's where the generic became specific. Where the competent became compelling. Where the output stopped being something AI generated and became something they created, with AI's help.
And that's the difference most leaders are missing.
The Real Trap: AI's Helpful Offers
There's a moment in almost every AI interaction that I've come to recognize as dangerous.
You've just gotten a response. It's pretty good. And then the AI offers you next steps.
"Would you like me to expand on section three?"
"Should I create a timeline for implementation?"
"Here are five additional angles to consider."
It feels helpful. Collaborative, even. Like the AI is anticipating your needs.
But here's what's actually happening:
you're being pulled into the AI's structure. Its logic. Its assumptions about what comes next.
You came in with a plan:
A structure you'd thought through.
A direction that made sense for your specific context.
And now, subtly, you're following the AI's lead instead of your own.
This is where friction dissolves. Where you lose the thread of your original thinking. Where the process that was supposed to save you time starts eating your strategic clarity instead.
I watched this happen in real time during the webinar. Someone would get a solid response, and the AI would suggest three follow-up questions. Reasonable questions. Relevant, even. And the person would reflexively answer them, moving deeper into the AI's framework, further from their own.
Ten minutes later, they'd look up, confused. "Wait, how did I end up here? This isn't what I was trying to solve."
Because they'd stopped steering.
The antidote is simple but requires discipline: ignore the AI's offers unless they align with where you were already going.
You're the strategist.
You set the direction.
The AI is there to execute your thinking faster, not to think for you.
This is harder than it sounds. Because the AI's suggestions often sound smart. They create the illusion of momentum. And when you're busy, momentum feels like progress.
But momentum in the wrong direction is just expensive distraction.
What We're Really Losing: Strategic Thinking
Let's talk about what's actually at stake here.
When leaders use AI as a shortcut instead of a thought partner, they're not just producing mediocre outputs. They're letting the skill that matters most atrophy: strategic thinking.
Strategic thinking isn't about having the right answer. It's about asking the right questions.
Seeing patterns others miss.
Connecting disparate information into insights.
Knowing what matters and what doesn't.
Making judgment calls when the data is incomplete.
You can't outsource that to an algorithm.
AI can process more information than any human. It can identify patterns, generate options, simulate scenarios. It's brilliant at breadth.
But it has no depth. No context. No stakes.
It doesn't know what it's like to sit across from a client who's about to lose their business. It hasn't felt the tension in a leadership team that's losing trust in each other. It can't read the room when a strategy is technically correct but culturally impossible.
That's your job. That's the human work.
And when you skip the refinement, when you take the AI's output and run with it because it's "good enough," you're not saving time. You're practicing a form of strategic abdication.
Over time, that abdication becomes a habit.
You stop questioning.
You stop refining.
You stop thinking deeply because the AI gave you something that sounds reasonable.
And then one day, you realize you've lost the edge. Your competitors are still thinking. You're just executing faster.
Speed without strategy is just expensive motion.
The Evolutionary Moment We're In
Here's what I believe we're witnessing: an evolutionary moment in how humans think and lead.
Every major technological shift has forced us to redefine what makes us valuable.
The printing press didn't make scholars obsolete; it made memorization less critical and synthesis more important. Calculators didn't eliminate mathematicians; they freed them to work on problems machines couldn't solve.
AI is the same. It's not replacing thinking. It's changing what kind of thinking matters.
The leaders who thrive in the next decade won't be the ones who use AI fastest. They'll be the ones who use it most strategically. Who understand that the tool's value isn't in what it can do alone, but in how it amplifies what they already bring: judgment, experience, context, intuition.
This requires a shift in mindset.
Stop seeing AI as a productivity hack.
Start seeing it as a collaborator that needs direction.
Stop asking it to do your thinking.
Start using it to extend your thinking.
Stop accepting its outputs as final.
Start treating them as drafts that need your refinement.
The 20 to 30 percent you add at the end, the part where you bring your humanity, your expertise, your strategic sense, that's not extra work. That's the entire point.
Because that's the part that can't be copied. That's the part that sets you apart. That's the part that turns competent into exceptional.
AI gives you time back. The question is: what are you doing with it?
Are you using it to do more tasks? Or are you using it to think more deeply about the tasks that actually matter?
What Changes When You Get This Right
When leaders figure out how to collaborate with AI instead of outsourcing to it, something shifts.
First, the quality of their work jumps. Not incrementally. Dramatically. Because they're combining the speed and processing power of AI with the judgment and nuance only humans have.
Second, they get faster at the things that matter and slower at the things that don't. AI handles the scaffolding, the structure, the first draft, the research synthesis. They focus their time on the strategic calls, the refinements, the human touches that make work resonate.
Third, they stop feeling overwhelmed. Because they're not drowning in AI-generated options. They're steering the process. They're in control.
And fourth, they start to see AI not as a threat to their relevance but as proof of it. Because the better the tool gets, the more obvious it becomes that the human part, the strategic part, the judgment part, is what actually creates value.
This is the shift we need more leaders to make.
Not rejecting AI.
Not blindly adopting it.
But learning to use it the way you'd use any powerful tool: with intention, skill, and respect for what it can and can't do.
The Collaboration That Actually Works
So what does good collaboration with AI actually look like?
It starts with clarity. Before you ask the AI anything, get clear on what you actually need. Not "help me with this." But "here's the specific problem, here's the context, here's what good looks like."
Then you structure your prompt. You give the AI enough information to produce something useful. You set boundaries so it doesn't wander into irrelevance. You provide examples if you have them. You define the goal.
You let the AI generate. And you resist the urge to accept the first output as final, no matter how impressive it looks.
You read it critically.
You ask: is this actually true? Does this fit my context? Is this strategic or just plausible? What's missing? What needs to be cut?
You refine.
You add the nuance. The lived experience. The judgment calls. The things only you know.
And then, only then, do you use it.
This process takes longer than copy-paste. But it produces
work that's exponentially better,
work that sounds like you,
work that reflects strategic thinking, not template filling.
And once you build this muscle, it gets faster. You learn what kind of prompts produce quality. You get better at spotting what needs refinement. You develop an instinct for when to trust the AI and when to override it.
You become a better thinker because you're forced to articulate what you actually want, with precision. And that clarity doesn't just make the AI better. It makes you better.
What I Want You to Take From This
If you're a leader using AI, or thinking about using AI, here's what I need you to hear.
The tool is powerful. Use it. But don't let it use you.
Don't mistake speed for thinking.
Don't let the AI's confidence replace your judgment.
Don't skip the refinement because the output looks good enough.
Because "good enough" is the most expensive standard you can adopt.
You're a strategist. Your job isn't to execute tasks faster. It's to see what others miss, to make calls others can't, to bring the experience and context that no algorithm has.
AI doesn't threaten that. It amplifies it, if you use it right.
So use it as a thought partner, not a replacement.
Give it direction.
Refine its outputs.
Add the human layer that makes work resonate.
That's where the value is. That's where your irreplaceability lives.
We're at an evolutionary moment. I am absolutely certain that the leaders who figure out how to think with AI, not through it, are the ones who'll define what leadership looks like in the next era.
The question isn't whether you'll use AI.
The question is whether you'll still be thinking when you do.
Do you and your team want to experience how you can do real magic with AI?
Let's schedule a Zoom and talk about it:
https://powerhubforleaders.com/connection-call
I can't wait to see your reaction once you discover the true power of humans in collaboration with AI.
Warmly
Birgit
Your Transformational Ally
