The Thing You Know
Here's what's actually happening.
AI didn't suddenly make humans obsolete. What it did was destroy the market value of one specific thing: producing passable first-draft output. (On Upwork, writing jobs dropped 33%, translation jobs 19%, and customer service jobs 16% after widespread AI adoption. First-draft work didn't decline gradually — it collapsed. Bloomberry, 2024.) Writing a decent email. Summarizing a report. Generating code that mostly works. Building a slide deck. Drafting a plan.
But here's what AI cannot do. It cannot know what should exist in the first place. (It can produce anything you describe. It cannot decide what's worth producing. That decision requires context, values, and stakes — things that come from living, not computing.)
It can write a thousand emails. It cannot know which email needs to be sent, to whom, and why this one matters. It can generate a business plan. It cannot know whether the business should exist. (Generation is cheap now. Judgment about what to generate is the scarce resource. That's what you carry.)
That capacity — knowing what should exist — is yours. It comes from living in the world, dealing with real situations, and building the kind of understanding that only comes from experience, context, and giving a damn about the outcome.
A calculator didn't destroy math. It destroyed the market value of arithmetic. (The skill didn't become less real. It became less rare. That's the pattern every time: the tool commoditizes the execution, and the value migrates to whoever knows what to execute and why.) The people who understood when to multiply and why were more valuable than ever. They just needed to learn to use the calculator.
That's where we are now. Except it's not math. It's everything.
In a controlled experiment with 453 professionals, ChatGPT cut task time by 40% and bumped quality by 18%. The biggest boost went to workers who needed it most — the ones doing routine first-draft work. Noy & Zhang, Science 2023
When 758 Boston Consulting Group consultants used GPT-4, they got 25%+ faster and 40%+ better on tasks AI handles well. But on problems requiring real judgment, performance dropped 19%. The tool helped until it didn't. Dell’Acqua et al, HBS 2023
Most knowledge workers already use AI at work — but only 46% say they actually trust what it gives them. The tool is everywhere; the confidence isn't. KPMG, 2025
What decisions at your work or school could not be made by someone who just arrived?
You already know something AI doesn't.
Everybody carries knowledge that didn't come from a textbook. Maybe you know how to read a room. Maybe you know how to keep three kids fed on nothing. Maybe you know how to navigate a bureaucracy that was designed to make you give up. Maybe you can look at a dataset and feel that something's off before you can explain why.
Whatever it is — you know it in your bones. You built that knowledge by living. Philosopher Michael Polanyi had a name for this: tacit knowledge — the things we know but cannot fully put into words. (You can ride a bike, but try writing instructions for it. Polanyi argued this kind of knowledge is fundamentally untransferable through words alone.) Economists call the automation problem this creates Polanyi's Paradox: you can't automate what you can't formally describe. Tasks requiring adaptability, common sense, and creativity stay stubbornly resistant to automation, no matter how powerful the technology gets. Autor, NBER 2014
Now we're going to test it. Right here.
Here’s the first thing you need to know: your instinct right now will be to think about what you know as a topic. “I know about nursing.” “I know about cars.” “I know about managing people.”
That’s the layer AI handles just fine. Ask it about nursing or cars or management and it’ll give you something decent. Maybe even impressive. The gap doesn’t live there. (Researchers catalogued 15 things AI consistently can't do — and tacit knowledge, intuition, and contextual understanding topped the list. These aren't gaps that better models will close. They're structural. ScienceDirect, 2024.)
Go deeper. Think about a specific moment — a time you made a call that someone without your experience would have gotten wrong.
Maybe you’re a nurse who looked at a patient and knew something was off before the vitals confirmed it. Maybe you’re a parent who met a daycare worker and felt, in your gut, that something wasn’t right. Maybe you’re a mechanic who heard a noise nobody else heard. A manager who sensed a team was about to fall apart. A teacher who knew a student was in crisis, not lazy. Someone who navigated a system that was designed to make you give up — and found the one path that actually worked.
Whatever it is — that’s your thing. Not the topic. The moment. The judgment call.
Below is a live AI. Describe that situation — the circumstances, the stakes, what you were reading that others wouldn’t have seen. Then ask it what it would have advised. Watch what comes back.
That gap — between what the AI produced and what you know from living — is not a flaw in you. That's the thing about you that has value.
What you just found is judgment. Contextual knowledge. The kind of understanding that doesn't exist in any dataset because it comes from being a person in a specific place with specific experience.
The AI can produce output on any topic. It cannot know whether that output actually works in the real world. You can. That's not a small thing. That's the whole thing.
When 758 consultants used AI in a Harvard Business School study, quality jumped 40% on tasks AI handles well — but performance dropped 19 percentage points on tasks requiring the kind of judgment you just demonstrated. The tool can't tell you which type of task you're facing. Dell’Acqua et al, HBS 2023
Organizations are realizing there's a growing gap between the high-stakes judgment they need and the pathways to develop it. The gap you just found? That's the one companies are scrambling to fill. Valence, 2026
Now push back. Tell the AI what it got wrong. Give it your context, your constraints, the reality it's missing. Make it try again.
Watch what happens. Does it get closer to your reality? Or does it just generate a more confident version of the same shallow take? Your ability to tell the difference is the skill.
Two things, and both matter.
First: to push back, you had to explain what you know and why it matters. You had to put language around expertise that usually just lives in your head. That’s not a side effect of this exercise. It’s the point. The better you can articulate what you carry, the clearer you see yourself — and the harder it is for anyone or anything to define your value for you.
Second: you can tell the difference between a genuine improvement and a polished version of the same miss. In every organization, every school, every government office, someone is going to trust AI output that shouldn’t be trusted. That ability to see through it has real economic value — but it starts with knowing your own territory well enough to recognize when something’s off.
How many people in your workplace or school would accept AI output at face value on this same topic? What would they miss?
Around 80% of the U.S. workforce could see at least 10% of their tasks reshaped by AI. That's not a prediction — it's an estimate from OpenAI and University of Pennsylvania researchers. The real question: will people learn to direct AI, or just defer to it? Eloundou et al, Science 2024
Now a different kind of question. This one isn’t about what you know — it’s about how you came to know it.
How did you actually learn the thing you know? Not “I studied it.” The real way. The failures, the mentors, the moments where something clicked because you’d been in the situation enough times to finally see what was happening.
Most of the knowledge that actually matters was built through exposure, practice, consequence, and time. (Psychologist K. Anders Ericsson spent decades studying how expertise forms. His key finding: it takes structured experience — deliberate practice — not just time. You can't shortcut your way to judgment.) There’s no shortcut.
Tell the AI your story. How did the knowledge get into you?
Exposure
↓
Practice
↓
Failure
↓
Judgment
If AI removes the first three steps, the last one never forms.
Entry-level job postings have dropped ~35% since January 2023. The pipeline that built your expertise — the junior work, the first assignments, the learning-by-doing — is narrowing for those coming after you. CNBC, 2025
You just traced the pipeline that built your judgment. Exposure, practice, failure, time. There’s no shortcut and no dataset that contains it.
Here’s why this matters: that pipeline — the process by which people develop judgment through experience — is exactly what’s being disrupted. (If junior work is how people become seniors, and AI absorbs the junior work — where do the next seniors come from? This isn’t hypothetical. It’s already happening in writing, design, and code.) When AI handles the entry-level work, the junior tasks, the first drafts — it also removes the experience that used to build expertise.
Is the path that built your knowledge still available to the next person? Or is it being closed off?
At companies adopting AI, early-career headcount fell 7.7% in just six quarters. Meanwhile, employment for 22–25 year-olds in AI-exposed jobs dropped 13% compared to peers since 2022. The on-ramp is shrinking fast. Fortune, 2025 | Brynjolfsson et al, Stanford 2025
The world is being reorganized. You can either let that happen around you or you can see it for what it is and move.
Time to flip it. Everything up to now, you’ve been evaluating what the AI produces. Now you’re going to direct it.
Think of something you actually need. Not a test — something real. A plan, a document, a solution to a real problem.
Don’t ask. Direct. Tell the AI what you need, give it the context and constraints, and set the standard. If it misses, tell it what’s wrong and make it try again. Iterate until it meets your standard. You’re the one who knows what should exist. Make it build what you see.
Notice the difference. In Level 1, you tested. In Level 2, you pushed back. Just now, you led. You had a vision of what should exist, and you used a powerful tool to build it.
That’s the skill. Not prompting. Not “using AI.” Knowing what should exist and making it real — whether that means directing a tool, or deciding the tool isn’t what this moment needs and doing it yourself. Researchers call this directive capacity: the ability to specify what should exist and hold the output to a standard. It’s the skill that scales with AI, not against it — the better the tool, the more it matters. Sometimes the most powerful move is doubling down on the human thing: the conversation, the intuition, the presence that no tool replicates.
Either way, you’re the one who decided what mattered. That’s not a skill AI gave you. That’s who you are. As one researcher put it: “The calculator fills a computational gap; AI fills a cognitive gap. The former supports learning; the latter can replace it entirely.” (Chris Hood, 2025.) Directive capacity is how you keep the cognitive work yours.
One more thing, and it’s the most important thing.
When a new technology makes work faster, someone captures that value. It might be you — same pay, less effort. It might be your employer — more output, same payroll. It might be an investor. It might be a customer.
This is not a technology question. This is the oldest question in economics, and the answer has always depended on who has power and who doesn’t. (Every productivity revolution — the loom, the assembly line, the computer — created enormous value. The fight was always about who captured it. AI is the same fight, moving faster.)
The knowledge you carry — the thing you’ve been exploring through this whole experience — is real and it has value. But value and compensation are not the same thing. (A nurse’s judgment at 3am is invaluable. Their paycheck doesn’t reflect that. AI doesn’t fix this gap — it widens it, unless the people with the knowledge also have the leverage.)
Let’s think this through together. In your world — who captures the value when things get more efficient? And what does that mean for you?
From 1979 to 2019, productivity grew 1.36%/year while median pay grew just 0.38%/year. Workers got more productive for 40 years running — and barely saw a raise. That's the pattern AI is about to accelerate. Economic Policy Institute
“There is nothing automatic about new technologies bringing widespread prosperity.” Nobel laureate Daron Acemoglu and Simon Johnson traced 40+ years of automation and found it consistently raised corporate profits without sharing the gains. AI is next in line. Power and Progress
McKinsey estimates generative AI could add $4.4 trillion per year to global corporate profits. The value is being created. The question is who gets it. McKinsey, 2023
If your employer adopted AI tomorrow and your output doubled — would your pay double? Who would capture the difference?
That conversation? That’s what AI is actually for. Not replacing your judgment — extending it. Helping you think through real problems that matter to your life.
You just used AI as a thinking partner to work through a systemic question about your own situation. That’s not a parlor trick. That’s the skill.
So here’s where you are.
You know something AI doesn’t. You’ve tested that — not as a theory, but right here, live, with the tool in front of you. You’ve felt the gap between what it generates and what you know.
You’ve pushed back on it, traced the path that built your knowledge, directed it to make something real, and used it to think through a question that matters to your life.
And you’ve looked at the economic reality: that the value of what you know and the compensation you receive for it are two different things, connected by systems you can either understand or be subject to.
What you just did wasn’t an AI tutorial.
You found the foundation for every AI skill that actually matters: knowing what should exist, recognizing when something’s off, directing a powerful tool toward something real, and thinking through systemic questions with genuine honesty.
The difference between using AI and directing AI is the difference between asking and knowing what to ask for. You just practiced the second one. And you're ahead of the curve — most institutions haven't caught up yet. (56% of higher education leaders say their institutions are unprepared for AI-driven workforce changes. 59% believe last spring's graduates were unprepared. That's why what you just did matters. AAC&U / Elon, 2025.)
Before you go
What do you see now that you didn’t see before?
Not about AI. About yourself — what you carry, what it’s worth, and how the world around you is or isn’t recognizing it.
And one more thing
What do you need now?
Not what should you learn. What do you actually need? More access to tools? Help with a specific situation? A community? Something we haven’t thought of?
Stay connected
Want to hear when the next door opens?
We send about one email per month — when a new door is ready. That’s it.