The Age of Copiloting: Why the Workforce Upskilling Conversation Is Missing the Point
General Eric Shinseki, former Chief of Staff of the United States Army, once issued a warning that has only grown more urgent with time:
“If you don’t like change, you’re going to like irrelevance even less.”
He said that about transforming the U.S. military. But he may as well have been speaking directly to organizational leaders navigating the most disruptive technological shift in a generation.
Artificial intelligence is no longer arriving. It is here, embedded in workflows, reshaping job descriptions, and quietly reordering who adds value and how. According to the World Economic Forum’s Future of Jobs Report 2020, 85 million jobs may be displaced by AI and automation by 2025, while 97 million new roles emerge that require collaboration between humans and intelligent systems. That is not a future-state forecast. That is a transition already in motion.
And yet, the dominant organizational response to this moment is alarmingly narrow.
The Upskilling Conversation We’re Having
Walk into any C-suite discussion about workforce readiness, and you will hear some version of the same three priorities: prompt engineering, tool familiarity, and platform literacy. Organizations are deploying AI workshops, licensing software training platforms, and measuring adoption rates. These investments are not wrong; they are simply insufficient.
The assumption embedded in this approach is that AI readiness is primarily a technical challenge. Teach people the tools, the thinking goes, and they will adapt. Efficiency will follow. Competitive advantage will follow.
But this assumption misreads where human value actually lives in an AI-augmented workforce. That misreading is becoming a strategic liability.
The Copilot Imperative
The most useful reframe for this moment is not “humans vs. AI.” It is humans with AI, a relationship best understood through the metaphor of copiloting.
In aviation, copiloting is not a passive role. The copilot does not simply monitor while the aircraft flies itself. The copilot interprets information, exercises judgment, communicates under pressure, manages contingencies the autopilot was not designed to anticipate, and keeps the crew and passengers focused on the mission. The copilot brings something the instrument panel cannot: situational awareness, adaptive reasoning, and the irreplaceable ability to read a room.
The workforce of the next decade will be organized around this same dynamic. AI will handle analysis, synthesis, pattern recognition, and scaled execution at speeds no human can match. But the humans who thrive, and the organizations they serve, will be those who understand what AI genuinely cannot do and who have invested deliberately in those capacities.
The question organizational leaders must answer is not how do we get our people to use AI? It is how do we develop human capabilities that make AI use transformative rather than merely efficient?
The answer requires understanding two distinct but interdependent categories of human capability: power skills and generative skills. Most organizations are underinvesting in both.
Power Skills: The Human Side of Human-AI Integration
Power skills are the interpersonal and emotional capabilities that govern how humans relate to, influence, and lead one another. They are not soft. They are not secondary. In an AI-augmented workforce, they are load-bearing.
AI can simulate the language of empathy. It can mirror an encouraging tone. But it cannot build genuine trust, navigate the emotional undercurrents of a high-stakes conversation, or inspire a team to move through uncertainty toward a vision they believe in. Those capacities are irreducibly human, and their organizational value intensifies as AI handles more of the cognitive heavy lifting.
The most critical power skills for effective copiloting include:
Emotional intelligence: the capacity to understand and manage one’s own emotions while reading and influencing the emotions of others. This is the argument at the center of Forging Emotionally Intelligent Leaders in the Age of AI: emotional intelligence is not a supplement to strong leadership; it is the foundation of it. Daniel Goleman’s research, published in the Harvard Business Review, found that emotional intelligence accounted for nearly 90% of the difference between star performers and their peers in senior leadership roles, a gap that only widens as AI absorbs more of the purely technical work.
Trust-building and relational leadership: the ability to create the psychological safety that makes people willing to take risks, voice dissent, and commit to change. No algorithm can earn that. It is built through presence, consistency, and genuine human regard.
Change mobilization: the ability to move people. To build coalitions. To translate a strategic vision into a shared sense of purpose that motivates action even in the face of disruption. Organizational transformation powered by AI still depends entirely on humans willing to follow a leader into unfamiliar territory.
Organizations that treat these capabilities as personality traits rather than developable skills will find themselves with powerful AI tools and deeply limited human infrastructure to deploy them effectively.
Generative Skills: The Creative Edge AI Cannot Replicate
If power skills define how humans lead, generative skills define how humans think. They are the cognitive abilities that allow humans to originate ideas, challenge assumptions, and produce novel solutions that did not previously exist.
This is where the distinction matters most and where the dominant upskilling narrative falls short most sharply. AI can analyze. AI can predict. AI can synthesize information at extraordinary scale. But AI does not define the problem worth solving. It does not challenge the premise of the question. It does not imagine an alternative that has never been tried.
Those activities are irreducibly human, and they are precisely what separate organizations that use AI from organizations that are transformed by it.
The generative skills most essential to effective copiloting include:
Contextual judgment: the ability to read a situation with nuance, distinguish what the data says from what it means, and make decisions that account for factors no algorithm was trained to weigh. McKinsey’s research on AI adoption consistently finds that the organizations capturing the most value from AI are those with leaders who can frame the right questions, not just interpret the right answers.
Ethical reasoning: the ability to interrogate not just what AI can do, but what it should do, and why. As organizations deploy AI in hiring, customer service, risk assessment, and operations, the humans who can ask and answer hard moral questions about bias, fairness, accountability, and consequences will carry disproportionate weight.
Creative synthesis: the capacity to connect disparate ideas across domains, generate genuinely novel approaches, and envision futures that do not yet exist. Contrary to popular anxiety, AI does not make human creativity obsolete. It raises the ceiling for what creative humans can accomplish, but only for those who show up to the collaboration with something original to contribute.
Problem framing and assumption challenging: the ability to interrogate the question before answering it. AI optimizes within the frame it is given. The humans who define that frame and have the intellectual courage to challenge it are the ones who determine whether AI is being applied to the right problems in the first place.
What This Demands of Organizational Leaders
Power skills and generative skills are distinct but not independent. The most effective copilots in the age of AI will be those who can originate a bold idea, stress-test the assumptions underneath it, and then mobilize people around the change required to realize it. That is a complete human being — emotionally grounded, cognitively generative, and deeply purposeful. Developing that kind of talent is a leadership imperative.
And yet most organizations today are measuring and rewarding the wrong things. They track AI tool adoption rates. They celebrate productivity gains. They benchmark against competitors on technical capability. What they are not systematically doing is developing, assessing, or rewarding the human capacities that determine whether any of those technical investments actually produce transformative outcomes.
The result is a dangerous imbalance: organizations becoming more efficient at executing AI-generated outputs while progressively hollowing out the human judgment required to direct, challenge, and apply those outputs wisely.
Deloitte’s 2024 Global Human Capital Trends report found that while 79% of executives view AI as critical to their organization’s future, fewer than 30% report having a clear strategy for developing the human skills needed to complement it. That is not a technology gap. That is a leadership gap.
Closing it requires three commitments:
Redefine what upskilling means. Technical fluency is a floor, not a ceiling. Organizations must build learning ecosystems that develop power skills and generative skills alongside platform proficiency, and in many cases, ahead of it.
Redesign how work is structured. If every workflow is optimized purely for speed and output, both power skills and generative skills atrophy. Leaders must create the conditions that allow human judgment and human connection to be exercised, stretched, and developed: protected time, cross-functional collaboration, and deliberate friction.
Rethink how performance is measured. What gets measured gets developed. If organizations continue to evaluate performance primarily on efficiency metrics, they will systematically underinvest in the human capabilities that create durable competitive advantage. New evaluation frameworks must make room for contribution quality, creative problem-solving, and leadership impact.
The Investment We Cannot Afford to Delay
The pace of AI adoption across industries is not slowing. According to Stanford’s AI Index Report 2024, AI patent filings have increased by more than 1,600% over the past decade, and enterprise AI investment surpassed $90 billion globally in 2023 alone. IDC projects that figure will more than triple to $307 billion in 2025 and reach $632 billion by 2028. Gartner forecasts that 40% of enterprise applications will be integrated with AI agents by the end of 2026, up from less than 5% today. The infrastructure is being built at a scale and speed that leaves little room for organizational hesitation.
When Block announced the layoff of nearly half its workforce in February 2026, CEO Jack Dorsey was unambiguous about the cause. “Intelligence tools have changed what it means to build and run a company,” he wrote to shareholders. “A significantly smaller team using the tools can do more and do it better. I don’t think we’re early to this realization. I think most companies are late.” Dorsey is right that the efficiency imperative is real. What his statement leaves unanswered, and what organizational leaders cannot afford to leave unanswered, is what happens to the quality of judgment, creativity, and human leadership inside those smaller, faster teams.
The organizations that delay investment in human capacity while their competitors develop it will find themselves with highly automated operations and profoundly limited ability to direct them well.
This is the central paradox of the Age of Copiloting: the more capable AI becomes, the more valuable human power skills and generative skills become. And yet those skills do not emerge on their own. They are built. They are cultivated. They require intentional leadership and sustained institutional commitment.
General Shinseki’s warning about irrelevance was directed at military leaders resistant to transformation. But the underlying truth is universal. In a world being rapidly reshaped by artificial intelligence, relevance is not a function of adopting the right tools. It is a function of developing the right people.
The organizations that will define the next decade are not those with the most sophisticated AI implementations. They are those guided by the wisest humans.
The copilot seat is open. The question is whether your people and your culture are ready to take it.