The critical shift for leaders, therefore, is not simply to ask, "What do we want AI to do?" but to pose a deeper question: "What do we need AI to be?"
Every organisation deploying AI is making a choice, consciously or not, about the role it will play in its culture, identity, and operations. The leadership imperative is clear: do leaders have the capability and character to preserve the human elements that give their organisation its soul?
NASA's experience with AI-designed spacecraft components illustrates this balance.
Its algorithms generated parts ten times more reliable than human-designed equivalents, but only because engineers maintained constraints and validated outputs. As NASA noted, without human oversight, algorithms can sometimes make structures too thin. This is AI acting as a constructive creative disruptor: innovation guided by wisdom, not chaos masquerading as progress.
When AI drifts into shadow
Consider an organisation that deploys AI for performance management — seemingly a straightforward desire for efficiency. The system begins by automating routine evaluations, freeing up managers for higher-value work.
But without conscious oversight, it drifts, becoming an autonomous decision-maker that disciplines workers with no human review, while simultaneously monitoring every interaction under the guise of "supporting" employee engagement.
Employees feel under surveillance rather than supported. Trust erodes. Innovation stalls. The AI hasn't malfunctioned — it's operating exactly as designed. But its role has shifted from servant to overseer, and the organisation's core — its soul — has been diminished.
This shift doesn’t happen overnight. It emerges through small acts of delegation, through over- or under-confidence, and through unexamined convenience.
The pattern repeats across industries. An AI communications tool becomes a generator of polished but hollow messaging. A process analysis system meant to surface insights instead begins encoding organisational dysfunction. An architectural design assistant creates dependency, eroding the very skills it was meant to augment.
Most organisations don't fail at AI because of technical problems. They fail because AI drifts into shadow roles without anyone noticing until significant damage is done.