Leadership in the Digital Age: AI, Human Agency, and What Your People Need
Leadership today takes place in conditions that are faster and more mentally demanding than in the past. Communication is constant, work is increasingly digital, and decisions are often made in public channels with little time to pause or reflect.
Many conversations about digital leadership stay at the superficial, practical level: inbox management, Zoom fatigue, hybrid-working protocols. The deeper questions concern what happens to human agency, motivation, and meaning when the conditions of work are being reshaped by AI, automation, and surveillance — often faster than leaders can think clearly about it.
As a cyberpsychologist, I've spent two decades studying how people and their technologies shape each other. More recently, as work in so many sectors (including my own) changes rapidly and radically, I've turned my attention to what that means for agency, motivation, and the felt experience of work. What happens to the felt experience of work when you're always on, always visible, always monitored? What happens to your people's desire to show up when AI is framed as the answer to everything, and human contribution starts to feel like an afterthought? We should see these as the leadership questions they are.
Key Takeaways
Leadership in the digital age is shaped by the rise of AI, employee surveillance, and a relentless cult of productivity — forces that can erode the conditions people need to thrive.
Digital pressure pushes leaders toward reactive decision-making. But AI adoption pressures can push entire organisations toward something worse: the slow death of human desire and agency at work.
What people need from work — mastery, autonomy, connection, meaning — doesn't change because the tools do. Leaders who forget this risk creating bystanders rather than contributors.
Personality, life stage, organisational position, and beliefs about technology all influence how people experience digital change. The cheerful AI-empowerment story is told from the top; the view from further down looks different.
The dominant AI narrative focuses on what humans can do. Effective digital leadership requires attending to what humans need.
Resistance to technology isn't always a problem to be solved. Sometimes it's the most values-aligned response available.
As you read, keep in mind that I speak at conferences, leadership events, and corporate retreats in the UK, Europe, and internationally on these themes. Drawing on the Agentic Organizations work launched at Davos 2026 and two decades of expertise in cyberpsychology, my talks challenge leaders to question inevitability narratives and explore what it means to preserve human agency, desire, and meaning in an age of increasing automation. Explore my AI at Work speaking offering here, or get in touch.
More of my work on AI, digital wellbeing, and the psychology of work:
Agentic AI and Human Agency: A Cyberpsychologist's Perspective
A Cyberpsychologist's Perspective on Agentic AI — in Fact, a Rebuttal — my essay for the House of Beautiful Business on agency, AI adoption, and why questioning the dominant narrative matters
Reset: Rethinking Your Digital World for a Happier Life — my book on navigating technology with agency and awareness, including the psychology of midlife and technological change
What Makes Leadership Harder in the Digital Age
Leadership has always involved pressure, uncertainty, and responsibility. What has changed is the environment in which those demands are experienced, and the speed at which that environment is shifting.
Always-on communication and the shrinking of thought
Digital work rarely allows for uninterrupted thinking. Messages arrive across multiple platforms throughout the day, often with an implicit expectation of quick response. Even when leaders are not actively replying, their attention is pulled toward what might need responding to next.
This constant interruption makes it harder to think deeply, hold complexity, or reflect before acting. Over time, leadership drifts from judgement toward managing incoming demands. Digital systems reward speed: quick replies are visible, silence is noticeable, and delay is often interpreted as disengagement. Reactive leadership narrows attention and reduces agency. Deliberate leadership requires the ability to pause, reflect, and choose a response aligned with values and intention. When that pause disappears, leadership becomes driven by urgency rather than judgement.
Remote work and the loss of informal signals
Remote and hybrid working environments reduce the informal moments that once helped leaders sense how things were going. Casual conversations, shared physical space, and spontaneous feedback are harder to replicate digitally. Without these cues, leaders have less information to work with and more uncertainty to manage.
A subtler loss runs beneath this. The people who struggled most during the pandemic's sudden shift to remote work weren't necessarily those who lacked technical skills. They were often those whose personality and beliefs about technology — particularly what I and other researchers have called PROI, the perceived reality of online interaction — led them to assume that digitally mediated contact would be inherently inferior. If you believe that a conversation through a screen isn't a real conversation, you'll struggle to lead through one, and you'll struggle to feel led through one. High-PROI people focus on how they can use technology to get their needs met. Low-PROI people assume the technology will condemn them to an inferior experience. Neither position is inherently right or wrong — but both can be unreflective. The enthusiastic adopter who never questions what's lost in the shift is exercising no more agency than the resistor who refuses to examine their assumptions.
Leaders in the digital age need to understand that their teams contain people all along this spectrum — and that these attitudes aren't generational. The idea that older people resist technology and younger people embrace it has largely fallen apart. Uptake of technology and attitudes about it vary within every age group and are better predictors than age of how someone responds to digital change.
The cult of efficiency, productivity, and optimisation
We're leading inside a culture obsessed with efficiency, productivity, and optimisation.
When I checked Google Trends while writing my essay for House of Beautiful Business on agentic AI, I found that search interest in 'efficiency,' 'productivity,' and 'optimisation' had hit simultaneous 20-year historic highs. AI is constantly promoted as the way to achieve these now-mandatory values. The drumbeat makes resistance feel not just futile but stupid.
For leaders, this creates a particular bind. You're expected to adopt AI, integrate it fast, and go as big as you can with it. The dominant narrative contains three themes that amount to coercion: inevitability (AI is unstoppable), survival (adopt or be eliminated), and moral imperative (don't let down your shareholders, your customers, your team). Anxiety-driven decision-making — whether it drives you to adopt without thinking or resist without examining — isn't meaningful human agency. It's reactivity dressed up as strategy.
How AI Changes the Psychological Landscape of Work
The death of desire at work
The pitch is familiar: AI can cover the grunt work, freeing humans to innovate and strategise while boosting productivity, efficiency, and the bottom line. Plausible enough, but it omits something central to our humanity.
Motivation to work requires something almost entirely absent from the AI-adoption narrative: desire. For each of us, the desire profile is different. We might crave passion, mastery, recognition, connection, involvement, stimulation, or job security. We feel driven to fulfil core psychological needs like attachment, autonomy, and play.
Therapists sometimes use behavioural activation as an intervention for depression. A depressed client schedules two kinds of activities: things that give them pleasure, and things that give them a sense of mastery. We need both to thrive. Sometimes their absence is what causes a person to despair in the first place. Work, at its best, provides both: the satisfaction of seeing something through, the growth that comes from struggle, the pride of having achieved an outcome, perhaps with great difficulty along the way.
Agentic AI risks automating away exactly the parts of work that matter most to human wellbeing, including what I'd call meaningful friction: the challenges and difficulties that build competence, forge identity, and make us feel engaged. When AI takes over the 'doing', when workers feel less involvement in processes and less ownership of outcomes, their stance shifts, both cognitively and emotionally. They may be watching, but they're not participating in the same way. They may be reviewing or monitoring, but they're not creating.
The bystander effect in AI-augmented teams
Every psychology undergraduate learns about the bystander effect: when responsibility is diffused or unclear, or taken away from people, they disengage. They notice less, care less, act less. What's the point in paying attention if something doesn't seem to have much to do with you?
When agency is assigned to AI, and when we reify that contribution and that type of intelligence as superior to that of humans, we risk turning human workers into bystanders in their own jobs, deferent to the machine. Leaders need to understand this dynamic, because it will not show up as resistance or complaint. It will show up as quiet disengagement, a slow fading of initiative, and a workforce that's technically present but psychologically checked out. And leaders who are themselves deferring to AI — using it to draft communications, make decisions, or set strategy without sufficient human reflection — may be modelling exactly the bystander behaviour they should be guarding against.
Surveillance and the erosion of trust
In the wake of the pandemic, as remote work became widespread, many employers turned to digital surveillance to keep tabs on their people. Eight out of ten private companies in the US now use technology to capture productivity metrics, frequently in real time.
The results have been counterproductive in ways that should concern any leader. Research reported in Harvard Business Review found that monitored employees were more likely to ignore instructions, damage property at work, pilfer office equipment, sneak unauthorised breaks, and work slowly on purpose. In a designed experiment, employees informed that their work would be monitored were more likely to cheat when given the opportunity. The researchers concluded that surveillance causes people to subconsciously feel less responsible for their own conduct.
Being monitored is, essentially, not being trusted. And not being trusted erodes exactly the sense of agency and accountability that good work depends on. This is particularly corrosive for people in vocational or creative roles. I wrote in Reset about hospice chaplains in Minnesota who were subjected to productivity tracking that assigned point values to their work with dying people: a visit to a dying individual garnered a point, while a phone call to a grieving family was worth a quarter of a point. Whatever points it cost them, their consciences wouldn't let them leave the side of a dying person who needed them. In the end, they quit. If that's what their profession was evolving into, they couldn't evolve with it.
Surveillance works only if the work it's surveilling is performed on a digital device. If there aren't keystrokes to count, mouse movements to tabulate, or on-screen presence to measure, it doesn't count as work. For leaders, this reveals a fundamental philosophical question: are your people a bunch of clicks, or are they human beings with a calling?
What Leaders Need to Understand About People and Technological Change
Personality isn't destiny, but it's not nothing
Where you sit on the Openness to Experience dimension of personality significantly affects your response to change. During the pandemic lockdowns, I watched this play out vividly. Many of my therapy clients pivoted to online sessions with curiosity and then gratitude about how well it could work. Others refused help during lockdown because it was being delivered online, opting to wait for the 'return to normal' despite being in distress. Some of my fellow psychologists resisted moving their practices online for similar reasons, insisting that the technology would prevent 'real relating' — despite research demonstrating exactly the opposite.
Personality is maximally malleable in childhood, but it becomes more stable as we move through our twenties and thirties. During midlife, personality reaches 'peak stability', particularly Openness to Experience. If you enter your forties as someone who's typically cautious about new things, possessed of a traditional mindset and more comfortable with the familiar, those features become yet more fixed in your middle years.
For leaders, this means that when you're asking a team to adopt new technology — whether it's AI tools, new communication platforms, or new ways of working — you're asking people to do something that is psychologically easier for some than for others, and this has nothing to do with capability or intelligence.
Midlife, work, and the crisis we weren't prepared for
The influential lifecycle theorist Erik Erikson said that the challenge of middle adulthood is 'Generativity vs Stagnation.' But he built his theory during the 1950s, when 'jobs for life' were still a thing. The career counsellors of yesterday did not prepare the workers of today for the gig economy, multi-hyphenate lifestyles, or technological innovations so dizzying that, in the prime of their working years, huge numbers of them would need to upskill, pivot, or reinvent their careers at the developmental stage when people tend to be least psychologically flexible.
At this moment, the whole world of work is having its own midlife crisis. For someone going through their forties, fifties, or early sixties at this point in history, these wider-scale disruptions can either stoke excitement or feel like double trouble. Forget generational labels and stereotypes. This is developmental psychology colliding with a technological revolution no career counsellor could have anticipated.
Where you sit shapes what you see
Personality and life stage shape how people respond to change. But so does something more structural: where you are in the hierarchy.
The Agentic Organizations report I contributed to — published in January 2026 by Hotwire, ROI·DNA, and the House of Beautiful Business, based on a survey of 900 professionals across the US, Europe, and Singapore — exposed a gap that should give any leader pause. Three quarters of senior leaders reported feeling more empowered by AI. Among specialists and junior team members, fewer than half said the same. On creativity, the disparity was sharper still: most executives felt AI was making them more creative, while only about a third of junior employees agreed.
Psychologically, the pattern makes sense. Senior leaders use AI as a thinking partner — to explore ideas, pull together strategy, test scenarios. It extends their reach. Early-career employees are more likely to watch AI absorb the tasks they were hired to do: the execution, the delivery, the work through which they were supposed to build competence and prove themselves. Same technology. Different psychological experience, depending on whether it's amplifying your existing power or consuming the ground you were standing on.
The report also found that more than half of respondents believed AI could handle most or all of their job within five years. Executives felt this most acutely — nearly seven in ten — but for them the prospect felt energising, even freeing. For people lower down, the identical belief carried a different emotional charge. And for the youngest workers in roles most exposed to automation, the data is already concrete: employment among 22- to 25-year-olds in AI-adjacent fields has declined, particularly in tech support, administration, customer service, and retail.
The cheerful AI-adoption narrative cracks under this weight. The story that AI empowers everyone — that it liberates people to do more meaningful work — is told from a position of security, by people whose agency was already substantial enough to share. For those who never had much organisational power to begin with, 'shared agency' can feel less like collaboration and more like erasure. The view from the C-suite and the view from the entry-level desk are not the same view. The most anxious people in your organisation aren't catastrophising. Junior employees don't have the security that seniority provides, and AI is one more reason why.
Resistance isn't always the problem
Most digital leadership advice assumes that resistance to technology is a barrier to be overcome. Sometimes it's the clearest signal of what someone values.
The hospice chaplains who quit rather than reduce sacred work to productivity metrics weren't being inflexible. My father, a doctor who retired rather than watch medicine become reduced to codes and transactions, wasn't being a Luddite. Whether you're a low-PROI neo-Luddite or a high-PROI technophile, there's no right or wrong in embracing or resisting technological innovation. What counts is what's workable and values-aligned for you, and how honest you're being with yourself about both of those things. When something doesn't feel right about your relationship with technology at work, maybe resistance makes the most sense.
Leaders who pathologise all resistance as a problem to be managed will miss the signal in the noise: sometimes the people pushing back are the ones who can still see what's being lost.
What Effective Digital Leadership Looks Like
Protect desire
Effective leaders in the digital age need to protect their people's desire to work. That means ensuring that AI adoption doesn't automate away the meaningful friction — the mastery, the struggle, the creative satisfaction — that makes people want to show up.
Lead with felt accountability, not surveillance
A computer can never be held accountable. That insight from a 1979 IBM training manual hasn't aged. Felt accountability — care, investment, personal stake — is uniquely human. It's what makes someone stay late or work hard because they care about the outcome and feel truly responsible for and trusted with it, not because a productivity tracker is counting their keystrokes. Leaders who rely on digital surveillance erode the very accountability they're trying to enforce. Leaders who model felt accountability — who are visibly invested, who care about outcomes for human reasons, who take responsibility when things go wrong — build cultures where people don't need to be monitored.
Question the inevitability narrative
The most important leadership skill in the digital age might be the willingness to slow down when everyone else is speeding up. The AI-acceleration message is relentless: adopt now, integrate fast, go big. But as I found at Davos in January 2026, where slogans about AI were writ large on every corporate installation, once AI systems are adopted and integrated, they are extraordinarily difficult to unwind. How can you retrofit human-centred ethical frameworks onto what's already been embedded?
The time to consider what you want to preserve about human work — and the things that make it human — is before ubiquitous AI adoption, not after. Organisations that dare to slow down, reflect, and question are choosing conscious agency over reactive adaptation. Slow leadership of this kind reflects wisdom and human-centredness, not timidity or lagging behind the times.
Model what you want to see
Leaders shape norms through what they tolerate and model. If you're always reacting, always available, always deferring to what the AI suggests, your team will mirror that. If you demonstrate boundaries, prioritisation, and the willingness to sit with uncertainty rather than optimise it away, you signal that sustained attention and thoughtful work are valued.
Questions worth sitting with
When I wrote about agentic AI and human agency for the House of Beautiful Business, I developed a set of reflective questions for leaders navigating AI adoption. They're worth asking before, during, and after any significant technology decision:
When you talk about AI adoption internally, whose agenda are you communicating — and whose interests are being centred? Whose are being ignored?
Are you implementing technological solutions for things you never considered problems until recently?
What success metrics beyond efficiency, productivity, and growth are you courageous enough to allow to matter?
If you weren't worried about keeping up with the competition, would you be embracing these tools in the same way, to the same extent?
How can you make space for the kind of slow, deliberate thinking that weighs less visible, longer-term costs and consequences?
What This Means for Leaders Today
Leadership in the digital age comes with pressures that are easy to underestimate. The most consequential of them: whether you're leading in a way that preserves the conditions your people need to be engaged, motivated, and human.
Even in the most uncertain or difficult moments, there is always a greater range of internal and external responses available than we imagine. Whether you're naturally a cautious neo-Luddite or an eager technophile, you always have more freedom to respond than might initially appear. You can adopt thoughtfully. You can refuse strategically. You can insist on conditions. You can demand evidence before committing. The one response that doesn't serve anyone is passive compliance driven by fear or hype.
But equally, when something about the way technology is reshaping work doesn't feel right — when it feels like it's eroding meaning, agency, or humanity — the freedom to resist, to question, to slow down, is also yours. Embracing that freedom is wise, courageous, and perhaps even rebellious leadership, and we need it now.
Frequently Asked Questions About Leadership in the Digital Age
Is digital leadership mainly a generational issue?
No. The idea that older people resist technology and younger people embrace it has largely fallen apart. Research shows that uptake of technology and beliefs about it — including what's known as PROI, the perceived reality of online interaction — vary within every age group and are better predictors than age of how someone navigates digital change. Leaders of all generations can struggle with the psychological demands of always-on, AI-augmented work.
Why do capable leaders feel more reactive in digital workplaces?
Digital environments reduce opportunities for pause and reflection. When leaders are constantly interrupted and expected to respond quickly, thinking becomes compressed. But beyond this, the dominant AI-adoption narrative creates additional pressure: the sense that you must adopt, integrate, and optimise at speed or be left behind. This compounds the reactivity. The antidote isn't better time management — it's the willingness to question whether speed is serving your organisation's deeper interests.
What's the biggest thing organisations underestimate about digital leadership?
Many organisations focus on tools, efficiency, and communication norms while underestimating the psychological impact of sustained digital pressure and AI adoption on their people. The biggest blind spot is often desire: what makes people want to work, and what happens to motivation, mastery, and meaning when technology automates away the parts of work that people find most satisfying. Attention, decision fatigue, and the emotional tone set by leadership behaviour all play a role — but the erosion of human agency and felt accountability is the deeper threat.
Should leaders always encourage their teams to embrace new technology?
Not always. While curiosity and openness to experience are generally adaptive, blanket enthusiasm for every new tool can be its own form of reactivity. Sometimes the people who resist a new technology are the ones who can see most clearly what's being lost. Effective leadership means creating the conditions for honest conversation about what serves your people and your values — not assuming that adoption is always progress.
How does AI adoption affect employee motivation and engagement?
When AI takes over tasks that previously gave people a sense of mastery, ownership, and creative satisfaction, people disengage — they no longer feel meaningfully involved in the process or outcome. Leaders need to ensure that AI augments human capability without stripping out the challenge, struggle, and pride of accomplishment that keep people psychologically invested in their work.
Does AI affect senior and junior employees differently?
Yes — and the gap is significant. Research from the Agentic Organizations report (2026) found that senior leaders were far more likely to feel empowered and creatively enhanced by AI, while junior employees reported more anxiety and less sense of creative benefit. Senior people tend to use AI for strategic thinking; junior people are more likely to see it absorb the tasks through which they build competence. Leaders who assume AI is experienced the same way at every level of the organisation are missing a crucial part of the picture.