AmplifyFE community space

Balancing AI and humanity: How technology can empower staff, not diminish them

By Simon Takel, Exeter College

The rapid emergence of generative AI in education has been anything but subtle. As Patel
(2023) observes, “AI has transitioned from an obscure niche field to a mainstream
phenomenon. The discourse around generative AI has expanded explosively, with new and
remarkable solutions unveiled daily.” For many educators, however, this pace of change has
not felt empowering. Instead, it has introduced a new layer of uncertainty: Am I allowed to
use this? Am I using it properly? Will this make my role less valuable?


In my experience, these questions matter far more than the tools themselves. AI does not
empower staff by default. It can just as easily amplify anxiety, erode confidence, or reinforce
existing pressures if it is introduced without care. The real question for the FE sector is not
whether we adopt AI, but how we do so in ways that restore agency, protect professional
judgement and create space for the deeply human work that education depends on.


Empowerment is about agency, not efficiency alone


Much of the current discussion around AI in education focuses on productivity: faster
planning, automated feedback, streamlined administration. These are real benefits, particularly in a sector under sustained workload pressure, and they should not be dismissed. But efficiency on its own does not equate to empowerment.


For staff, empowerment is more closely linked to agency: feeling trusted to make decisions,
confident to experiment, and supported rather than monitored. It is also shaped by emotional
intelligence: our capacity to understand our own responses to change, empathise with
others, and navigate uncertainty together. As Ray (2018) reminds us, “as more and more
artificial intelligence is entering into the world, more and more emotional intelligence must
enter into leadership.”


AI can support this balance, but only if it is framed as a tool that enhances professional
judgement, not one that replaces it. Used well, AI can reduce low-value cognitive load and
free up time for relational work: mentoring learners, collaborating with colleagues, reflecting
on practice. Used poorly, it risks becoming another performative demand layered onto
already stretched roles.


Where AI can genuinely empower staff


In practice, I have found AI most empowering when it helps staff reduce friction rather than dictate practice. This includes supporting idea generation, structuring planning,
suggesting alternative approaches, or helping educators think through their challenges more
efficiently. In these moments, AI acts as a thinking partner rather than a decision-maker.
Crucially, the empowerment here does not come from the technology itself, but from the
permission structure around its use. When staff feel they must hide their use of AI or
constantly justify it, empowerment quickly evaporates. Ambiguity breeds anxiety. Clarity, by
contrast, creates psychological safety and encourages thoughtful experimentation.

Why AI shouldn’t be writing whole lessons for us

This is also where I remain deliberately cautious. While AI can be a valuable planning
support, I do not believe educators should be using it to generate entire lessons. Teaching is
not simply the delivery of content; effective lessons are shaped by professional judgement,
contextual knowledge, and real-time responsiveness to learners.


High-quality lessons draw on an educator’s understanding of learners’ prior knowledge,
emotional state, group dynamics and likely misconceptions. These are deeply human
considerations, developed through experience and relationship. When AI generates a lesson
wholesale, that professional thinking is displaced rather than supported.


In my experience, lessons created in this way are unlikely to be the most effective ones.
They may appear polished, but they often lack intentionality, coherence and adaptability.
More significantly, they remove educators from the cognitive and creative process of lesson
design, a process that is central to professional identity and pedagogical effectiveness.

The distinction between supporting thinking and replacing thinking matters here. AI can
prompt ideas, suggest structures, and reduce cognitive load, but pedagogy should remain
firmly in human hands. As Savion (2023) notes, “much like the Goldilocks principle, learning
leaders face the challenge of finding the perfect balance between the best use of generative
AI and emotional intelligence.” Finding that balance is not optional; it is essential if AI is to
empower rather than deskill.


The leadership dimension: culture before capability

This brings the focus squarely back to leadership. AI adoption is not a technical challenge
alone; it is a cultural one. Leaders shape whether AI is experienced as a threat or an
opportunity through the conditions they create.


In my own leadership and teacher-development work, I have found that staff confidence with
AI is less about training sessions and more about modelling, trust and values. When leaders
are open about their own learning, acknowledge uncertainty, and frame AI as something to
be explored collectively, it creates psychological safety. When AI is introduced primarily
through policy, surveillance or risk language, it does the opposite.


The parallels with emotional intelligence are clear. High-EI leadership creates environments
where people feel safe to ask questions, challenge assumptions and try new approaches.
Without that foundation, even the most sophisticated AI tools are unlikely to feel
empowering.


A small but telling moment

One of the most revealing moments for me came not in a formal training session, but in a
casual conversation with a colleague. They described using AI to help structure a lesson
sequence, then quickly added, “I hope that’s okay to admit.”

That hesitation spoke volumes. The issue was not capability, but permission. Once we talked
through how AI could support (rather than undermine) their professional judgement, the tone
shifted from defensiveness to curiosity. The tool hadn’t changed, but the conditions around
its use had.


Why the human work matters more than ever

As AI becomes more capable, the uniquely human aspects of education become more
visible, not less. Empathy, relational trust, ethical judgement and adaptability are not things
AI can replicate. In fact, the more technology advances, the more these qualities matter.

This is why balancing AI with emotional intelligence is not a soft add-on, but a strategic
necessity. AI may help us work faster, but EI helps us work better with learners, with
colleagues, and with ourselves.


Designing for empowerment

If AI is to genuinely empower staff in FE, it must be introduced with intention. Leaders need to ask not just “What can this tool do?” but “What pressures does this remove, and which might it add? How does it support staff agency, confidence and professional judgement?”

Ultimately, empowerment is not something AI delivers. It is something leaders and
organisations design for. Technology can support that design, but it cannot replace the
human judgement at its heart.


Are the ways we are introducing AI helping educators think more deeply about their practice,
or quietly doing the thinking for them?

References

Patel, D. (2023). Artificial Intelligence & Generative AI For Beginners.

Ray, A. (2018). Compassionate Artificial Intelligence: Frameworks and Algorithms. Compassionate AI Lab.

Savion, S. (2023). The Goldilocks Principle: Striking “Just the Right Balance” Between AI and EI. Training Industry, 18 September. Available at: trainingindustry.com (Accessed: 4 March 2024).


Simon Takel is an Advanced Teaching Practitioner and Course Leader in Leadership and Management at Exeter College. He works extensively with educators and leaders on professional development, wellbeing, and the thoughtful use of AI in teaching and leadership. Simon is a Chartered Management Fellow, a member of the ETF Practitioner Advisory Group, and is currently undertaking a practitioner research PhD into sustainable models of professional learning in FE.
