Is AI our new God? Reflections on launching HUMAIN
- Dean J. Hill
- Jan 22
- 4 min read
Updated: Jan 25
When I first considered launching an interdisciplinary network to bring people together from across the humanities and the technology sector, I honestly didn't expect much interest. I remember asking: is there even a demand for this? Yet, as I began to shape the programme of monthly seminars and share the idea with those I thought might want to join me on this ride, things started to come together in ways I hadn't anticipated.

Looking ahead to the rest of the year, I'm hoping for searching discussion and deep, meaningful debate that truly bridges artificial intelligence and the humanities; for conversations that illuminate not only what AI is doing to us, but what it reveals about who we are - and what kind of future we want HUMAIN to help imagine. As I noted at the start, the aim of this network is to foreground the critical reality that the future of technology cannot be shaped by STEM disciplines alone, and that the arts and humanities have an essential role to play in interrogating the ethical and human consequences of AI technologies.
For the inaugural lecture - Is AI our new God? - I was fortunate to be able to call on two highly respected and generous colleagues. The event opened with Dr Ali‑Reza Bhojani, Associate Professor in Islamic Ethics, who brings multidisciplinary approaches to Islamic intellectual traditions. Crucially for this discussion, he's also a Research Fellow at the Seattle‑based non‑profit AI and Faith, which equips people to bring faith‑based wisdom into conversations about ethical AI. His recent work could not be more timely, including his article Truth and Regret: Large Language Models, the Quran, and Misinformation and his chapter Between Fear and Hope: AI Ethics in Islamic Thought.
Speaking second was Dr Jeremy Kidwell, Associate Professor in Philosophical Theology, who brought a distinctive trajectory to this debate: before entering academia, he worked extensively in the tech industry, moving from teenage hacker to IT technician, server administrator, network security team lead and trainer for a telecommunications company. Alongside his theoretical interest in the philosophical implications of AI for human subjectivity, Jeremy is also regularly engaged in hands‑on experimentation with Large Language Models (LLMs), exploring their potential at the boundaries of wellbeing, assistive technology and forms of companionship.
Hosting this event alongside two experts in their respective fields left me with a profound sense that we're only just beginning to ask the right questions about AI, ethics and what it means to be human. And I was especially struck by the diversity in the (virtual) room: colleagues from across the UK and around the world, joining from a wide range of disciplines and professional backgrounds. Before going any further, I want to thank them all for their intellectually stimulating questions, reflections and contributions, which shaped the conversation as much as anything on the formal programme.
After brief introductions, our speakers each offered short but fundamental provocations that set the tone for the evening, before we moved into a more open conversation where I was able to press a little further into some recurring themes. I began by noting that technology never develops in a vacuum; it touches everything, yet we so often approach AI only through engineering or legal frameworks. Playing devil’s advocate, I asked why theology should matter here at all: if I'm a secular policymaker or working in a non‑religious institution, what does a theological lens reveal about artificial intelligence that a standard ethical framework might miss?
This led into a question about whether theology offers a richer vocabulary for ideas such as purpose and human flourishing than the language of optimisation and efficiency can provide. Building on this, I then turned to the idea of creation: if religious traditions say God created humans in the divine image, what does it mean that we're now building AI in our own image, and should we be worried that our machines might inherit our worst habits? Finally, I asked whether - in a world of 'black box' decision‑making - theologians might be better placed than scientists to deal with the opacity and mystery of AI.
Opening the floor to questions, what emerged was less a neat set of answers and more a shared willingness to sit with complexity and disagreement. Rather than asking only how to regulate or control AI, the discussion kept widening out into issues of justice, liberation and whose values are being encoded into these technologies. At the same time, the conversation repeatedly returned to the limits of what AI can meaningfully capture about human life.
It was in this context that the theological contributions of our speakers came into sharp focus. I particularly enjoyed listening to Jeremy draw on a theological view of humans as creatures among other creatures, suggesting there may be nothing 'exceptional' about us except that we're perhaps the most fallible beings on this terrestrial sphere - and that the humility born of our frailty is a crucial, if often neglected, part of being human. By contrast, Ali‑Reza leaned into Islamic theological and ethical resources, using AI’s impressive rationality, speed and efficiency as a mirror to remind us that there's more to being human than processing information; non‑rational qualities and a broader sense of human potentiality are vital for thinking about AI alignment, justice and liberation.
As I closed the event, I shared a feeling that several attendees echoed in the chat: we ended with more questions than we started with. That, I think, is precisely the point of a gathering like this. Over the course of the evening, we began to peel back the layers of the 'black box' and saw more clearly that theology, and the wider arts and humanities, have a substantial role to play in shaping public conversations about AI. I'm deeply grateful to Ali-Reza and Jeremy for navigating such complex territory with clarity and generosity, and to everyone in the audience for the thoughtful, challenging questions and reflections.
I look forward to continuing this conversation together. It's more important than ever.

Dean J. Hill