Gaps in Our Understanding: AI, Gods, and Humanity

By Randall Reed

Dr. Randall Reed is a Professor of Religion at Appalachian State University in North Carolina, USA. He is currently working on the intersection of religion and technology. He is co-chair of the American Academy of Religion Research Seminar on Artificial Intelligence and Religion. His latest article (co-authored with Laura Ammon) is entitled “Is Alexa My Neighbor?” and is forthcoming in the Journal of Posthuman Studies: Philosophy, Technology, Media.


In response to:

Artificial Intelligence and Religion

Chris Cotter and Beth Singler discuss the intersections between religion and Artificial Intelligence, from slavery and pain to machines taking over religious functions and practices.

I am delighted to have the opportunity to respond to Dr. Beth Singler’s interview for The Religious Studies Project podcast. As anyone who has had the chance to hear Dr. Singler speak knows, she is brilliant and entertaining, and she specializes in making the field of artificial intelligence, particularly as it relates to religious studies, intelligible to the non-computer scientist. While Dr. Singler raises many issues over the course of the podcast, I would identify two main strands that we might explore. On the one hand, how is A.I. like or unlike humans? On the other, how is A.I. like or unlike God? The second question may seem impertinent, and yet, perhaps to the amazement of those who, like Dr. Singler, really understand artificial intelligence, it becomes more relevant each day.


Let me start with the first question, however: how is A.I. like humans? Here, I think, we enter a highly contested area: intelligence. Does, or can, A.I. have intelligence? First, of course, there is the question of definition. What constitutes intelligence? There is no easy answer, as the long history of psychological and philosophical debate over the issue makes evident. The discussion also tends to become mired in an anthropocentric view, in which human intelligence is treated as the “gold standard” of intelligence. But certainly we see various levels of intelligence in the animal world. The squirrels in our yards, as they defeat ever more elaborate obstacles, seem to exhibit intelligence. Of course, we recognize that the squirrels are not doing trigonometry (at least not consciously). Still, we generally don’t require equivalence with humans before we proclaim that an animal is “smart.”


Oftentimes, though, when we talk about artificial intelligence, we want to demand the higher standard of human intelligence. There is a running joke in the A.I. community that intelligence is whatever a computer has not yet done; once a computer does it, intelligence becomes something else. So intelligence was beating a grandmaster at chess, until a computer did that. Then it was beating a human at Go (one of the world’s oldest and most complex board games), until a computer did that too. And so on. As artificial intelligence continues its inexorable trek toward besting humans at various tasks (see Sebastian Ruder’s “leaderboard,” which seeks to track A.I. competence on a wide range of natural language tasks), the question of what constitutes intelligence becomes ever more confusing.

Above: a robotic priest at a Buddhist temple, modeled after the deity of mercy. It already delivers sermons, and its supporters hope that it will continue to grow in intelligence and share more complex information over time.

Thus, both the question of intelligence and the concomitant problem of how an A.I. is and is not like a human being are far from simple. In the interest of space, I have not engaged the ethical issues involved (much as the podcast does not), but a variety of ethical issues, both hypothetical and real, arise as A.I.s become able to do more of the things humans can do, and to do them far faster.


The second issue I would like to raise is how A.I.s are and are not like the divine. The paper that Dr. Singler presented at the University of Edinburgh, and in another version at the American Academy of Religion in San Diego, broached this topic. For her, the issue is not an ontological one, that is, whether A.I. is actually god-like (though some scholars have speculated that a future A.I. may in fact be indistinguishable from our western conception of a god). Rather, Dr. Singler asks whether humans are treating A.I. like a god.


And here the answer seems to be, at least sometimes, “yes.” As Dr. Singler’s study shows, there is at least a micro-trend on Twitter of talking about being “blessed by the algorithm.” That kind of religious language, which seems to indicate a kind of divinization of A.I., appears regardless of the actual nature of the algorithm. Dr. Singler notes that this often seems to be a habit among humans: we fall into religious tropes and narratives when we encounter the unknown.


Perhaps this is an extension of the “God of the Gaps.” That notion postulates that when human knowledge cannot determine the cause of something, we turn to supernatural explanations. Before we developed a scientific understanding of lightning, we saw it as a divine act. Once science exposed its natural causes, that bit of divinity was erased from our world. The circle that was once the magisterium of religion, encompassing the universe, has today been reduced to a small space in one’s heart.


I would like to go a bit further here, because in the case that Dr. Singler uncovers it is not simply that people do not understand how the algorithms make their decisions (which is true not just of the uninitiated but of the creators of those algorithms as well); it is that these A.I.s have power. The Uber driver or YouTuber who seeks the promise of remuneration recognizes that these mechanical intelligences, whose “ways are not our ways,” can cast either blessing or curse. It is this power, holding human life in its grasp and acting in seemingly inexplicable ways, that has the petitioner suddenly reaching for the language of religion.


And yet the danger of the God of the Gaps is that it surrenders human responsibility for the state of the world, trading experimentation and explanation for supplication. Whatever problems there may be with the Enlightenment (and they are legion, several of which were elucidated in the podcast), at one level it represents a moment in which human beings took responsibility for understanding their world. We stand again at such a moment of decision. The retreat to religious language, while not an unusual strategy for humanity, must not be our final riposte. We must once again find a way to understand, and perhaps master, this new force that seems increasingly arrayed against us in the deployment of A.I. Dr. Singler points out our slippage into the language of religion and, at the same time, urges us not to remain there.

