Gaps in Our Understanding: AI, Gods, and Humanity

By Randall Reed

Dr. Randall Reed is a Professor of Religion at Appalachian State University in North Carolina, USA. He is currently working on the intersection of religion and technology and is co-chair of the American Academy of Religion Research Seminar on Artificial Intelligence and Religion. His latest article (co-authored with Laura Ammon) is entitled “Is Alexa My Neighbor?” and is forthcoming in the Journal of Posthuman Studies: Philosophy, Technology, Media.


In response to:

Artificial Intelligence and Religion

Chris Cotter and Beth Singler discuss the intersections between religion and Artificial Intelligence from slavery and pain to machines taking over religious functions and practices.

I am delighted to have an opportunity to respond to Dr. Beth Singler’s interview for the Religious Studies Project podcast. As anyone who has had the opportunity to hear Dr. Singler in the past knows, she is always brilliant and entertaining, and she specializes in making the field of artificial intelligence, particularly as it relates to religious studies, intelligible to the non-computer scientist. While Dr. Singler raises many issues in the podcast, I would identify two main strands that we might explore. On the one hand, there is the question of how A.I. is or is not like humans. On the other hand, how is A.I. like or unlike God? The second question may seem impertinent, and yet, perhaps to the amazement even of those who, like Dr. Singler, truly understand artificial intelligence, it becomes more relevant each day.

 

Let me start with the first question, however: how is A.I. like humans? It is here that I think we enter a highly contested area: intelligence. Does or can A.I. have intelligence? First, of course, there is the question of definition. What constitutes intelligence? There is no easy answer, as is evident in the long history of psychological and philosophical debate over the issue. Moreover, the discussion is often mired in an anthropocentric view: human intelligence is treated as the “gold standard.” But certainly, in the animal world, we see various levels of intelligence. The squirrels in our yards, as they surmount ever-increasing obstacles, seem to exhibit intelligence. Of course, we recognize that the squirrels are not doing trigonometry (at least not consciously). Still, we generally don’t require equivalence with humans before we proclaim that an animal is “smart.”

 

Often, though, when we talk about artificial intelligence, we want to demand the higher standard of human intelligence. There is a running joke in the A.I. community that intelligence is whatever a computer hasn’t done yet; once a computer does it, intelligence becomes something else. So intelligence meant beating a grandmaster at chess, until a computer did that. Then it meant beating a human champion at Go (one of the world’s most widely played board games), until a computer did that too. And so on. As artificial intelligence continues its inexorable trek toward besting humans at various tasks (see Sebastian Ruder’s “leaderboard,” which tracks A.I. competence on a wide range of natural language tasks), the question of what constitutes intelligence becomes ever more confusing.

Above: a Buddhist temple includes a robotic priest modeled after the deity of mercy. It already delivers sermons, and supporters hope that it will continue to grow in intelligence and share more complex teachings over time.

Thus, both the question of intelligence and the concomitant problem of how an A.I. is and is not like a human being are not simple. In the interest of space, I have not engaged the ethical issues involved (much like the podcast), but there are a variety of ethical issues both hypothetical and real that are raised as A.I.s become able to do more things that humans can do but at an exponentially faster rate.

 

The second issue that I would like to raise is how A.I.s are and are not like the divine. The paper that Dr. Singler presented at the University of Edinburgh, and in another version at the American Academy of Religion in San Diego, broached this topic. For her, the issue is not an ontological question of whether A.I. is actually god-like (though some scholars have speculated that a future A.I. may, in fact, be indistinguishable from our western conception of a god); rather, Dr. Singler asks whether humans are treating A.I. like a god.

 

And here the answer seems to be, at least sometimes, “yes.” As Dr. Singler’s study shows, there is at least a micro-trend on Twitter of talking about being “blessed by the algorithm.” The use of that kind of religious language, which seems to indicate a kind of divinization of A.I., is present regardless of the actual nature of the algorithm. Dr. Singler notes that this often seems to be a kind of habit among humans: we fall into religious tropes and narratives when we encounter the unknown.

 

Perhaps this is an extension of the “God of the Gaps.” This notion postulates that when human knowledge cannot determine the cause of something, we often turn to supernatural explanations. Before we developed a scientific understanding of lightning, we saw it as a divine act. Once science exposed its natural causes, that bit of divinity in our world was erased. The circle that was once the magisterium of religion, encompassing the whole universe, has today been reduced to a small space in the heart.

 

I would like to go a bit further than that here, because in the case that Dr. Singler uncovers, it is not simply that people do not understand how the algorithms make their decisions (which is true not just of the uninitiated but of the creators of those algorithms as well), but rather that these A.I.s have power. The Uber driver or YouTuber who seeks the promise of remuneration recognizes that those mechanical intelligences whose “ways are not our ways” can cast either blessing or curse. It is this power, holding human life in its grasp and acting in seemingly inexplicable ways, that has the petitioner suddenly using the language of religion.

 

And yet, the danger of the God of the Gaps is that it surrenders human responsibility for the state of the world. It exchanges experimentation and explanation for supplication. Whatever problems there may be with the Enlightenment (and they are legion, to be sure, several of which were elucidated in the podcast), at one level the Enlightenment represents a moment in which human beings took responsibility for understanding their world. We stand again at such a moment, one that demands decision. The retreat to religious language, while not an unusual strategy for humanity, must not be our final riposte. We must once again find a way to understand, and perhaps master, this new force that seems increasingly arrayed against us in the deployment of A.I. Dr. Singler points out our slippage into the language of religion and, at the same time, urges us not to remain there.


This work is licensed under a Creative Commons Attribution- NonCommercial- NoDerivs 3.0 Unported License.

The views expressed in podcasts, features and responses are the views of the individual contributors, and do not necessarily reflect the views of The Religious Studies Project or our sponsors. The Religious Studies Project is produced by the Religious Studies Project Association (SCIO), a Scottish Charitable Incorporated Organisation (charity number SC047750).