Does active belief in a higher power correlate with ready acceptance of what an AI chatbot says? A new study by Nazarbayev and Duke universities suggests just that, noting that people who are regularly “thinking about” God appear to be more open to hearing what artificial intelligence has to say.
The study aimed to investigate the psychological factors that influence people's attitudes toward AI-based recommendations. Specifically, the researchers explored the role of God's salience in decision-making and whether thinking about God makes people more or less likely to trust AI systems over human experts.
The study asked participants to write about God and their daily activities. The participants would then choose between two options: one proffered by a human, and one suggested by an AI. The experiments ranged from relatively benign decisions like what to eat to more life-changing decisions like investments and romantic partners.
Several study groups were convened with varying demographic details. Researchers said Study 1 began with 405 US-based participants but ended with 321 participants after removing those who failed the attention check or had duplicate IP addresses. Study 2d included 191 participants from Turkey. Study 6 involved 53,563 participants from over 21 countries.
The results suggest that people were more receptive to AI advice due to feelings of humility and recognition of human flaws.
“Artificial intelligence—once merely the draw and drama of science fiction—is now a feature of everyday life. AI is commonly used to generate recommendations, from the movies we watch to the medical procedures we endure,” the study said. “As AI recommendations become increasingly prevalent and the world grapples with its benefits and costs, it is important to understand the factors that shape whether people accept or reject AI-based recommendations.”
In one experiment, researchers alternated religious and nonreligious music in a dentist's office waiting room over eight days and asked patients to choose between two omega-3 fish oil supplements: one recommended by a human and one by an AI.
“In preregistered study 2d, conducted in a dental clinic in Turkey, we manipulated God salience through the music played in the waiting room,” the researchers said. “We alternated playing either a religious or nonreligious instrumental traditional Turkish song in the waiting room over 8 [days] of data collection.”
Researchers said the experiment showed participants who listened to religious music were likelier to choose the AI-suggested fish oil than those who didn’t.
“AI is now a ubiquitous part of everyday life for much of the world—perhaps even akin to the pervasiveness of God,” the researchers said. “Given the diminished role of humans when viewed in relation to God and within AI operations, might there be a relationship between how thoughts of God affect people’s reactions to AI?”
The study also hoped to understand how and why thinking about God might influence individuals’ willingness to consider AI-based suggestions.
The researchers said they focused on religion because it is one factor prevalent across nearly every society. Experts have warned against overreliance on artificial intelligence and its effects on the human brain, especially concerning children, who easily form bonds with inanimate objects.
“Many of life’s most consequential decisions—deciding which medical procedure to undergo, which romantic partner to pursue, which financial or legal paths to follow, etc.—can now be largely delegated to artificial intelligence,” the researchers said. “Empowered by algorithms that very often surpass humans in their efficiency and accuracy, AI has the potential to significantly affect people’s well-being and the world’s economy.”
“We predict that thoughts of God will weaken the extent to which consumers favor humans over algorithms, driven by feelings of a small self and a recognition of human limitations,” the study continued, adding that the study employed different methods of heightening “the salience of God to establish a causal relationship between God salience and algorithm aversion.”
Paradoxically, while the study found that people who thought often about God were more likely to accept AI-generated suggestions, it also noted that people generally hold a negative bias toward algorithms.
“People are particularly likely to assume that humans are more capable than algorithms when it involves making judgments for contexts that are subjective or hedonic in nature, or those that require empathy and a consideration of individual uniqueness,” the study said.
Duke University and Nazarbayev University have not yet responded to Decrypt’s request for comment.