TORONTO (AP) — As the rapid, unregulated development of artificial intelligence continues, the language people in Silicon Valley use to describe it is becoming increasingly religious.
From predicting the potential destruction of humanity to a transhumanist apocalypse where people merge with AI, here’s what some of the key players are saying.
___
“I think religion will be in trouble if we create other beings. Once we start creating beings that can think for themselves and do things for themselves, maybe even have bodies if they’re robots, we may start realizing we’re less special than we thought. And the idea that we’re very special and we were made in the image of God, that idea may go out the window.”
— Nobel Prize winner Geoffrey Hinton, often dubbed the “Godfather of AI” for his pioneering work on deep learning and neural networks.
___
“By 2045, which is only 20 years from now, we’ll be a million times more powerful. And we’ll be able to have expertise in every field.”
— author and computer scientist Ray Kurzweil, who believes humans will merge with AI.
___
“There certainly are dimensions of the technology that have become extremely powerful in the last century or two that have an apocalyptic dimension. And perhaps it’s strange not to try to relate it to the biblical tradition.”
— PayPal and Palantir co-founder Peter Thiel, speaking to the Hoover Institution at Stanford University.
___
“I feel that the four big AI CEOs in the U.S. are modern-day prophets with four different versions of the Gospel and they’re all telling the same basic story that this is so dangerous and so scary that I have to do it and nobody else.”
— Max Tegmark, a physicist and machine learning researcher at the Massachusetts Institute of Technology.
___
“When people in the tech industry talk about building this one true AI, it’s almost as if they think they’re creating God or something.”
— Meta CEO Mark Zuckerberg on a podcast promoting his company’s own venture into AI.
___
“Everyone (including AI companies!) will need to do their part both to prevent risks and to fully realize the benefits. But it is a world worth fighting for. If all of this really does happen over 5 to 10 years — the defeat of most diseases, the growth in biological and cognitive freedom, the lifting of billions of people out of poverty to share in the new technologies, a renaissance of liberal democracy and human rights — I suspect everyone watching it will be surprised by the effect it has on them.”
— Anthropic CEO Dario Amodei in his essay, “Machines of Loving Grace: How AI Could Transform the World for the Better.”
___
"You and I are living through this once-in-human-history transition where humans go from being the smartest thing on planet Earth to not the smartest thing on planet Earth."
— OpenAI CEO Sam Altman during an interview for TED Talks.
___
“These really big, scary problems that are complex and challenging to address — it’s so easy to gravitate towards fantastical thinking and wanting a one-size-fits-all global solution. I think it’s the reason that so many people turn to cults and all sorts of really out there beliefs when the future feels scary and uncertain. I think this is not different than that. They just have billions of dollars to actually enact their ideas.”
— Dylan Baker, lead research engineer at the Distributed AI Research Institute.
___
Associated Press religion coverage receives support through the AP’s collaboration with The Conversation US, with funding from Lilly Endowment Inc. The AP is solely responsible for this content.
Copyright 2025 The Associated Press. All rights reserved. This material may not be published, broadcast, rewritten or redistributed without permission.