Ask the new artificial intelligence (AI) writing program ChatGPT to write a news article about itself from a Christian perspective, and here’s how it begins:
“ChatGPT, a language model developed by OpenAI, has been making waves in the technology industry for its ability to generate human-like text. However, as with any new technology, there are concerns and questions about its implications and use. From a Christian perspective, it is important to consider the ethical and moral implications of ChatGPT and other similar AI technologies.”
The marvel of computer-produced writing that appears human-generated has left some in awe and others fearful of the new frontiers in cheating it opens.
Southern Baptist ethicists say it’s not one or the other. While AI writing software creates new temptations for dishonesty, it also may have legitimate uses in higher education and local churches.
“One of the temptations when faced with complex or challenging ethical questions with technology is to rush to a position of full adoption or rejection of these tools,” said Jason Thacker, chair of research in technology ethics and director of the research institute at the Southern Baptist Convention’s Ethics & Religious Liberty Commission. “Wisdom, which is at the core of the Christian moral tradition, calls us to slow down and to think deeply about the nature of these tools as well as the myriad of uses.”
Sparking a range of headlines
News of ChatGPT broke in late 2022. Offered free to the public, the program responds to user questions and instructions within seconds, generating writing that is complex and nuanced, though sometimes wooden and uninspiring. It can tackle anything from high school English assignments to economic analysis.
OpenAI, the company that produced ChatGPT, was founded in 2015 as a research organization with backing from Elon Musk, Amazon Web Services and other donors.
The latest advance in AI writing drew a range of headlines, from “Will Everyone Become a Cheat?” in the Guardian to “Why Educators Shouldn’t Be Worried About AI” in Christianity Today.
Where to draw the line
The news left Christian ethicists pondering whether ChatGPT and other AI writing programs might have legitimate uses.
Andrew Walker, Southern Baptist Theological Seminary associate professor of Christian ethics and apologetics, says they do, so long as they are not employed to produce content meant to be generated by a human mind.
“We should shun AI if it’s being used for malevolent, dishonest purposes,” Walker said. “If it can be used to make research more efficient, I’m willing to entertain that component of it. But if anyone is ever going to pass off work as their own that isn’t their own, that’s a flat violation of the ninth commandment.”
Walker already uses AI in his research and imagines that software like ChatGPT could enhance the process. Zotero helps him collect, search and cite both books and articles. Evernote assists him with note taking, while Grammarly identifies problems with his writing.
“If AI is being used to streamline research for better, more efficient ends, I think I’m OK with that,” he said. But he draws the line at using AI to create academic or pastoral content intended to be produced by a human.
New Orleans Baptist Theological Seminary professor of ethics Jeffrey Riley draws a similar line. ChatGPT already is on the radar of administrators at New Orleans Seminary, he said, and they are discussing how to root out AI use as a form of plagiarism. The plagiarism-detection company Turnitin has said it will add an AI-writing-detection feature to its software later this year.
But AI writing is not inherently bad, Riley said. Composing directions, detailed reports and even church bulletin copy may be among its legitimate uses.
“At this point I’m not willing to say [AI writing software is] inherently wrong,” Riley said. “Where efficiency and description are key and where labor is perhaps lacking, [it] could be very helpful.”
Troubling concerns
Sermon writing is a different story. When a popular YouTuber known as the Honest Youth Pastor asked ChatGPT to write a sermon on John 3:16, he declared the result “more solid than any progressive Christian sermon I’ve ever heard” and “a better sermon than most well-known pastors” could produce.
That troubles Riley. Utilizing AI is never an appropriate substitute for developing the human mind and wrestling with ideas, he said. That’s especially true in sermon preparation.
“A pastor needs to wrestle with the Word of God and not just put words into a program and let it spit out something. There’s something intuitively wrong about that,” Riley said. The point of preaching is to “come and talk to your people with your words.”
Wrestling with ideas also is important in higher education, Thacker said. The advance of AI writing software will require teachers to shift their emphasis from information transfer, which focuses primarily on students' output, to whole-person transformation, which focuses on students' thinking.
“AI systems like ChatGPT are deeply concerning but also afford the opportunity for educators and students to evaluate with fresh eyes the purpose and design of education,” said Thacker, author of “The Age of AI: Artificial Intelligence and the Future of Humanity.”
What it means to be human
The irony of AI writing software is that this nonhuman technology may help us better understand what it means to be human.
“The introduction of advanced AI systems has fundamentally challenged much of what we have assumed about the uniqueness of humanity,” Thacker said.
“These systems are now performing tasks that only humans could in past generations,” he noted. “In an age of emerging technologies like AI, we all need to be reminded that the value and dignity of humans isn’t rooted in what we do, but who we are as unique image bearers of our Creator.”