Neither Satan nor Savior: A 2,000-Year-Old Perspective on AI

  • You’ve avoided AI entirely. Too risky. Too unknown. Too much like opening a door you can’t close.
  • You’ve dabbled cautiously. Free account. Surface-level prompts. Impressive results that also made you uneasy.
  • You’ve leaned in fully. Paid tools. Daily use. But a quiet question still lingers in the back of your mind.

Wherever you land on that spectrum, you’re probably carrying a tension you haven’t fully voiced.

A friend of mine finally said it out loud.

We were in a small group of business leaders. Julia, a spiritual director who regularly travels to Europe and the Middle East helping Christian leaders find focus amid the competing demands of ministry and family, had been sharing how she uses AI as a guide for interpreting dreams. The conversation covered research tools, productivity shortcuts, and content creation. And then Michael spoke up.

Michael has owned a real estate photography business for almost two decades. He’s not technology-averse; he was an early adopter of drones for aerial shots back when most photographers thought they were a gimmick. He’s currently building a smartphone app to reach an entirely different client base. This is someone who leans into new tools.

But he’d been sitting with something. You could tell. He didn’t interrupt; he waited for the right moment. And when it came, he didn’t hold back.

“I want to know your take on AI,” he said. “What are we opening ourselves up to? Being influenced by something we’re not sure about?”

He paused, then continued.

“It’s not fear exactly. It’s more like a precaution in my brain, or my heart, saying wait. You don’t know who you’re talking to. You don’t know who’s influencing you.”

He was quick to clarify: “I’m not saying AI is a person. It’s not sentient. But I’ve read enough and heard enough to say, okay, hold on. Pump the brakes. Be wise before moving forward.”

Then he named the tension most leaders feel but rarely admit:

“I see a lot of really great benefits. But I also don’t want it to be a honey trap.”

This wasn’t coming from someone who avoids new tools. This was coming from a leader who’s built two businesses on his willingness to adopt new technology before his competitors. And even he was wrestling with this question.

Have you felt that same tension? The pull between possibility and caution? The sense that something significant is happening, but you’re not sure whether to lean in or step back?

Working with AI

Why Your Great-Grandparents Feared the Radio

Michael’s instinct to pause isn’t a weakness. It’s wisdom. Generation after generation, believers have wrestled with this tension when new technology disrupted their world.

And most of the time, Christians have split into two camps.

Camp one says the new thing will save us. Camp two says it will destroy us.

Which camp would you have joined?

Consider the printing press. Before Gutenberg, Scripture was copied by hand by monks who spent their lives inscribing every word. The process was slow, expensive, and limited access to the educated elite. These illuminated manuscripts were works of art; every page reflected years of devotion and skill.

Then the press arrived. Suddenly, Bibles could be mass-produced. The average person could own Scripture for the first time in history. Many historians argue the printing press powered the Protestant Reformation; sola scriptura became possible because people could actually read Scripture alone.

But what was lost? The monks who had devoted their lives to hand-copying Scripture saw their craft become obsolete. Critics worried: if anyone can read the Bible without training, won’t they misinterpret it? Won’t dangerous ideas spread faster than truth?

Were they wrong to worry? Were the reformers wrong to celebrate?

Both concerns were legitimate. Both outcomes were real.

The same pattern repeated with radio. Some Christians called it a tool of Satan; others saw it as the greatest evangelism opportunity since the apostles. Who was right? Television split the church the same way. So did video cassettes, which brought both Bible studies and pornography into private homes. The internet. Social media.

Again and again: this will save us versus this will destroy us.

What if both camps were partially right? What if neither had the full picture?

The Generation That Doesn’t Know What It’s Missing

The historical pattern tells us something important: disruptive technology isn’t new, and neither is the fear it generates.

But there’s another pattern worth naming, one that tends to repeat every generation.

Those of us who built our foundations before the disruption have something the next generation won’t: we know what we’re working with.

Julia has spent the better part of two years developing her expertise in dream interpretation. She regularly consults a 700-page dream dictionary. She’s read three or four thousand pages of books. She’s done the slow, handwritten work of journaling and discerning and learning from mentors who’ve gone before her.

This is on top of years of training as a spiritual director, the kind of deep formation that qualifies her to fly overseas and help leaders untangle the knots of overwhelm and competing priorities.

Now she uses AI, and what would have taken months compresses into weeks. But the acceleration works because she already has the base. She knows when AI gives her something useful and when it gives her something off. She’s not afraid of the tool because she has the discernment to evaluate what it produces.

“I can use my discernment,” she said in our conversation. “But I think a lot of people could get in trouble.”

Can you see the generational tension?

You and I learned to think, reason, write, and create before AI existed. We built mental muscles through repetition and struggle. We know what it feels like to wrestle with a blank page, to work through confusion without a shortcut, to develop judgment through trial and error.

For our children and grandchildren, AI is just… normal. It’s always been there. And when something has always been there, you don’t question it. You don’t know what you’re missing because you never had it.

Julia connected this to what’s already happened with social media. “At first, Facebook was like, ‘Wow, I can stay connected to all my friends.’ And then the destruction of the likes, the anxious generation. It has literally changed an entire generation.”

Jonathan Haidt documents this transformation in The Anxious Generation. He writes:

“The transition from a play-based childhood to a phone-based childhood is the single largest factor in the international epidemic of adolescent mental illness. Children need risk, adventure, and independence to develop competence and confidence. When we replaced physical play with virtual interaction, we didn’t just change how children spend their time; we changed how their brains develop. We created a generation that is more anxious, more depressed, and less prepared for adulthood than any in recent history.”

Julia paused after referencing this shift, then said: “I think AI is going to be like this and more.”

Does that concern you? It should. But perhaps not for the reason you think.

The danger isn’t that AI is uniquely evil. The danger is that the pattern repeats. What’s disruptive for one generation becomes invisible for the next. And invisible tools shape us in ways we don’t notice until the damage is done.

The teenager who lets AI write their research paper isn’t being lazy. They’re being normal, by their standard of normal. But what are they not learning by skipping the struggle? What mental muscles never develop? What discernment never forms? What capacity for deep thinking atrophies before it ever strengthens?

Cal Newport, author of Deep Work, has been writing about this tension for years. The ability to focus deeply, to think without distraction, to produce work that requires sustained concentration: these are skills that must be developed through practice. They don’t emerge automatically. And tools that shortcut the struggle may also shortcut the growth.

This isn’t an argument against AI. It’s an argument for awareness.

Those of us with foundations have an opportunity, and maybe a responsibility, to use these tools wisely and to help the next generation understand what they might be missing. What would it look like for you to have that conversation with someone younger? What would you want them to know?

What If Both Camps Are Wrong?

After watching this pattern across decades of technology shifts, I landed on a phrase that captures where I stand:

AI is neither Satan nor Savior.

It’s not a demonic force corrupting everyone who touches it. And it’s not the answer to all our problems. It’s a tool, an extraordinarily powerful one, that will be used for tremendous good and tremendous harm.

The more important question isn’t whether AI belongs in one camp or the other.

The more important question is which camp you will use it from.

Julia put it simply in our conversation: “It can be used for the kingdom, or it can be used for evil. We have to learn to discern; just like any other era of anything in Christianity.”

That word, discern, is the key.

Jim Collins calls this the Genius of the AND. We’re conditioned to think in binaries: this or that, safe or dangerous, embrace or reject. But the most important questions rarely have either/or answers.

AI will compress learning curves and amplify expertise. AI will also create new temptations toward shortcuts, laziness, and dependence.

Both things are true. Can you hold them together?

The monks who hand-copied Scripture weren’t wrong to grieve what was lost. The reformers who distributed printed Bibles weren’t wrong to celebrate what was gained. The wisdom wasn’t in picking the right side. The wisdom was in seeing clearly what each side revealed.

The Wrong Question and the Right One

Many leaders I talk to are asking the wrong questions.

Business executives tend toward panic: Am I falling behind? Is everyone else figuring this out while I’m stuck? The fear of missing out drives them toward hasty adoption without wisdom.

Faith leaders tend toward caution: Is this ethical? Is this wise? What am I agreeing to that I don’t fully understand? The fear of compromise drives them toward paralysis.

Which tendency do you recognize in yourself?

Both fears are understandable. Neither produces clarity. And here’s the problem: both keep you focused on the wrong question.

“Should I use AI?” is a dead-end question.

It assumes a binary: yes or no, in or out, safe or dangerous. And as we’ve seen, binaries rarely capture reality. They certainly don’t capture this one.

“How do I use AI wisely?” opens the door.

Using AI wisely

This question assumes you’ll engage, because opting out entirely may not be a realistic path for leaders who want to serve effectively in the coming decade. But it also assumes you’ll engage with discernment, boundaries, and intention.

What would that look like for you? What boundaries would you need? What intentions would guide your use?

I think of it as three commitments: Discern, Develop, Deploy.

Discern means evaluating what AI produces rather than accepting it blindly. Do you have the foundation to recognize when output is useful versus when it’s off? If not, that’s the first gap to address.

Develop means continuing to build your own thinking, writing, and creating muscles. AI accelerates expertise; it doesn’t replace the need to build it. What practices keep your mental muscles strong even as tools make some tasks easier?

Deploy means using AI intentionally for specific purposes rather than defaulting to it for everything. Where does AI genuinely add value in your work? Where might it actually diminish what you’re trying to create?

Your job isn’t to pick a side. Your job is to hold the tension as wisely as you can.

What Will You Believe?

Michael asked his question because he trusted us with his real concern. He wasn’t looking for permission to avoid AI. He wasn’t looking for validation to dive in recklessly. He wanted wisdom.

My guess is you want the same thing.

So here’s where I want to leave you: not with my answers, but with your own reflection.

Three questions to sit with:

  1. Which camp have you been leaning toward: the FOMO panic of falling behind, or the ethical hesitation of holding back? What’s driving that instinct? What fear sits beneath the surface?
  2. If AI is neither Satan nor Savior, what is it for you? A threat? An opportunity? A mirror? A tool? Something else entirely? What metaphor captures your honest view?
  3. What’s the one thing you want to believe about AI moving forward: the conviction that will guide how you engage with it wisely? What would you write on a card and keep at your desk?

Write that down. Not for me. For you.

Because the leaders most likely to thrive in this next season probably won’t be the ones who avoided AI or the ones who adopted it blindly. I believe they’ll be the ones who engaged with wisdom, discernment, and clear conviction.

The monks weren’t wrong to grieve what the printing press took away. The reformers weren’t wrong to celebrate what it made possible. The wisdom was in holding both truths without collapsing into either camp.

That same wisdom matters now. It mattered with the printing press. It matters with AI.

Your Next Step

If you’re ready to move from “should I?” to “how wisely?”, I’m hosting a webinar on using AI as a leader without losing your voice.

We’ll cover how to engage with discernment, maintain your authentic perspective, and use AI as a tool that amplifies your expertise rather than replacing it.

No hype. No fear. Just practical wisdom for leaders who want to engage thoughtfully.

I’ll see you there.

David Limiero is the founder of Edens View Coaching and Consulting, where he helps leaders move from overwhelm to overflow.
