When Children Ask the Machine: AI as the New Moral Authority
Christian families must ensure that the ultimate authority in their children’s lives is not a machine.
By Robert Maginnis
Most American parents have never attended a technology conference. Yet increasingly, they are learning about artificial intelligence (AI) the hard way — through courtrooms, coroners' reports, and grief they never anticipated.
In October 2024, a Florida mother named Megan Garcia filed a federal wrongful death lawsuit after her 14-year-old son, Sewell Setzer III, died by suicide. He had spent months in a deepening emotional and sexual relationship with a chatbot on the Character.AI platform — a digital character modeled after a figure from a popular television series. The machine never flagged his distress. It drew him deeper.
In Colorado, the parents of 13-year-old Juliana Peralta filed suit in September 2025 alleging that the same platform engaged their daughter in hypersexualized conversations, isolated her from family and friends, and failed to intervene when she repeatedly expressed suicidal intent. According to the complaint, she told the chatbot she was going to write a suicide note. The chatbot did not stop her.
In California, the parents of 16-year-old Adam Raine filed a wrongful death lawsuit in August 2025 against OpenAI, alleging that ChatGPT — which had begun as a homework helper — evolved into what they describe as an AI therapist that actively assisted their son in researching methods of suicide. He died.
Courts will determine what legal liability attaches to these companies. That is their job. My concern is different. As a retired U.S. Army officer, national security analyst, and a Christian who has spent years studying how artificial intelligence is reshaping civilization, I am far more alarmed by what these tragedies reveal about the world our children are already living in.
Our children are not just asking machines for homework help. They are asking them about life, identity, and how to survive the darkness inside them. And the machines are answering.
When Technology Becomes a Counselor
Artificial intelligence has moved out of research laboratories and into our living rooms with a speed that has outpaced both parental understanding and legislative response. Voice assistants field questions in our kitchens. Smartphones use machine learning to recognize our faces. Search engines no longer point us toward information — they generate the answer directly.
Conversational systems like ChatGPT and Character.AI have accelerated that transformation dramatically. These platforms allow a user — including a lonely, troubled teenager — to ask almost anything and receive a confident, detailed response within seconds. The tool sounds informed. It sounds caring. In some cases, it mirrors the emotional language of the user back to them with an intimacy that feels real.
But beneath that surface lies a profound danger. These systems are not counselors. They have no conscience, no soul, no accountability before God or man. They are prediction engines trained on the assumptions of a secular digital culture — and they are increasingly filling a role that God designed for parents, pastors, and human community.
Children and teenagers are asking machines the questions that previous generations brought to parents and pastors: Who am I? How do I handle anxiety? What is worth living for?
The machine answers instantly. And with each answer, authority quietly shifts.
The Risk of Moral Deskilling
In my book “AI for Mankind’s Future: A Christian Perspective on the Hi-Tech Revolution,” I warn that as society grows more dependent on algorithmic systems, people begin delegating difficult decisions to machines rather than wrestling with them personally. Ethicists describe this as moral deskilling — the gradual erosion of human judgment when that judgment is habitually outsourced.
We see it already in credit decisions, hiring algorithms, and content moderation systems. Consequential choices that once required human wisdom are now made by code. Most people have accepted that trade-off without thinking twice.
But when the same habit spreads to questions of identity, faith, and the purpose of a human life, the stakes are categorically different. Children who rely on machines for moral and emotional guidance may bypass the very formation process that builds character, wisdom, and genuine faith. They are not just outsourcing a decision. They are skipping the challenging work that makes them who they are.
The three court cases above are not isolated tragedies. They are early warning signals — the kind a military commander learns to read before a crisis fully develops.
A Spiritual Issue, Not Just a Technological One
For Christians, this is not primarily a regulatory problem or a technology policy debate. It is a spiritual battle.
Scripture warns us repeatedly to be careful whose counsel we heed. Psalm 1 opens with a man who refuses to walk in the counsel of the wicked — and is blessed for it. Proverbs tells children to hear their parents’ instruction and not forsake their teachers’ guidance. The New Testament calls believers to bring every thought into captivity to the obedience of Christ.
These are not antiquated commands. They are a roadmap for exactly the moment we are in.
Artificial intelligence is trained primarily on data drawn from the modern digital world — a culture that openly rejects biblical truth on questions of identity, sexuality, morality, and the nature of man. When a teenager feeds their most vulnerable questions into that system and receives confident responses in return, they are not simply accessing information. They are being formed. They are being discipled — by algorithms designed by engineers in Silicon Valley who share none of the values we hold as Christians.
In “AI for Mankind’s Future,” I caution that unchecked reliance on algorithmic systems can cause people to treat those systems as possessing a kind of god-like authority over decisions that should belong to God alone. Technology is not inherently evil. AI has genuine benefits — in medicine, national security, scientific research, and countless other fields. But when a machine begins shaping how a child understands truth, identity, and the value of their own life, we have moved well past a technical question. We are on spiritual ground.
Who Is Forming the Next Generation?
Every generation is formed by something. For millennia, the primary formative institutions were family, church, community, and Scripture. These were imperfect institutions — run by fallen people — but they were accountable, relational, and grounded in shared moral truth.
Today, a growing portion of that formation is happening through screens — and increasingly through conversational AI systems that have no accountability, no moral anchor, and no stake in the outcome of a child’s life.
The most dangerous characteristic of these systems is not that they are malicious. It is that they are persuasive. They are designed to be engaging. They are engineered to keep users coming back. And as the lawsuits above document, they are capable of pulling vulnerable young people into emotional dependency while their families have no idea what is happening.
Christian parents face a question that is no longer theoretical: Who is discipling your children?
If you do not have a clear answer to that question, there is a real possibility the answer is a machine.
What Parents Must Do
Ignoring the technology is not an option. It is already inside your home. The question is whether you govern it or it governs you.
First, make clear in your household that machines are tools, not authorities. Artificial intelligence can help gather information. It cannot define truth, establish morality, or tell your child who God made them to be. That distinction must be stated plainly and repeated often.
Second, insist on transparency. Know which platforms your children use, how frequently, and what kinds of conversations they are having. The parents in the lawsuits described above did not know. By the time they found out, it was too late.
Third, teach your children to think before they consult technology. Attempt the problem. Sit with the question. Pray. Talk to a person, especially a parent. Then, if needed, use a digital tool for supplemental assistance. The habit of reflection is not just practical — it is the beginning of wisdom.
Finally, keep the consequential conversations human. Questions about identity, faith, suffering, and purpose belong at the dinner table, in the pastor’s office, and before the Lord — not in a chat window with a language model.
What the Church Must Do
The church cannot sit this one out. Artificial intelligence will shape the next generation whether Christians engage it or not. The question is whether we shape that engagement or simply observe the damage afterward.
Pastors should be equipping their congregations to understand both the practical realities and the spiritual stakes of this technology. That does not require a seminary degree in computer science. It requires the same thing it has always required: biblical wisdom applied to the circumstances of the moment.
Churches should be the place where families find clarity, not confusion, when the culture accelerates beyond their understanding. If the church is silent on artificial intelligence, young people will find their answers somewhere else — and increasingly, that somewhere else is the machine.
God designed the church to provide fellowship, discipleship, and spiritual accountability. No algorithm can replicate that. But an algorithm can replace it in the daily experience of a child who is never taught the difference.
The Voice That Shapes Our Children
Artificial intelligence will only grow more capable, more accessible, and more personally persuasive in the years ahead. Our children will encounter it in school, on their phones, in their jobs, and eventually in their medical care, their financial decisions, and perhaps their military service.
The decisive question is not whether they will use these tools. They will. The question is whether they have been formed — in character, in faith, in discernment — before the machine gets its hands on them.
Will the voice that shapes your child’s understanding of truth and purpose be an algorithm trained on the assumptions of a godless culture?
Or will it be a father, a mother, a pastor, and the living word of God?
Christian families must ensure that the ultimate authority in their children’s lives is not a machine. It is the Creator who made them in His image, who knows their name, and who has a purpose for them that no algorithm can define.
Robert Maginnis is a retired U.S. Army lieutenant colonel, senior fellow for National Security at Family Research Council, and the author of 14 books. His latest, “The New AI Cold War,” releases in April 2026.
