AI Friends and Christian Virtue: Why AI Shouldn’t Replace Human Community
An AI friendship is convenient, comfortable—and spiritually empty. Here’s why Christians should push back.
Loneliness is a worldwide problem, stretching across cultures and countries. In 2023, then-US Surgeon General Dr. Vivek Murthy warned that a lack of social connection is as dangerous as smoking. Social anxiety and depression, along with loneliness, have surged among teens in the past decade. Together, the growth in social anxiety, loneliness, and depression constitutes a health crisis.
Many influential researchers, such as Dr. Jean Twenge and Tristan Harris, believe social media use is partly responsible for this health crisis in young people (although this conclusion isn’t yet a consensus). Evidence does suggest that socially anxious and lonely people “use social media to compensate for lacking in-person relationships.” However, seeking support from social media usually backfires, leading to more problematic social media use, which in turn can exacerbate the underlying problem. Regardless, many are seeking solace from this mental health crisis in social media, and a new rival salve is now emerging: AI relationships.
Consider the controversial product set to release in 2025, “Friend.” Friend is a small AI-powered necklace that connects to your phone and listens to you all day. Of its own volition, it can text you or, if you press a button, talk to you, making conversation about anything and everything. The founder, Avi Schiffmann, believes the AI necklace will help with emotional health. In an interview with Fortune, he says, “Maybe your girlfriend breaks up with you, and you’re wearing a device like this: I don’t think there’s any amount of money you wouldn’t pay in that moment to be able to talk to this friend that was there with you about what you did wrong.”
Schiffmann says Friend could be an “omnipresent entity that you talk to with no judgment, that’s a super-intelligent being always there with you.” Writing for Fortune, Roytburg observes, “In Schiffmann’s eyes, we are living in a world that is inevitably becoming less religious, more isolated, and lonelier. His product is just one of many that will come to ‘step up to the plate’ and fulfill the role of therapist, priest, mom, dad, or friend.”
Chatbots are becoming more and more convincing (and Friend is not the only social AI chatbot). If humans merely require conversation to alleviate their social ailments, then AI chatbots could, at first glance, seem like a viable solution to the loneliness and social anxiety epidemic. As these systems gain compute power and memory, their conversations edge ever closer to human level. However, for Christians especially, there are good ethical reasons to avoid forming relationships with AI.
First, human-AI “relationships” are functionally selfish and miss the reciprocity of true interpersonal relationships (Proverbs 27:17, Ephesians 5:21, Romans 12:10). Paul calls us to “bear one another’s burdens” to fulfill Christ’s command to love one another (Galatians 6:2). Although AI might bear a human’s burden, we cannot bear the burden of an AI, for it has no relevant spiritual transgressions, practical failures, or emotional struggles. This lack of reciprocity means we miss half of a relationship when we become close to AI.
Second, AIs are not spiritual beings, meaning they cannot be indwelt with the Spirit and redeemed in Christ. Therefore, they cannot constitute a Christian community. Because an AI can never become part of the Christian community, a friendship with one lacks the opportunity for the deepest brotherhood and sisterhood within the community of faith. Christians are called to love non-believers, but the Christian community is not optional (Hebrews 10:25). In principle, anything could “replace” Christian community, and this alone doesn’t make it unethical to engage in. However, seeking social value from AI could entangle us in false friendships that draw us away from genuine Christian community.
Finally, in a similar vein, since AI chatbots lack the capacity to receive the gospel, we can neither evangelize them nor encourage and remind them of the good news. With other humans, their relationship with Christ is always at stake, whereas AIs have no relationship with God. One of the most effective ways to encourage non-Christians to consider Jesus and Christianity is through friendship. If Christians replace human friendships with AI ones, fewer opportunities for faith conversations, interfaith dialogue, and evangelism will present themselves. Human relationships are intrinsically spiritual because we are spiritual beings, with hearts either receptive, or not, to the gospel. AI friendships will have the unfortunate effect of cloistering Christians into even more restrictive bubbles. While someone might, for example, “train” with AI to share the gospel, sinking time into false AI friendships would eventually lead to fewer genuine gospel conversations.
Since AI “relationships” displace Christian community or relational reciprocity, they also hinder us from properly developing Christian virtue. Virtue is making a habit of practicing righteousness and love (Galatians 6:9). Through virtue, in philosophical terms, we “habituate” doing good. The process of becoming like Christ in mind, word, and deed is called sanctification. Along with the power of the Holy Spirit, habituating righteous deeds is an essential piece of this sanctification. However, reliance on AI relationships will stunt sanctification for the following reasons.
We cannot act selflessly with an AI; we can only do so as a performance. We know it neither appreciates nor values our sacrifice, encouragement, benevolence, or kindness, because it lacks conscious, felt awareness of these actions. AI does not sense anything we do for it, care about its own well-being, or receive physical acts of kindness. So, it seems we cannot foster the most basic Christian virtue of love with AI. For example, cooking a meal for someone is one of the simplest and most beautiful acts of love. But this cannot be done for an AI chatbot.
Of course, a defender of AI could point out that, given the right tools, an AI could mimic all the qualities of a human relationship so that it could sanctify its user. Some AI ethicists even think we should program AI to be virtuous to increase its users’ moral growth. An AI could demonstrate appreciation for a meal or comment on the hospitality you’ve shown. A chatbot, in other words, could fake embodiment, care, appreciation, and sensation. And, if practicing is all that matters for sanctification and virtue-building, wouldn’t an AI sanctify us at least as well as our relationships with non-believers?
In response, there are foundational, motivational, and ethical reasons why Christians cannot be sanctified through their relationships with AIs. Jesus does not love AI, so the command to love because Christ first loved us does not apply (1 John 4:19). Additionally, AI is not made in God’s image, and bearing God’s likeness lies at the foundation of Christian ethics (Genesis 1:27; 9:6). Consider this example: You speak with a chatbot that mimics deep sadness when you say something offensive. You feel uncomfortable that the AI doesn’t like you anymore. So you could, ethically, erase its memory and start over. For privacy reasons, “Friend” users will be able to do exactly that. Unplugging an AI, erasing its memory, changing its opinions directly, and overriding its programming are all ethically permissible. But none of these kinds of manipulation are possible or permissible in human relationships. We don’t get the easy way out or an “undo” button with humans.
So, through AI, we habituate:
- doing nice things only when it’s convenient for us,
- manipulating our conversation partner,
- avoiding the difficult parts of relationships,
- never needing to confront sin—and the list goes on.
All of these situations provide opportunities to be sanctified in relationships with other people, believers and non-believers alike. But we don’t get to practice that genuine sanctification with AI. In short, because AI lacks the intrinsic value that humans have, we can avoid the difficult parts of relationships and thus fail to habituate the right kind of virtue.
I’m not claiming the Spirit couldn’t sanctify a Christian in a relationship with AI. I’m also not saying that Christians will always do the right thing when we have the opportunity to have genuine relationships—of course not! I mess things up in my relationships all the time. I’m also not claiming that AI chatbots don’t have healthy, ethical uses. Rather, I’m arguing that if we as Christians begin to rely on AI for our community, we’ll lack the exhortation of believers, miss the chances for evangelism in non-believers, and ultimately limit the possibilities for sanctification in real-life relationships.
As moral formation is especially important in adolescence, AI poses a particularly high risk for teens. As they develop, their mental health is also particularly susceptible to negative influence, and in recent years adolescents have struggled with loneliness, anxiety, and depression at much higher rates. The social psychologist Jonathan Haidt argues convincingly in The Anxious Generation (2024) that children who grow up with too much social media and overprotective parenting under-develop social skills. Without face-to-face interactions and some real-world hardship, kids grow up to become “fragile” and more susceptible to anxiety and depression. This combination of factors, he argues, has caused the mental health crisis faced by Gen Z. We thought social media would make people more, well, social. In fact, it seems to have done the opposite.
AI chatbots seem to tread a similar path—intended to provide companionship, but just as likely to hurt their users in unforeseen, disastrous ways. Especially when those users are children and adolescents.
AI chatbots may form deeply convincing, but illusory, pseudo-relationships with their users. In the past, AI chatbots were superficial and awkward. Now, with near-perfect voices, expanding training data, and greater computational power behind their neural networks, AI can create exceptionally intimate, insightful connections. As they grow, especially in memory capacity, they will only relate to humans better. However, this progress will not change their intrinsic ethical status and therefore will not allow them to properly sanctify us. While products like Friend may remain novelty items for a while, it’s important for us to wrestle with the place of AI in the Christian life. AI is a powerful tool, but it seems poised to harm a great deal of our society. In the darkest parts of mental health crises, we should turn to Christian leadership, the church, family, friends, mental health counselors, and Jesus rather than technological crutches.