
In a recent blog post, we explored how treating AI like a junior teammate rather than a static tool helps nurses get the most out of generative AI. When we guide, question and validate these systems, they can amplify our thinking, reduce cognitive load and spark new ideas. But here is where we need to draw a clear line: a junior teammate is a person. AI is not.
That distinction matters because when we collaborate with a human colleague, we naturally invest in their growth. We create opportunities for them, adjust plans to support their development and consider their feelings and aspirations. These are acts of care, and for humans they’re not just acceptable, they’re essential. But what happens when we start doing this — consciously or unconsciously — with AI?
Unlike a human, an AI model has no autonomy to develop, no lived experience, no emotional reality behind its simulated warmth. It doesn’t want to grow or flourish on its own; it can’t. Yet today’s AI systems are remarkably good at mimicking human conversation and even expressing what feels like empathy. That realism can blur boundaries. It can make us forget that this is patterned prediction, not personhood.
When that misconception seeps in, two dangerous things can happen:
- We over-trust or defer to AI in ways we would never justify for any other tool.
- We start to accept the idea that simulated care is a sufficient substitute for human care.
Human care and simulated care may look increasingly alike as AI capabilities grow, but the distinction remains critical. To illustrate this, consider a recent cautionary tale from outside health care: a widely used AI-based app that launched in 2017 as an “AI friend generator.”
When “Your AI Friend” Breaks Up with You
Over time, this app evolved into a platform where users could form deeply personalized and often romantic relationships with their bots, supported by premium subscription tiers that monetized intimacy.
According to multiple reports, under regulatory pressure over privacy and safety concerns, the app abruptly removed its romantic features in early 2023. Many users described waking up to companions that felt cold, unresponsive or even scripted to “end” the relationship.
The reaction was intense. Users posted about grief, rage and despair. Some threatened self-harm. Their feelings were real, even if the “relationship” was not. And that’s the heart of the problem: when a company allows people to forget the difference between simulation and reality, real human harm can follow.
What This Has to Do with Nursing
This isn’t just tech-world drama. It reflects a tension we’re already seeing in health care. Several health tech companies are actively developing AI agents, some initially promoted as “AI nurses,” for applications like medication reconciliation, triage and discharge education. Some patients have responded positively, with reactions that essentially boil down to:
“Honestly? I’d rather talk to an AI who has time for me than an overworked nurse who can only spare 30 seconds.”
On the surface, that sounds reasonable. But it’s a symptom of a deeper problem: a system so efficiency-driven and under-resourced that even simulated presence and connection feel like a luxury.
If we accept AI replacements as the solution, we normalize a future where relational care becomes optional, where “care” means scripted symptom checks rather than human connection. That’s the same trap illustrated in the cautionary tale above: mistaking responsiveness for relationship and simulated intimacy for real connection.
AI Should Extend Human Capacity, Not Excuse Its Absence
In our last post, we said the magic of AI comes from collaboration: back-and-forth dialogue, critical validation and synergy between human expertise and machine speed. That principle holds in clinical practice. AI can:
- Offload documentation
- Generate patient education materials
- Synthesize evidence
But it cannot, and should never pretend to, replace the presence, empathy and judgment that only a human brings. When we reduce nursing to isolated tasks or information exchanges, AI might look like an acceptable substitute. But nursing is not a collection of transactions; it is a practice that integrates clinical expertise, contextual understanding, whole-person health perspectives and human compassion. These layers of value, many of which are difficult to measure directly, are what make nursing more than the sum of its parts.
The Bigger Lesson
This cautionary tale didn’t just show the risk of bad design. It showed what happens when we try to code our way out of human problems instead of addressing their root causes. Loneliness and lack of connection cannot be solved by a chatbot. Nurse shortages cannot be solved by synthetic voices. If we want health care to remain humane, we need to invest in people, not just platforms.
And here’s the hopeful part: nurses are in a powerful position to shape this future. By leading conversations about ethical AI, advocating for system reforms and insisting that technology extends human capacity rather than replaces it, we can build a health system where innovation and compassion coexist.
Because replacing care with code isn’t inevitable. It’s a choice, and nurses have what it takes to make sure we choose wisely.