AI in Health Care: Saving Time to Focus on What Matters Most


By Tamara G. R. Macieira, PhD, RN, and Bryce Catarelli, DNP, APRN, FNP-C

Artificial intelligence (AI) is rapidly transforming how care is delivered and experienced. While AI holds tremendous potential to reduce documentation burden and enhance patient-centered care, its integration requires careful consideration of risks such as bias, misinformation and impacts on trust. Striking the right balance may allow health care professionals to use AI to “buy back time” and refocus on what matters most — safe, evidence-based, human-centered care.

In this month’s FloGatorAI blog, I’m joined by my colleague Dr. Bryce Catarelli, UF Nursing clinical assistant professor and family nurse practitioner, who brings a vital frontline perspective to the conversation about AI in health care. Together, we explore two sides of the same transformation: how AI may help clinicians reclaim time through tools like ambient documentation, and how patients are increasingly bringing AI-generated insights into the exam room.

As AI becomes embedded in both clinical workflows and patient decision-making, a critical question emerges: How should health care professionals use AI responsibly, and how will AI reshape the role, judgment and trust placed in those clinicians from the patient’s perspective?

The Workflow: Reclaiming Time With Ambient Listening

Documentation burden remains a major driver of clinician burnout, with many clinicians spending hours after work charting in electronic health records. Ambient AI scribes offer a promising solution by passively capturing clinical conversations and generating structured notes, potentially reducing time spent typing and clicking.

Clinicians have already started to embrace ambient AI documentation. Through this potential “time gained back” from less hands-on documentation, clinicians may be able to maintain greater eye contact with patients, allow conversations to flow naturally and conduct more thorough assessments. They are no longer tethered to the keyboard or concerned about missing key details. This shift could lessen cognitive burden and allow clinicians to rediscover the joy in their work: building relationships with their patients and improving their health.

If implemented thoughtfully, such AI workflow improvements may contribute to reduced burnout, expanded access to care by allowing more patients to be seen and, ultimately, improved retention of health care professionals. However, a major hurdle is automation bias, or the cognitive tendency to overtrust automated systems and accept their work without sufficient critical review.

After a long workday, it may be tempting to accept an AI-generated note that appears polished and complete. The bigger risk is not just a documentation error, but a subtle cognitive shift from active clinical reasoning to passive verification. In health care, failing to notice errors or built-in bias is unacceptable. While AI may capture visit details effectively, it should never replace clinician judgment or drive clinical decision-making. To mitigate these risks, health care professionals need clear policies, training to recognize potential bias, reinforcement of clinician accountability and workflow designs that keep them cognitively engaged — which leads to the second, equally important component of this conversation.

The Patient: From ChatGPT to the Clinic — Trusting Clinical Expertise

Patients are already using AI. Before a medical appointment, they often enter their symptoms into an AI platform to seek a diagnosis, medication recommendations or an interpretation of lab results. Historically, it was common to hear, “I don’t know, you’re the doctor. Whatever you say,” but now many patients arrive with conclusions and requests in hand before vital signs are even taken.

In many ways, this shift can be empowering, encouraging patients to engage more actively in their care. Yet the information they receive from AI may also be incomplete or misleading. While some AI-generated insights may be accurate, others can miss critical nuance. We should always remember that nursing and medicine are both a science and an art. Even when symptoms appear similar, treatment decisions must be individualized based on each patient’s unique context and underlying conditions.

Here is where reclaimed time becomes powerful.

If ambient AI reduces documentation burden, clinicians may be able to redirect that time and cognitive energy toward deeper listening, clearer explanations and more meaningful patient engagement. Rather than focusing on data entry, providers can strengthen their role as trusted advisers and advocates, helping patients interpret information, clarify nuance and translate complex insights into personalized clinical guidance.

If technology can assume more of the documentation burden, clinicians can focus where they are most needed — at the patient’s side, navigating complexity, validating truth and strengthening trust in the exam room. Used wisely, AI has the potential to buy back time — and, in health care, time is more than efficiency. It is attention, listening and discernment.

In an environment saturated with data, the scarcest resource is not information, but accountable human expertise — reminding us that the true promise of AI in saving time to focus on what matters most lies in protecting and prioritizing that human connection.