The Human Touch in an AI World: Why I'm More Excited Than Scared


Over dinner last week, I was telling my wife about a patient I’d taken to the OR for a uterine biopsy. It’s a procedure I don’t do as often as I did during residency, so I didn’t have my standard postoperative instructions ready.

“I used OpenEvidence to generate evidence-based discharge instructions,” I told her. “It took less than two minutes. Between that, Doximity’s AI tools, and DAX for ambient scribing, there’s no telling how much time I save in a day compared to a year ago.”

She nodded, sharing her own experience with DAX in her reproductive endocrinology practice. Then we started talking about what’s coming next. Ambient ordering in the next couple of months. All the possibilities that might follow.

We’re both physicians watching AI transform our daily work in real time. I’m thrilled about these developments, and at the same time I sometimes wonder, a little uneasily, what they mean for the future of medicine. Both feelings are shaping how I think about where our profession is headed.

The Changing Landscape

AI is transforming healthcare faster than most of us anticipated. Diagnostic algorithms are catching things we might miss. Administrative tasks that used to consume hours can now be completed in minutes. The tools I use regularly are genuinely improving patient care while giving me time back.

But they’re also changing what it means to be a physician. Tasks that once required extensive training can now be automated. The question isn’t whether AI will continue advancing but how we evolve alongside it while preserving what matters most about medicine.

My Human-First Response

Instead of viewing AI as a threat, I’m seeing it as an opportunity to return to what drew me to medicine originally: the profound connections that happen when someone trusts you with their most vulnerable moments.

AI can analyze symptoms and suggest treatments, but it can’t be present in the operating room when I hold someone’s hand as they fall asleep for surgery. It can’t look at a patient and say, “I’m glad you’re here. I think we can make this better.”

This perspective is reshaping how I want to practice. I’m trying to double down on the irreplaceably human aspects of medicine:

Practicing narrative medicine. Especially with my older patients, I’m working to make time to ask: Who are you as a person? Where have you been? What are you proud of? What scares you? These conversations often reveal crucial context that influences treatment decisions.

Embracing shared decision-making. The treatments I offer are rarely one-size-fits-all. What someone chooses needs to align with their values, their tolerance for risk, their life circumstances. This requires deep conversation and genuine partnership.

Being present for pivotal moments. Just because I perform surgery every day doesn’t mean it’s routine for my patients. For them, it’s often a defining moment. That requires a different kind of attention and care.

Developing clinical intuition. While AI excels at pattern recognition, human intuition built through years of patient interaction remains irreplaceable: the sense that something isn’t quite right, the ability to read what someone isn’t saying, the experience to know how to adapt standard protocols to unique situations.

What I’m Trying to Do Differently

Practically, this human-first approach means I’m attempting to change how I spend my time:

With AI handling documentation and routine tasks, I’m trying to have more bandwidth for meaningful patient interactions. I want to focus on the person in front of me rather than the computer screen.

I’m trying to ask different questions during consultations. Not just about symptoms, but about hopes, fears, and what “better” looks like for each individual.

I’m working to take more time explaining not just what we’re going to do, but why, and what the alternatives are. Real informed consent requires genuine conversation.

I’m trying to pay attention to the moments that matter most. When someone is nervous before surgery, when they’re processing difficult news, when they’re trying to make a complex decision about their care.

Finding Your Medical Edge

If you’re wondering how to position yourself in an AI-enhanced healthcare landscape, consider:

Where do you provide value through relationship and judgment rather than just clinical knowledge?

What aspects of patient care require emotional intelligence and human connection?

How can you create more space for the conversations that really matter?

What parts of your practice depend on understanding someone as a whole person, not just a set of symptoms?

The Opportunity Ahead

What excites me most is AI’s potential to free us from administrative burdens so we can focus on what we trained for: taking care of people.

Imagine spending less time on documentation and routine tasks, and more time listening, explaining, comforting, and healing. Imagine having the cognitive space to really think about each patient’s unique situation.

Healthcare will keep evolving, and we’ll need to adapt. But I believe the future belongs to physicians who can blend technological tools with genuine human connection.

As a urogynecologist, my goal isn’t long-term relationships with patients. I want them to get better and never think about me again, except maybe when they’re telling friends how treatment gave them back their freedom and independence. But the path to that outcome still requires everything that makes us human: listening, understanding, and caring about their individual story.

AI won’t replace what happens between two people when one trusts the other with their health and vulnerability. If anything, it might make those moments more precious.

That’s why I’m more excited than scared.


How are you thinking about AI in your practice? I’d love to hear from other physicians navigating these changes.


Copyright © 2016-2025 Ryan Stewart, DO.
The information provided is for educational purposes only and should not be considered medical advice. Always consult with a qualified healthcare professional for personalized medical guidance.