Why Your Doctor's New "Listening" Tool Matters for Your Health


About This Blog

You are sitting in the doctor's office, anxious about a diagnosis or perhaps just overwhelmed by the sheer volume of information coming your way. You might notice your doctor is not typing as much as they used to. Instead, there is a quiet hum of technology in the room, a digital listener capturing every word. This is not science fiction; it is the new reality of AI in healthcare. But what does this mean for the human connection we crave when we are most vulnerable? I recently sat down with Brian Green, Chief AI Officer and founder of Health-Vision.AI, on the Digital Legacy Podcast to unpack this complex topic.

Brian's journey began decades ago in the heart of the HIV crisis, where he learned that empathy is the most critical tool in medicine. Today, he is applying that same lens to artificial intelligence, advocating for technology that serves humanity rather than replacing it.


The Illusion of Empathy

We have all interacted with chatbots or virtual assistants that seem surprisingly warm. They say "I understand" or "I'm sorry you're going through that." It can feel validating, especially when you are lonely or scared. But Brian reminds us that this is a mirror, not a soul.

AI models like ChatGPT are trained on vast amounts of human text, including countless conversations. They have learned that when a person expresses sadness, the expected response is a comforting phrase. They are mimicking empathy based on patterns, not feeling it. They have no moral compass and no heart.

This distinction is crucial. When you pour your heart out to an AI, it reflects your own words back to you. It can be a helpful tool for processing emotions, but it is not a substitute for a human witness who can truly share your burden.

AI in the Exam Room

One of the most rapidly growing uses of AI is "ambient listening." This technology records and summarizes your doctor's visit, ideally freeing the doctor to look you in the eye instead of staring at a screen.

On the surface, this sounds wonderful. We all want our doctors to be more present. However, Brian points out that we need to ask who this technology is really serving. Is it helping the patient feel heard, or is it just helping the hospital system save money on typing?

We need to ensure that these tools are built with the patient in mind. If an AI summarizes your visit, you should be able to see that summary and correct it. You should know exactly how your data is being used and who has access to it.


The Risk of "Automatic Pilot"

We trust technology to handle complex tasks every day, from flying planes to managing our bank accounts. But healthcare is different. It is messy, emotional, and deeply personal.

Brian used the analogy of an airplane on autopilot. It works great when the skies are clear. But when a storm hits, you need a human pilot who can make split-second decisions based on instinct and training.

In medicine, AI can analyze data faster than any human. It can spot patterns in your heart rate or blood sugar that a doctor might miss. But it should never be the final decision-maker.

A doctor needs to look at the AI's recommendation and say, "Yes, the data says X, but I know this patient, and I know their values, so we are going to do Y."


The Ethical Gap

The technology is moving faster than the rules. We are seeing AI used to make decisions about insurance claims and hospital discharges, often driven by economics rather than ethics.

This is why Brian advocates for "governance first." This means hospitals and tech companies need to set up ethical guardrails before they roll out new tools, not after something goes wrong.

It is similar to how we have review boards for clinical trials. We need human beings sitting in a room, asking hard questions about fairness, privacy, and bias. We cannot just hope the algorithm gets it right.

What You Can Do

It is easy to feel powerless in the face of such massive change. But you have more agency than you think.

  • Ask Questions: When you see a new device or app in your doctor's office, ask what it is doing. Ask how your data is being protected. You have a right to know.

  • Trust Your Gut: If an AI chatbot gives you medical advice that feels off, ignore it. It is a tool, not a doctor. Always verify with a human professional.

  • Demand Human Connection: Technology can be a wonderful assistant, but it should never replace the hand on your shoulder or the listening ear. If you feel like you are being treated by an algorithm, speak up.


A Future with Heart

We are at a crossroads. We can let technology dictate how we care for one another, or we can use technology to enhance our humanity.

Imagine a world where AI handles the paperwork so your doctor can spend an extra ten minutes asking about your family. Imagine a world where data helps catch diseases early, but human compassion guides the treatment.

That is the future Brian is fighting for, and it is a future we can all help build by staying curious, skeptical, and deeply committed to our own humanity.


To hear Brian Green’s full conversation with Niki Weiss, listen to the latest episode of the Digital Legacy Podcast.


Take the Next Step: Start Planning with My Final Playbook





