AI is entering caregiving the way many consequential changes enter ordinary life: quietly, unevenly, with a blend of curiosity, relief, caution, and unresolved tension. For caregivers already carrying emotional weight, practical responsibility, and ongoing uncertainty, the arrival of AI raises a serious question: how can a new tool offer real support while preserving the human presence caregiving depends on?

In episode 142 of A Time to Care, released on April 10, 2026, Isabel Melgarejo welcomed Dr. Jeremy Holloway for a conversation on AI and caregiving that reaches far beyond technology alone. At the heart of the episode is a question that is timely for families, clinicians, and communities alike: where should AI assist, and where should human judgment, intuition, memory, and compassion remain firmly in place?

See the full podcast episode here: https://pod.co/a-time-to-care-the-caregivers-podcast/ai-and-caregiving-staying-human-in-a-digital-world-with-dr-jeremy-holloway

Key takeaways

  • AI can ease mental and administrative strain for caregivers.
  • Human discernment remains central in every meaningful care decision.
  • Presence, trust, context, and moral responsibility still belong to people.
  • The deeper issue is less about what AI can do and more about what caregivers should continue to hold for themselves.

Why this conversation matters right now

Family caregiving now stands as one of the defining realities of American life. Millions of adults are helping aging parents, partners, relatives, or loved ones navigate health systems, medications, appointments, paperwork, memory changes, emotional distress, and daily uncertainty. Many are doing so with very little formal preparation. Many are doing so while balancing work, family life, financial pressure, and fatigue.

Pressure like this makes AI attractive for understandable reasons. A caregiver may want help summarizing medical visits, organizing medication questions, drafting updates to siblings, comparing options, clarifying unfamiliar terms, or turning a pile of scattered notes into something coherent. In those moments, AI can feel like relief.

Relief, though, is only part of the story. A tool that sounds smooth can also sound wiser than it really is. A polished answer can create confidence before trust has been earned. A fast summary can flatten nuance. A convenient suggestion can drift away from the emotional, cultural, relational, and ethical texture of a real family’s life. Caregivers are therefore facing more than a technology question. They are facing a discernment question.

The line that matters most

One of the most powerful ideas in this episode is the invitation to let AI sit beside the caregiver without taking the caregiver’s place. A distinction like this deserves careful attention because caregiving has never been limited to efficiency. Caregiving is relational work. Caregiving involves memory, tone, timing, patience, history, emotional perception, and the ability to sense meaning beneath words.

A caregiver often hears what a chart never captures. A caregiver notices the long pause before an answer, the look that signals fear, the shift in routine that suggests discouragement, or the family tension shaping a decision. Human beings bring context to care in a way no system can fully replicate. Technology may support the work, yet the work itself still lives in relationship.

Age-friendly care points in the same direction. The 4Ms framework begins with What Matters, then moves to Medication, Mentation, and Mobility. Order matters here: care begins with the person, their priorities, their values, and their lived experience. Tools should strengthen that starting point rather than compete with it.

What AI can do well

AI becomes most useful when it helps caregivers create order from overload. It can support preparation. It can help organize information. It can translate dense language into something more accessible. It can reduce clerical drag. It can help a caregiver walk into an appointment with clearer questions and leave with a more usable summary.

A role like this carries real value. In a world where families are stretched thin, any tool that reduces friction and gives people back a measure of clarity deserves serious attention. Used thoughtfully, AI may create more breathing room for listening, noticing, and being present. In that sense, the best version of AI in caregiving supports humanity rather than competes with it.

What must remain human

Even with all of its strengths, AI does not carry moral responsibility. AI does not love your parent. AI does not understand the meaning of a family’s long history. AI does not know which silence is peaceful and which silence signals resignation. AI does not sit at the bedside holding grief and hope at the same time.

Caregiving asks more of a person than task management. It asks for judgment shaped by relationship. It asks for interpretation shaped by memory. It asks for patience shaped by dignity. A caregiver is often making decisions inside ambiguity, and ambiguity is where human wisdom becomes essential.

Human connection also has direct health relevance. Social isolation and loneliness are linked to worse emotional wellbeing, faster cognitive decline, and poorer overall health outcomes for older adults. Presence, therefore, is more than a sentimental ideal. Presence is part of care itself.

Why Dr. Holloway’s perspective carries weight

Dr. Holloway’s contribution in this conversation carries unusual depth because it reflects a larger body of thought he has been developing around human efficacy, communication, and wellbeing. Across his work, one theme continues to rise: human beings must remain active authors of meaning even as technology becomes more powerful. Another theme follows closely beside it: communication is far more than a technical skill. Communication shapes trust, belonging, understanding, and care.

Placed inside caregiving, those ideas become especially powerful. A caregiver’s role depends on more than access to information. A caregiver needs confidence in judgment, steadiness in presence, and language that helps people feel seen rather than processed. Dr. Holloway’s perspective helps move the conversation beyond shallow arguments about whether AI is good or bad. A better question emerges. Does this tool help human beings stay more grounded, more perceptive, and more connected to what matters most?

A practical filter for using AI well in caregiving

Caregivers do not need a polished philosophy before using a tool. Caregivers need a grounded filter. A few questions can help:

  • Does this reduce burden or add another layer of confusion?
  • Does this help me understand my loved one more clearly and more compassionately?
  • Does this protect privacy, dignity, and trust?
  • After using this, do I still feel anchored in my own judgment?

Questions like these may sound simple, yet they reach the center of the issue. Caregiving is interpretive work. Caregiving is ethical work. Caregiving is deeply human work.

The larger message

A false choice often appears in conversations about AI. One side imagines total embrace. Another side imagines total resistance. Caregivers deserve something wiser than either extreme. A more mature path allows technology to handle friction where it genuinely helps, while human beings continue to hold relationship, discernment, context, and moral responsibility.

A future worth building will give caregivers more support without asking them to surrender their voice. A future worth building will use tools to create more room for humanity, more room for careful listening, more room for eye contact, more room for confidence, and more room for the kind of presence people remember long after a clinical interaction ends.

Final reflection

This episode of A Time to Care matters because it reaches a question many people feel yet struggle to articulate. Families are searching for help. Healthcare keeps growing more complex. Technology keeps advancing. Through all of it, one truth remains steady. Care works best when human beings stay fully engaged.

AI may assist. AI may organize. AI may clarify. Human presence still leads.

See the YouTube episode here: https://youtu.be/a-Tw-fS0Rqo

Listen and connect

Listen to the full episode of A Time to Care featuring Dr. Jeremy Holloway, “AI and Caregiving: Staying Human in a Digital World.” Then explore more of Dr. Holloway’s work on human efficacy, communication, social connection, and age-friendly care here on the site. Healthcare organizations, caregiver groups, and professional audiences seeking a keynote, workshop, or training on human-centered communication, caregiving, AI, and health equity are welcome to connect directly through Dr. Holloway’s website.

Jeremy Holloway

Providing expert consulting in cross-cultural communication, burnout elimination, social determinants of health (SDOH), intergenerational program solutions, and social isolation. Helping organizations achieve meaningful impact through tailored strategies and transformative insights.
