When AI becomes indispensable: How ChatGPT is reshaping healthcare workflows and mental health


Overview

  • 3 in 5 healthcare employees report ChatGPT has helped reduce burnout by streamlining tasks like documentation, note-taking, and patient communication.
  • While AI use yields more positive than negative results, 14% of healthcare employees have felt emotionally dependent on an AI tool like ChatGPT (i.e., checking it impulsively, using it to cope, or feeling anxiety without access).
  • 13% say an AI tool outage is more stressful than an electronic health records system crash or a last-minute shift change, and 12% consider it more disruptive than a colleague calling out sick.
  • 42% are aware of OpenAI’s new mental health guardrails designed to prevent over-reliance or harm, but only 41% of them feel they go far enough to protect emotionally vulnerable users.

AI is changing the face of healthcare, not just operationally, but emotionally. Tebra surveyed 301 healthcare professionals to understand how deeply tools like ChatGPT have become embedded in daily clinical work. What emerged was a picture of growing reliance, hidden emotional ties, and an evolving digital ecosystem that's both helpful and, at times, unsettling. For private practices, these insights reveal both the promise and potential pitfalls of AI-powered care.

The emotional side of efficiency: AI becomes a fixture in healthcare routines

ChatGPT and similar tools are becoming part of the everyday rhythm in healthcare. What started as a time-saver is now something many providers and staff rely on to get through the day.

Infographic titled “How often healthcare employees use AI at work” with a circular chart showing usage: 33% daily, 33% weekly, 7% monthly, 9% less than monthly, and 18% never. Additional stats: 35% of Gen Z healthcare employees use AI daily, 39% of healthcare administrators use AI daily, and 44% of healthcare employees working at a private practice use AI daily. Source: Tebra Study.
  • 33% of healthcare employees use AI daily as part of their workflow; another 33% use it weekly.
    • 35% of Gen Z healthcare employees use AI daily; 75% use it daily or weekly.
    • 39% of healthcare administrators use AI daily; 75% use it daily or weekly.
    • 44% of healthcare employees working at a private practice use AI daily.

This shift means rethinking how work gets done in your practice. As AI tools become part of the daily routine, practices can support staff by offering guidance on when and how to use them effectively. Private practices can integrate AI into existing healthcare management systems, from clinical documentation and appointment scheduling software to patient communication platforms and medical billing workflows, to improve efficiency without adding complexity. The key is making sure these tools work in the service of care, not in place of it.

Infographic titled “AI burnout or AI breakthrough?” with key insights: 56% of healthcare employees say AI tools lighten their workload, 33% feel they’re both helpful and burdensome, and 3 in 5 say ChatGPT has reduced burnout. 14% admit feeling emotionally dependent on AI. Bar graph shows: 47% want AI tools to flag emotional distress, 47% have used ChatGPT for emotional processing, 32% trust AI to recognize distress, and 27% feel AI understands them better than humans. Also, 27% have deleted personal input from ChatGPT due to judgment/privacy concerns. Source: Tebra Study.
  • 56% of healthcare employees say AI tools lighten their workload, while 33% feel they're a mix of helpful and burdensome.
  • 3 in 5 report ChatGPT has reduced burnout by streamlining tasks like documentation, note-taking, and patient communication.
  • 14% admit feeling emotionally dependent on AI tools: checking them compulsively, relying on them to cope, or feeling anxious without access.
  • 47% want to be notified if an AI tool flags their input as emotionally distressed or mentally unwell.
  • 47% have used ChatGPT (or similar) for emotional processing.
  • 27% have deleted something emotional or personal from ChatGPT out of fear of judgment or privacy concerns.

AI becomes essential to the workflow: Nearly 3 in 10 healthcare employees now say tools like ChatGPT are indispensable in their daily clinical work.

Efficiency takes center stage: About one-third report AI makes them more efficient, with another quarter saying it boosts both efficiency and overall effectiveness.

Peer-to-peer adoption grows: 1 in 3 have already recommended AI tools like ChatGPT to colleagues, not just for clinical tasks but also for coping with stress or streamlining admin work.

Concerns beneath the surface: More than half believe AI tools risk masking deeper systemic issues in healthcare, including staff shortages, admin overload, and gaps in mental health support.

Guardrails spark debate: 42% are aware of new AI guardrails meant to protect mental health, but only 41% of those aware feel the safeguards go far enough to support emotionally vulnerable users.

AI offers real potential to ease stress and improve workflows, but it's not a one-size-fits-all solution for every private practice. Some staff may find relief in automation, while others could feel uneasy about how much they rely on it. Practices can support healthy use by creating space to talk about the emotional impact of these tools, sharing clear usage guidelines, and reinforcing that AI is there to assist and not replace the human side of care.

What happens when AI goes down? Stress and uncertainty ripple through care teams

As AI becomes more embedded in clinical workflows, even a brief outage can cause disruption. For many healthcare teams, losing access to these tools means more than lost time. It creates stress, uncertainty, and a sudden scramble to adjust.

Infographic titled “AI outage anxiety” showing how stressful outages are for healthcare employees: 22% not stressful, 57% mildly annoying, 17% moderately stressful, and 5% extremely stressful. Additional findings: 1 in 8 employees (13%) say an AI outage is more stressful than an EHR crash or shift change, and 12% say it’s more disruptive than a colleague calling in sick. Source: Tebra Study.
  • Nearly 1 in 4 healthcare employees (23%) have experienced a disruption with an AI tool that impacted their clinical workflow; another 35% haven't experienced one but worry that it could happen.
  • 22% find it at least moderately stressful when their AI tool isn't working as expected.
  • 13% say an AI outage is more stressful than an EHR crash or last-minute shift change.
  • 12% say an outage is more disruptive than a colleague calling out sick.
  • Only 27% are aware of a documented backup plan if their AI tools go down.

For private medical practices, this highlights the importance of treating AI tools like other critical healthcare technology infrastructure, similar to how practices prepare for EHR system downtime or patient scheduling software outages. Having a simple backup plan in place and making sure the entire team knows what it is can prevent confusion and minimize downtime. Whether that means switching to manual processes or having alternative tools ready, a little preparation can go a long way in keeping care consistent when technology fails.

Conclusion: Building a smarter, safer approach to AI at work

AI is becoming a regular part of clinical workflows, offering real benefits in efficiency and burnout reduction. But with that growing dependence comes new responsibilities for private practices. Creating clear guidelines for use, checking in on the emotional impact, and preparing for potential outages can help teams get the most out of these tools without becoming overly reliant. With the right support, AI can enhance patient care delivery and practice operations while keeping healthcare providers and patient relationships at the center of independent practice success.

Methodology

For this study, we surveyed 301 healthcare employees about their AI usage and how much it has become a part of their workflow. Among respondents, 6% identified as baby boomers, 19% as Gen X, 57% as millennials, and 18% as Gen Z.

About Tebra

Tebra, headquartered in Southern California, empowers independent healthcare practices with cutting-edge AI and automation to drive growth, streamline care, and boost efficiency. Our all-in-one EHR and billing platform delivers everything you need to attract and engage your patients, including online scheduling, reputation management, and digital communications.

Inspired by "vertebrae," our name embodies our mission to be the backbone of healthcare success. With over 165,000 providers and 190 million patient records, Tebra is redefining healthcare through innovation and a commitment to customer success. We're not just optimizing operations — we're ensuring private practices thrive.

Fair use statement

If you'd like to share or reference these insights, you're welcome to do so for noncommercial purposes. Please include a link back to Tebra with proper attribution.

Written by

Jean Lee, managing editor at The Intake

Jean Lee is a content expert with a background in journalism and marketing, driven by a passion for storytelling that inspires and informs. As the managing editor of The Intake, she is committed to supporting independent practices with content, insights, and resources tailored to help them navigate challenges and succeed in today’s evolving healthcare landscape.

Reviewed by

Andrea Curry, head of editorial at The Intake

Andrea Curry is an award-winning journalist with over 15 years of storytelling under her belt. She has won multiple awards for her work and is now the head of editorial at The Intake, where she puts her passion for helping independent healthcare practices into action.
