  • Never enter PHI into public AI tools like ChatGPT — they’re not HIPAA-compliant.
  • Require a signed BAA from any AI vendor that handles patient information.
  • Use AI for research only; always have coders validate AI-suggested codes for accuracy.

While cybersecurity might not be at the forefront of your priorities as you manage denials, onboard clients, and oversee staff, it becomes increasingly critical as your medical coding and billing teams begin implementing artificial intelligence (AI) solutions.

A recent Gallup survey found that about 12% of employed US adults use AI (including chatbots like ChatGPT) daily at work, and roughly 25% use it at least several times per week. Increasingly, medical coding and billing tools also incorporate some form of AI natively.

When it comes to protecting patient data, multi-factor authentication, phishing training, and HIPAA policies are critical, but with AI, practices must take it a step further to mitigate risk, said healthcare consultant Aimee Heckman during a recent Tebra webinar, Getting paid: How to capture every dollar in 2026. “When it comes to AI and HIPAA, we need to proceed with caution, especially for anyone experimenting with AI tools like ChatGPT, Grok, or Gemini,” she warned. 

Read on for expert advice and best practices regarding how to use AI in medical coding and billing.

Ensuring HIPAA compliance in a new world of AI

During the webinar, Heckman provided important questions to ask before trusting AI vendors and also explained how to use AI in medical coding and billing without violating HIPAA. Following are 4 of her recommendations:

1. Never enter protected health information (PHI) into public platforms

PHI includes things like names, medical record numbers, dates of service, dates of birth, insurance information, and any other information that someone could use to reasonably identify a patient. “If you’re exploring what [public AI platforms] can do, only use de-identified data,” she said.

"If you’re exploring what [public AI platforms] can do, only use de-identified data."
Aimee Heckman
Healthcare business consultant
Aimee Heckman for Tebra's The Intake

When thinking about how to use AI in medical coding and billing, best practice is to rewrite notes as hypotheticals or remove all identifiers before pasting anything into standard ChatGPT and similar public tools.

Standard ChatGPT is not HIPAA-compliant, meaning anything billers enter could be stored or processed in ways that are not permitted under HIPAA. There’s also no legal protection for PHI once a biller enters it into the chatbot, and practices could be liable for improper disclosure.  
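That de-identification step can be partially automated before any text leaves the practice. The script below is a minimal, illustrative sketch only, not a substitute for full HIPAA Safe Harbor de-identification, which covers 18 identifier categories, including free-text names that simple patterns cannot reliably catch; the patterns and placeholder labels are assumptions for illustration:

```python
import re

# Illustrative patterns only -- real de-identification must cover all 18
# HIPAA Safe Harbor identifier categories, including free-text names.
PATTERNS = {
    "DATE": re.compile(r"\b\d{1,2}/\d{1,2}/\d{2,4}\b"),
    "PHONE": re.compile(r"\b\d{3}[-.]\d{3}[-.]\d{4}\b"),
    "SSN": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "MRN": re.compile(r"\bMRN[:#]?\s*\d+\b", re.IGNORECASE),
}

def scrub(note: str) -> str:
    """Replace obvious identifiers with bracketed placeholders."""
    for label, pattern in PATTERNS.items():
        note = pattern.sub(f"[{label}]", note)
    return note

print(scrub("Pt seen 03/14/2025, MRN: 884213, callback 401-555-0123."))
# -> Pt seen [DATE], [MRN], callback [PHONE].
```

Even with a scrub step like this, a human should review the output before it goes anywhere near a public AI tool, since identifiers in unexpected formats will slip through.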


2. Insist on a business associate agreement (BAA) for any vendor using AI

A BAA is a necessary HIPAA contract between a healthcare organization and any vendor handling PHI. This includes vendors providing AI tools for coding, documentation, chatbots, analytics, or automation. Once the BAA is signed, the AI vendor is legally obligated to safeguard PHI, and the vendor can be directly liable for HIPAA violations.

“If a vendor doesn’t include a BAA in their terms or license agreement, that’s a red flag,” said Heckman. “You need partners who understand healthcare compliance and not just AI.”

More specifically, the BAA should clearly define PHI and its permitted uses as well as unauthorized secondary uses of PHI. For example, providers may want to ensure the BAA specifies that the vendor cannot: 

  • Retain PHI after the task is complete
  • Use PHI to improve AI models
  • Commingle PHI with other customer data, or 
  • Store PHI in logs

3. Use AI for research, not decision-making

When thinking about how to use AI in medical coding and billing, Heckman provided this advice: “AI tools can be useful for research like staying current on regulatory changes or market trends, but don’t rely on them to make coding or billing decisions. The technology is still learning, and it can misinterpret context.”

"AI tools can be useful for research like staying current on regulatory changes or market trends, but don’t rely on them to make coding or billing decisions."
Aimee Heckman
Healthcare business consultant
Aimee Heckman for Tebra's The Intake

For example, AI coding tools might treat diagnostic mentions as confirmed diagnoses or overlook qualifiers, such as:

  • Concern for 
  • History of
  • Likely
  • Rule out
  • Versus
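A lightweight pre-submission check could flag notes containing these qualifier phrases so a human coder reviews them before trusting any AI-suggested code. This sketch is purely illustrative and not from any vendor's product; real workflows would use more sophisticated negation and uncertainty detection:

```python
# Qualifier phrases an AI coder might misread as confirmed diagnoses.
QUALIFIERS = ["concern for", "history of", "likely", "rule out", "versus"]

def flag_qualifiers(note: str) -> list[str]:
    """Return the qualifier phrases found in a clinical note."""
    text = note.lower()
    return [q for q in QUALIFIERS if q in text]

note = "Assessment: rule out pneumonia; history of asthma."
hits = flag_qualifiers(note)
if hits:
    print(f"Review before coding -- qualifiers found: {hits}")
# -> Review before coding -- qualifiers found: ['history of', 'rule out']
```

A check like this doesn't decide anything on its own; it simply routes ambiguous documentation to a coder, consistent with keeping AI in a research and support role.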

4. Ensure risk assessments and policies are up to date

“If your BAA hasn’t been updated in 10 years, now’s the time to do that,” said Heckman. 

For example, a coding or billing vendor might introduce AI features after contract execution. In that case, it’s important to update the BAA to ensure compliance with the HIPAA Security Rule.

Similarly, performing regular security risk assessments (at least annually) is critical as well. “If it’s the first time you’ve done it, it may be worth having a professional help you out just to make sure you get a baseline, and then each year, it’s just updating what might have changed,” she added.

For example, with AI coding tools, an updated risk assessment might need to cover:

  • Validating new data flow maps
  • Reviewing access controls, and
  • Confirming role-based permissions

Find out how Tebra's medical billing software helps you remove the most tedious and error-prone parts from your billing workflow — without sacrificing accuracy. Book a free, personalized demo today.

Frequently asked questions

Can coders enter PHI into public AI tools like ChatGPT?
No. Coders should never paste PHI into public or consumer tools. They should only enter information if:
  • The practice has approved the tool
  • A signed BAA is in place, and
  • Coders access it through secure, authorized systems

Should coders validate AI-suggested codes?
Yes, coders should validate these codes for accuracy and compliance.

What should coders do when an AI tool suggests an incorrect code?
Coders should flag these cases and provide feedback to the AI vendor. Coders must ensure accuracy prior to claim submission.

What if an AI vendor refuses to sign a BAA?
Refusal to sign a BAA is a major red flag. Unless the tool never creates, receives, maintains, or transmits PHI, a BAA is critical.

Are there free tools for conducting a security risk assessment?
Yes, practices can download ONC’s free security risk assessment tool, which is designed to help small and medium providers conduct a risk assessment on their own.


Written by

Lisa Eramo, freelance healthcare writer

Lisa A. Eramo, BA, MA is a freelance writer specializing in health information management, medical coding, and regulatory topics. She began her healthcare career as a referral specialist for a well-known cancer center. Lisa went on to work for several years at a healthcare publishing company. She regularly contributes to healthcare publications, websites, and blogs, including the AHIMA Journal. Her focus areas are medical coding (ICD-10 in particular), clinical documentation improvement, and healthcare quality/efficiency.
