Cybersecurity may not top your list of priorities as you manage denials, onboard clients, and oversee staff, but it becomes critical as your medical coding and billing teams begin implementing artificial intelligence (AI) solutions.
A recent Gallup survey found that about 12% of employed US adults use AI (including chatbots like ChatGPT) daily at work, and roughly 25% use it at least several times per week. Increasingly, medical coding and billing tools also have some form of AI built in.
When it comes to protecting patient data, multi-factor authentication, phishing training, and HIPAA policies are critical, but with AI, practices must take it a step further to mitigate risk, said healthcare consultant Aimee Heckman during a recent Tebra webinar, Getting paid: How to capture every dollar in 2026. “When it comes to AI and HIPAA, we need to proceed with caution, especially for anyone experimenting with AI tools like ChatGPT, Grok, or Gemini,” she warned.
Read on for expert advice and best practices regarding how to use AI in medical coding and billing.
Ensuring HIPAA compliance in a new world of AI
During the webinar, Heckman provided important questions to ask before trusting AI vendors and also explained how to use AI in medical coding and billing without violating HIPAA. Following are 4 of her recommendations:
1. Never enter protected health information (PHI) into public platforms
PHI includes names, medical record numbers, dates of service, dates of birth, insurance information, and any other information that could reasonably be used to identify a patient. “If you’re exploring what [public AI platforms] can do, only use de-identified data,” she said.
"If you’re exploring what [public AI platforms] can do, only use de-identified data."

When thinking about how to use AI in medical coding and billing, best practice is to rewrite notes as hypotheticals or remove all identifiers before pasting anything into standard ChatGPT and similar public tools.
Standard ChatGPT is not HIPAA-compliant, meaning anything billers enter could be stored or processed in ways that are not permitted under HIPAA. There’s also no legal protection for PHI once a biller enters it into the chatbot, and practices could be liable for improper disclosure.
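If your team is comfortable with light scripting, a pre-submission scrubbing step can act as an extra guardrail before any text reaches a public tool. The following Python sketch is a hypothetical illustration (the pattern list and redact_phi function are assumptions for this example, not a vetted de-identification solution); pattern matching alone cannot catch all 18 HIPAA identifier categories, so human review is still required.

```python
import re

# Hypothetical patterns for a few common identifiers. HIPAA defines 18
# identifier categories; patterns like these cover only a fraction of them.
PHI_PATTERNS = {
    "MRN": re.compile(r"\bMRN[:#]?\s*\d+\b", re.IGNORECASE),
    "DATE": re.compile(r"\b\d{1,2}/\d{1,2}/\d{2,4}\b"),
    "PHONE": re.compile(r"\b\d{3}[-.]\d{3}[-.]\d{4}\b"),
    "SSN": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
}

def redact_phi(text: str) -> str:
    """Replace matched identifiers with placeholders before text leaves
    the practice. A human must still review the result for missed PHI."""
    for label, pattern in PHI_PATTERNS.items():
        text = pattern.sub(f"[{label} REMOVED]", text)
    return text

note = "Pt seen 03/14/2025, MRN: 483920, callback 555-867-5309."
print(redact_phi(note))
# Prints: Pt seen [DATE REMOVED], [MRN REMOVED], callback [PHONE REMOVED].
```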
2. Insist on a business associate agreement (BAA) for any vendor using AI
A BAA is a HIPAA-required contract between a healthcare organization and any vendor that handles PHI, including vendors providing AI tools for coding, documentation, chatbots, analytics, or automation. Once the BAA is signed, the AI vendor is legally obligated to safeguard PHI and can be held directly liable for HIPAA violations.
“If a vendor doesn’t include a BAA in their terms or license agreement, that’s a red flag,” said Heckman. “You need partners who understand healthcare compliance and not just AI.”
More specifically, the BAA should clearly define PHI and its permitted uses, and it should prohibit unauthorized secondary uses. For example, providers may want to ensure the BAA specifies that the vendor cannot:
- Retain PHI after the task is complete
- Use PHI to improve AI models
- Commingle PHI with other customer data, or
- Store PHI in logs
3. Use AI for research, not decision-making
When thinking about how to use AI in medical coding and billing, Heckman provided this advice: “AI tools can be useful for research like staying current on regulatory changes or market trends, but don’t rely on them to make coding or billing decisions. The technology is still learning, and it can misinterpret context.”
"AI tools can be useful for research like staying current on regulatory changes or market trends, but don’t rely on them to make coding or billing decisions."

For example, AI coding tools might treat diagnostic mentions as confirmed diagnoses or overlook qualifiers such as the following (see the sketch after this list):
- Concern for
- History of
- Likely
- Rule out
- Versus
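As an added safeguard, a billing team could script a quick pre-check that flags notes containing these hedging phrases so a human coder reviews them before accepting any AI-suggested code. Here is a minimal Python sketch (the flag_hedged_diagnoses function and simple substring matching are illustrative assumptions, not a feature of any particular tool):

```python
# Qualifiers that signal a diagnosis is mentioned but not confirmed.
HEDGING_PHRASES = ["concern for", "history of", "likely", "rule out", "versus"]

def flag_hedged_diagnoses(note: str) -> list[str]:
    """Return the hedging phrases found in a note so a human coder can
    review it before any AI-suggested diagnosis code is accepted."""
    # Substring matching is naive (e.g., "likely" also matches "unlikely");
    # a production check would use smarter text processing.
    lowered = note.lower()
    return [phrase for phrase in HEDGING_PHRASES if phrase in lowered]

note = "Assessment: rule out pneumonia; history of asthma."
flags = flag_hedged_diagnoses(note)
if flags:
    print(f"Route to human review; qualifiers found: {flags}")
# Prints: Route to human review; qualifiers found: ['history of', 'rule out']
```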
4. Ensure risk assessments and policies are up to date
“If your BAA hasn’t been updated in 10 years, now’s the time to do that,” said Heckman.
For example, a coding or billing vendor might introduce AI features after contract execution. In that case, it’s important to update the BAA to ensure compliance with the HIPAA Security Rule.
Similarly, performing regular security risk assessments (at least annually) is critical. “If it’s the first time you’ve done it, it may be worth having a professional help you out just to make sure you get a baseline, and then each year, it’s just updating what might have changed,” she added.
For example, with AI coding tools, an updated risk assessment might need to cover the following (a brief sketch follows this list):
- Validating new data flow maps
- Reviewing access controls, and
- Confirming role-based permissions
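To make the access-control items concrete, the sketch below shows the kind of deny-by-default, role-based permission check an updated risk assessment might verify. The roles and permission names are hypothetical:

```python
# Hypothetical role-to-permission map. An assessment would verify that
# only approved roles can send de-identified text to the AI tool and
# that access is denied by default.
ROLE_PERMISSIONS = {
    "coder": {"view_phi", "submit_deidentified_to_ai"},
    "biller": {"view_phi"},
    "front_desk": set(),
}

def can(role: str, action: str) -> bool:
    """Allow an action only if the role explicitly grants it (deny by default)."""
    return action in ROLE_PERMISSIONS.get(role, set())

assert can("coder", "submit_deidentified_to_ai")
assert not can("biller", "submit_deidentified_to_ai")
assert not can("unknown_role", "view_phi")  # unknown roles get nothing
print("Role checks pass: access is deny-by-default.")
```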
Find out how Tebra's medical billing software helps you remove the most tedious and error-prone parts from your billing workflow — without sacrificing accuracy. Book a free, personalized demo today.
Frequently asked questions
Can practices use AI in medical coding and billing without violating HIPAA?
Yes, as long as:
- The practice has approved the tool
- A signed BAA is in place, and
- Coders access it through secure, authorized systems