HIPAA AI Compliance: What Healthcare Businesses Need Before Deploying AI
There is a pattern I see repeatedly in healthcare and healthcare-adjacent businesses. They find an AI tool they like — a scheduling assistant, an automated patient follow-up system, a clinical note generator — and they start using it. The tool works well. Staff love it.
Then someone asks: did we sign a BAA with that vendor?
Usually the answer is no.
I hold a CISSP and CISM and work daily as an Information System Security Manager in a DoD environment. I also consult with dental practices, therapy offices, home health agencies, and other small healthcare businesses on AI automation. In that work, the HIPAA conversation comes up constantly — and it is almost always happening after a tool is already in production, not before.
This guide covers what HIPAA actually requires when you deploy AI, what the practical risks are for a small healthcare business, and how to evaluate AI tools before you adopt them — not after.
What Makes AI Tools a HIPAA Problem
HIPAA regulates the use and disclosure of Protected Health Information (PHI). PHI is any information that relates to an individual’s past, present, or future health condition, treatment, or payment — combined with a direct or indirect identifier. Name plus appointment date is PHI. Phone number plus the fact that someone called your practice is PHI. You would be surprised how broadly the definition extends.
When you deploy an AI tool in a healthcare environment, you need to answer three questions before any patient data touches that system:
Does this tool receive PHI? If your AI scheduling assistant imports your patient calendar to find open slots, it receives patient names, contact info, and appointment types — all of which are PHI. If your AI clinical note tool listens to provider-patient conversations, it receives an enormous volume of PHI.
Is the AI vendor a Business Associate? Under HIPAA, any third party that creates, receives, maintains, or transmits PHI on your behalf is a Business Associate. This includes cloud vendors, software vendors, and AI tool providers. Business Associates must sign a Business Associate Agreement (BAA) before any PHI flows to them.
Does the vendor’s BAA actually cover the AI use case? Many vendors offer generic BAAs that were drafted before AI became a factor. They may not address what happens to PHI used during model inference, whether conversation logs are retained, or how PHI is handled in training pipelines.
The Business Associate Agreement: What to Look For
A BAA is not just a checkbox. It is a legal document that defines your vendor’s obligations regarding your patients’ data. Before signing, verify that the BAA addresses:
Permitted uses and disclosures. The BAA should explicitly describe what the vendor is allowed to do with PHI. Using it for the contracted service is fine. Using it to train models, selling aggregate analytics, or sharing with subprocessors should require specific authorization.
Subprocessors. AI vendors almost always rely on downstream providers — cloud infrastructure, model APIs, data pipelines. Your vendor’s BAA should require them to impose equivalent obligations on their subprocessors. Ask for a list of subprocessors and confirm each is under a BAA.
Breach notification. HIPAA requires Business Associates to notify covered entities of breaches without unreasonable delay, and no later than 60 days after discovery. Your BAA should specify notification timelines and procedures. Sixty days is the legal maximum — many well-run vendors commit to faster timelines.
Data deletion. What happens to PHI when you terminate the relationship? The BAA should specify that the vendor will return or destroy all PHI. “Destroy” needs to include their backup systems and any cached data from your account.
AI training opt-out. This is the new one. Most legacy BAA templates do not address whether PHI is used in model training. Get explicit written confirmation — in the BAA, not just a verbal assurance — that your patient data is not used to train or fine-tune models. Many AI vendors will confirm this if you ask; fewer include it in the standard BAA without prompting.
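When you negotiate notification timelines, it helps to track them as concrete dates rather than "within 60 days." A minimal sketch of that arithmetic, assuming you track discovery dates yourself (the function name and the example dates are illustrative, not from any regulation text):

```python
# Compute the outer HIPAA breach-notification deadline: the latest
# permissible notification date, 60 days after the breach is discovered.
# If your BAA commits the vendor to a shorter window, pass that instead.
from datetime import date, timedelta

def notification_deadline(discovery: date, max_days: int = 60) -> date:
    """Latest permissible notification date for a breach discovered on `discovery`."""
    return discovery + timedelta(days=max_days)

# A breach discovered March 1, 2025 must be reported by April 30, 2025.
deadline = notification_deadline(date(2025, 3, 1))
```

The point of writing it down this way is that "60 days from discovery," not "60 days from when the vendor told you," is the clock that matters.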
The Access Control Requirement
HIPAA requires that you implement policies and procedures to restrict access to PHI to those who need it to perform their job. This is the Minimum Necessary standard.
When you configure an AI tool, think through what data the AI actually needs to do its job. An automated appointment reminder needs the patient’s first name, phone number or email, and appointment time. It does not need their date of birth, insurance information, diagnosis codes, or clinical notes. Configure the integration to pass only what is necessary.
This is not just good security practice — it is a HIPAA requirement. If your AI vendor requires you to export your entire patient database to use their service, that is a red flag. A well-designed system accesses the minimum data needed for each transaction.
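In an integration you control, minimum necessary usually reduces to an allowlist applied before any record leaves your system. A minimal sketch, assuming a dictionary-shaped patient record — the field names here are illustrative, not taken from any specific EHR or vendor API:

```python
# Minimum-necessary filtering: strip a patient record down to only the
# fields the appointment-reminder integration actually needs before the
# payload is sent to the vendor. Everything else never leaves your system.

REMINDER_FIELDS = {"first_name", "phone", "appointment_time"}

def minimum_necessary(record: dict, allowed_fields: set) -> dict:
    """Return a copy of the record containing only the allowed fields."""
    return {k: v for k, v in record.items() if k in allowed_fields}

patient = {
    "first_name": "Dana",
    "last_name": "Rivera",
    "phone": "555-0142",
    "date_of_birth": "1984-03-09",
    "insurance_id": "XG-4411",
    "diagnosis_code": "K02.9",
    "appointment_time": "2025-06-12T10:30",
}

payload = minimum_necessary(patient, REMINDER_FIELDS)
# payload carries name, phone, and appointment time; date of birth,
# insurance, and diagnosis codes are never transmitted.
```

The design choice worth copying is the direction of the filter: an allowlist of fields the tool needs, not a blocklist of fields to hide. New fields added to your records later are excluded by default instead of leaking by default.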
Audit Logging and the Breach Response Plan
HIPAA requires covered entities to implement hardware, software, and procedural mechanisms to record and examine access to systems containing PHI. In practice, this means you need logs that show:
- Which AI system accessed which data and when
- What data was accessed or transmitted
- Whether the access was authorized
Most modern AI tools and cloud platforms generate these logs by default. The question is whether you collect them, retain them for the required period (six years under HIPAA), and can actually retrieve them when you need them.
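The three requirements above map directly onto a structured log entry. A minimal sketch, assuming you emit one JSON line per PHI access — the schema and field names are illustrative, to be mapped onto whatever logging stack you already run:

```python
# One structured audit-log entry per PHI access, answering the three
# questions: which system touched which data, when, and whether the
# access was authorized. JSON lines keep entries retrievable years later.
import json
from datetime import datetime, timezone

def audit_entry(system: str, patient_id: str, fields: list, authorized: bool) -> str:
    """Serialize a single PHI-access event as a JSON log line."""
    entry = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "system": system,           # which AI tool or integration
        "patient_id": patient_id,   # an internal identifier, not the chart itself
        "fields_accessed": fields,  # which PHI elements were read or transmitted
        "authorized": authorized,   # result of your access-control check
    }
    return json.dumps(entry)

line = audit_entry("reminder-bot", "pt-00923", ["first_name", "phone"], True)
```

Whatever format you choose, test retrieval, not just collection: pick a date from last year and confirm you can actually pull the entries for it.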
Beyond logging, you need a written breach response plan before something goes wrong. If your AI vendor notifies you that their system was compromised and patient data was potentially exposed, you need a documented process. Who gets notified internally? Who calls the vendor for details? What is your timeline for notifying affected patients? When does your legal counsel get involved? What goes to HHS?
A practice that has thought through these questions in advance handles breaches measurably better than one improvising on the spot.
What to Ask Before Deploying Any AI Tool
Before you sign with an AI vendor and let any patient data touch their system, work through this checklist:
- Will this tool receive, transmit, store, or process PHI? (Be honest — if in doubt, assume yes)
- Is the vendor willing to sign a BAA? (No BAA = no PHI, full stop)
- Does the BAA explicitly address AI-specific use cases including training data and model inference logs?
- Can you verify the list of subprocessors and confirm each operates under a BAA?
- What encryption standards are used for PHI in transit and at rest?
- Can the vendor produce evidence of a security audit or penetration test within the past 12 months?
- What is the breach notification timeline in the BAA?
- What is the data deletion process when you terminate service?
- Does the tool follow minimum necessary principles — does it access only the data it needs?
- Do you have a documented breach response plan that accounts for third-party vendor incidents?
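The checklist above can be encoded as a simple go/no-go gate, which makes vendor reviews repeatable and leaves a record of what passed and what did not. A sketch under stated assumptions: the item names and the sample answers are hypothetical questionnaire results, and the one hard rule enforced first is the checklist's own — no BAA, no PHI:

```python
# Go/no-go vendor gate built from the deployment checklist. The BAA
# question short-circuits everything else; the remaining items are
# tallied so gaps are visible before any patient data moves.

REQUIRED = [
    "signs_baa",
    "baa_covers_ai_training",
    "subprocessors_under_baa",
    "encrypts_in_transit_and_at_rest",
    "recent_security_audit",
    "breach_timeline_in_baa",
    "deletion_process_defined",
    "minimum_necessary_access",
]

def vendor_gate(answers: dict) -> tuple:
    """Return (approved, list of failing checklist items)."""
    if not answers.get("signs_baa"):
        return False, ["signs_baa"]  # no BAA = no PHI, full stop
    gaps = [item for item in REQUIRED if not answers.get(item)]
    return len(gaps) == 0, gaps

approved, gaps = vendor_gate({
    "signs_baa": True,
    "baa_covers_ai_training": False,  # common gap with legacy BAA templates
    "subprocessors_under_baa": True,
    "encrypts_in_transit_and_at_rest": True,
    "recent_security_audit": True,
    "breach_timeline_in_baa": True,
    "deletion_process_defined": True,
    "minimum_necessary_access": True,
})
# This vendor fails on exactly the AI-training question.
```

Keep the completed answers with the signed BAA: if HHS ever asks whether you exercised reasonable oversight, a dated, filled-in gate is the evidence.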
The Practical Reality for Small Practices
Most small healthcare businesses are not the primary target of sophisticated cyberattacks. But they are targeted by opportunistic attacks, and they face regulatory enforcement for compliance failures. The penalties for HIPAA violations are tiered — a breach resulting from uncorrected willful neglect carries minimum penalties of $50,000 per violation, with an annual cap of roughly $1.9 million for identical violations (the cap is adjusted for inflation).
More practically: if you have a breach and HHS investigates, the first thing they will look at is whether you had a BAA with the vendor involved. If you did not, that is a clear finding. If you did but you had not reviewed it to confirm it covered the AI use case, that is a more defensible position — but you still need to demonstrate you exercised reasonable oversight.
The cost of doing this right up front is a few hours of work and a conversation with your AI vendors. The cost of not doing it is significantly higher.
I put together the AI Compliance Toolkit specifically for small healthcare and healthcare-adjacent businesses deploying AI tools. It includes a BAA review checklist, a vendor security questionnaire template, a minimum necessary data mapping worksheet, and a breach response plan template you can customize for your practice. If you are adopting AI tools and want to get the compliance structure right before you deploy, that is a practical place to start.