AI can be used in HIPAA-regulated businesses (dental offices, medical practices, therapy clinics, health-tech startups), but only with the right vendor agreements and technical configuration. The key requirement is a Business Associate Agreement (BAA) between your practice and the AI vendor. Without a signed BAA, using any AI tool to process protected health information (PHI) is a HIPAA violation, regardless of how “secure” the vendor claims to be. As of early 2026, only a handful of AI providers offer HIPAA-eligible services with BAAs, and the configuration requirements go beyond simply signing up for a business plan.
Key Takeaways
- A signed Business Associate Agreement (BAA) is required before any AI tool touches PHI — no exceptions
- OpenAI (Enterprise/API), Anthropic (API), Google Cloud, Microsoft Azure, and AWS offer HIPAA-eligible AI services with BAAs
- Consumer-facing AI products (ChatGPT free/Plus, Gemini, free Copilot) are never HIPAA-compliant
- HIPAA violations carry fines from $141 to $2.13 million per violation category per year, with criminal penalties for willful neglect
- Dental offices and medical practices can use AI safely for scheduling, documentation, and administrative tasks with proper setup
Which AI Tools Are HIPAA-Compliant (and Which Aren’t)
Let’s be specific. HIPAA compliance isn’t a feature you toggle on — it’s a combination of a BAA, technical safeguards, and operational procedures. Here’s the current landscape.
AI Providers That Offer BAAs
| Provider | HIPAA-Eligible Product | BAA Available? | PHI Processing? | Minimum Plan |
|---|---|---|---|---|
| OpenAI | ChatGPT Enterprise, API (with eligible endpoints) | Yes | Yes (with BAA) | Enterprise or API |
| Anthropic | Claude API (not consumer chat) | Yes | Yes (with BAA) | API access |
| Google Cloud | Vertex AI, Healthcare API | Yes | Yes (with BAA) | Google Cloud paid |
| Microsoft | Azure OpenAI Service, Azure AI Services | Yes | Yes (with BAA) | Azure Enterprise |
| Amazon | Amazon Bedrock, Amazon Comprehend Medical | Yes | Yes (with BAA) | AWS paid |
AI Products That Are NOT HIPAA-Compliant
- ChatGPT Free, Plus, or Team — OpenAI does not offer BAAs for these plans. Do not enter PHI.
- Google Gemini (consumer) — The free Gemini chat product is not covered by Google’s Cloud BAA.
- Microsoft Copilot (free) — The free consumer Copilot is not covered by Microsoft’s BAA; for PHI, use Azure-based AI services instead.
- Claude Free/Pro — Anthropic’s consumer chat products do not come with BAAs.
- Any AI chatbot, plugin, or tool that doesn’t explicitly offer a BAA — If a vendor can’t produce a signed BAA, their product is off-limits for PHI.
The pattern is clear: consumer-tier products from even the most reputable AI companies are not HIPAA-compliant. HIPAA compliance requires enterprise or cloud-platform tiers with explicit BAAs.
BAA Requirements: What They Cover and Why They Matter
A Business Associate Agreement is a contract between a HIPAA-covered entity (your practice) and a business associate (the AI vendor) that specifies how PHI will be protected.
What a BAA Must Include
- Permitted uses and disclosures — Exactly how the vendor can use PHI (processing your requests) and what they cannot do with it (sell it, train models on it, share with third parties).
- Safeguard requirements — The vendor’s obligation to implement administrative, physical, and technical safeguards for PHI.
- Breach notification procedures — The vendor must notify you of any breach of unsecured PHI within 60 days (many BAAs specify shorter windows).
- Subcontractor requirements — If the vendor uses sub-processors (cloud hosting, content filtering), those sub-processors must also be bound by BAA terms.
- Return or destruction of PHI — What happens to PHI when the agreement ends.
- HHS audit cooperation — The vendor must make their practices available for compliance audits.
What a BAA Does NOT Do
A BAA doesn’t make a product HIPAA-compliant by itself. It’s a legal prerequisite, not a technical guarantee. You still need proper access controls, encryption, audit logging, and workforce training on your end. The BAA ensures the vendor is legally accountable — the technical implementation ensures PHI is actually protected.
What PHI Can and Cannot Be Processed by AI
Even with a BAA in place, not all PHI processing is appropriate for AI tools. Here’s a practical framework.
Generally Appropriate (with BAA and proper controls)
- Appointment scheduling and reminders — Patient names and contact information for scheduling workflows.
- Clinical documentation assistance — AI-assisted note-taking during patient encounters (dictation, structured note generation).
- Billing and coding support — AI assistance with CPT/ICD coding from clinical notes.
- Administrative communications — Drafting patient communications, insurance correspondence.
- De-identified data analysis — Population health trends, practice analytics on properly de-identified datasets.
Requires Extra Caution
- Clinical decision support — AI suggestions for treatment plans or diagnoses. Valuable but requires clear documentation that AI outputs are suggestions only, reviewed and approved by licensed practitioners.
- Patient-facing AI chatbots — Must be clearly identified as AI, cannot provide medical advice, and must route clinical questions to human staff (a minimal routing sketch follows this list).
- Mental health records — Psychotherapy notes have additional protections under HIPAA beyond standard PHI. Extra restrictions apply.
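To make the chatbot boundary concrete, here is a minimal sketch of the escalation logic in Python. The disclosure wording and keyword list are hypothetical placeholders; a real deployment would use a stronger classifier and go through compliance review before facing patients.

```python
# Illustrative sketch of the boundary for a patient-facing assistant: identify the
# bot as AI, keep it to administrative tasks, and escalate anything clinical to a
# person. The disclosure text and keyword list are hypothetical placeholders.
AI_DISCLOSURE = "You're chatting with an automated assistant, not a clinician."
CLINICAL_KEYWORDS = {"pain", "bleeding", "medication", "dosage", "diagnosis", "symptom"}


def handle_patient_message(message: str, first_turn: bool = False) -> str:
    prefix = AI_DISCLOSURE + " " if first_turn else ""
    if any(keyword in message.lower() for keyword in CLINICAL_KEYWORDS):
        # Clinical questions are never answered by the bot.
        return prefix + "That sounds like a clinical question, so I'm connecting you with our staff."
    # Administrative requests (scheduling, hours, directions) can stay with the bot.
    return prefix + answer_administrative_question(message)


def answer_administrative_question(message: str) -> str:
    # Placeholder for the call to a BAA-covered AI service that handles scheduling and logistics.
    return "Our front desk can help with that. We're open Monday through Friday, 8am to 5pm."
```

The point is architectural: the assistant identifies itself as AI on first contact, and anything that looks clinical is handed to a person rather than answered.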
Not Appropriate for AI Processing
- Substance abuse treatment records — Protected under 42 CFR Part 2, which has stricter requirements than HIPAA. Most AI vendors’ BAAs do not cover Part 2 compliance.
- Any PHI on non-BAA platforms — No matter how useful the tool, no BAA means no PHI. Full stop.
Practical Setup for Dental Offices and Medical Practices
Here’s what a HIPAA-compliant AI deployment actually looks like for a typical small healthcare practice.
Step 1: Choose a HIPAA-Eligible AI Platform
For most small practices, Microsoft Azure OpenAI Service or ChatGPT Enterprise is the most practical option. Azure integrates with existing Microsoft 365 environments (which many practices already use), and ChatGPT Enterprise provides the familiar ChatGPT interface with HIPAA protections. Microsoft Copilot for Microsoft 365 is another option if your practice already uses the Microsoft ecosystem.
Step 2: Sign the BAA
Contact the vendor’s sales or compliance team to initiate the BAA process. For Azure, this is typically handled through your Microsoft enterprise agreement. For ChatGPT Enterprise, it’s part of the enterprise onboarding. Do not begin processing PHI until the BAA is signed by both parties. Keep the signed BAA in your compliance documentation.
Step 3: Configure Technical Safeguards
- Encryption — Verify data is encrypted in transit (TLS 1.2+) and at rest (AES-256). Most enterprise AI platforms handle this by default, but verify.
- Access controls — Set up role-based access. Front desk staff may need scheduling AI access. Clinical staff may need documentation AI access. Not everyone needs both.
- Audit logging — Enable logging of all AI interactions that involve PHI. You need to know who accessed what, and when, in case of a compliance inquiry (see the sketch after this list).
- Automatic session timeout — Configure AI tools to log out after periods of inactivity, especially on shared workstations.
- Data retention limits — Set the minimum retention period necessary. For most AI interactions, 30 days or less is appropriate.
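Here is a minimal sketch of what these safeguards can look like in code, assuming an Azure OpenAI deployment behind a BAA. The deployment name, role set, and log path are hypothetical placeholders for your own configuration; encryption in transit and at rest is handled by the Azure endpoint itself.

```python
# Minimal sketch: a role-scoped, audit-logged call to an Azure OpenAI deployment.
# Assumes a BAA-covered Azure OpenAI resource. The deployment name, role set, and
# log path are hypothetical placeholders; adapt them to your own environment.
import json
import os
from datetime import datetime, timezone

from openai import AzureOpenAI  # pip install openai

ALLOWED_ROLES = {"clinical_staff"}               # role-based access: who may use the documentation assistant
AUDIT_LOG_PATH = "audit/ai_interactions.jsonl"   # append-only log reviewed during compliance checks

client = AzureOpenAI(
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],  # HTTPS endpoint; Azure enforces TLS in transit
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2024-06-01",
)


def draft_clinical_note(user_id: str, user_role: str, encounter_summary: str) -> str:
    """Generate a draft note, enforcing a role check and logging who asked, for what, and when."""
    if user_role not in ALLOWED_ROLES:
        raise PermissionError(f"Role '{user_role}' is not approved for the documentation assistant")

    response = client.chat.completions.create(
        model="notes-assistant",  # your Azure deployment name, not the underlying model name
        messages=[
            {"role": "system", "content": "Draft a structured clinical note from the encounter summary."},
            {"role": "user", "content": encounter_summary},
        ],
    )

    # The audit entry records the actor, action, and timestamp, not the PHI itself.
    with open(AUDIT_LOG_PATH, "a") as log:
        log.write(json.dumps({
            "user_id": user_id,
            "timestamp": datetime.now(timezone.utc).isoformat(),
            "action": "draft_clinical_note",
            "deployment": "notes-assistant",
        }) + "\n")

    return response.choices[0].message.content
```

Keeping PHI out of the audit log is deliberate: the log should show who used the tool and when, without becoming another data store you have to protect at the same level.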
Step 4: Train Your Staff
HIPAA requires workforce training on PHI handling, and AI tools add new scenarios to cover. Key training points:
- Which AI tools are approved for PHI and which are not (this is the most common violation point)
- What types of information can be entered into approved AI tools
- How to verify AI-generated clinical content before it enters the patient record
- What to do if PHI is accidentally entered into a non-approved tool
Step 5: Document Everything
Maintain documentation of: your signed BAAs, your AI-specific policies and procedures, staff training records, access control configurations, and any risk assessments you’ve conducted. This documentation is what you’ll produce if OCR (the HHS Office for Civil Rights) comes asking. Our dental office AI guide covers practice-specific implementation details.
How AI Scale Labs Handles HIPAA Compliance in Deployments
When we deploy AI for healthcare clients, HIPAA compliance is built into the architecture from day one — not bolted on after the fact. Our process includes:
- Vendor assessment and BAA coordination — We evaluate which AI platforms fit the practice’s needs and handle BAA procurement.
- Technical configuration — Encryption, access controls, audit logging, and data retention policies configured to HIPAA specifications.
- Custom AI workflows — Workflows for clinical documentation, scheduling, and administrative tasks designed with PHI boundaries built in, so no PHI leaks into non-compliant systems.
- Staff training — Hands-on training sessions covering approved tools, data handling procedures, and incident response.
- Ongoing compliance monitoring — Regular access reviews, policy updates, and vendor BAA verification.
If your practice wants to use AI without becoming a HIPAA compliance expert, schedule a call and we’ll walk through your specific situation.
Common HIPAA Pitfalls With AI
These are the mistakes we see most often when healthcare businesses adopt AI tools.
Using Consumer AI Tools for Patient Data
The most common violation: a dental office manager pastes a patient’s insurance information into ChatGPT’s free tier to draft a pre-authorization letter. No BAA, data potentially used for training, and now the practice has a HIPAA violation. The fix is simple — use an approved, BAA-covered tool — but the mistake happens because the free tool is easier to access.
Assuming Microsoft 365 = HIPAA-Compliant AI
Having a Microsoft 365 Business plan with a BAA doesn’t automatically make Copilot HIPAA-compliant. The BAA covers specific services, and AI features may have separate terms. Verify that the specific AI capabilities you’re using are within scope of your existing BAA.
Forgetting About Minimum Necessary
HIPAA’s “minimum necessary” standard applies to AI too. If you’re using AI to help with billing, it needs CPT codes and diagnosis codes — it doesn’t need the patient’s full clinical history. Configure AI integrations to access only the data fields necessary for the specific task.
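As a sketch of what that looks like in practice, the filter below passes only the billing-relevant fields to the AI integration. The field names and values are hypothetical placeholders for your EHR’s schema.

```python
# Sketch of the minimum-necessary principle for a billing workflow: only the fields
# the coding task needs ever leave the record system. Field names and values are
# hypothetical placeholders for your EHR's schema.
BILLING_FIELDS = {"encounter_date", "cpt_codes", "icd10_codes", "payer_id"}


def minimum_necessary(record: dict, allowed_fields: set) -> dict:
    """Return only the fields the AI task actually requires."""
    return {field: value for field, value in record.items() if field in allowed_fields}


full_record = {
    "patient_name": "...",        # not needed for billing, stays behind
    "clinical_history": "...",    # not needed, stays behind
    "encounter_date": "2025-11-03",
    "cpt_codes": ["99213"],
    "icd10_codes": ["K05.10"],
    "payer_id": "12345",
}

billing_payload = minimum_necessary(full_record, BILLING_FIELDS)
# billing_payload now contains only the four billing fields and nothing else.
```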
No Incident Response Plan for AI
If an employee accidentally enters PHI into a non-compliant AI tool, your practice needs a documented response: assess the breach scope, notify the vendor, determine if patient notification is required, and report to HHS if necessary. Having this plan written in advance — before an incident occurs — is both a HIPAA best practice and a practical necessity. Our security checklist includes incident response planning.
Ignoring AI in Risk Assessments
HIPAA requires periodic risk assessments. AI tools need to be included in these assessments — they’re a new category of PHI access point. Document the AI tools in use, the PHI they access, the safeguards in place, and the residual risk. This is often missed because AI adoption happens gradually and informally.
Frequently Asked Questions
Can I use ChatGPT in my dental office?
Yes, but only for tasks that don’t involve PHI (marketing copy, general business advice, non-patient communications), unless you’re on ChatGPT Enterprise with a signed BAA. The free, Plus, and Team plans are not HIPAA-compliant. For PHI-related tasks, you need Enterprise or an Azure-based deployment.
What are the penalties for HIPAA violations involving AI?
The same as any HIPAA violation: fines range from $141 per violation (if unaware) to $2.13 million per violation category per year (for willful neglect). Criminal penalties can include up to 10 years in prison for intentional misuse. HHS has increasingly flagged AI-related PHI exposure in enforcement actions since 2024.
Do I need a separate BAA for each AI tool?
Yes. Each AI vendor that processes PHI on your behalf needs its own BAA. If you use Azure OpenAI Service and a separate AI scheduling tool, you need a BAA with Microsoft and a BAA with the scheduling tool vendor. Your existing EHR vendor’s BAA does not cover separate AI tools.
Can AI replace medical transcription services under HIPAA?
AI can assist with medical transcription, but it must be through a HIPAA-compliant platform with a BAA. Products like Azure AI Speech Services, Amazon Transcribe Medical, and Nuance DAX (now Microsoft) are designed for this use case. Consumer speech-to-text tools (Siri, Google Assistant, Otter.ai’s free tier) are not HIPAA-compliant for clinical dictation.
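As an illustration, here is a minimal sketch of submitting a dictation to Amazon Transcribe Medical with boto3, assuming an AWS account covered by AWS’s BAA. The bucket names, job name, and region are hypothetical placeholders, and the S3 buckets involved need their own encryption and access controls.

```python
# Minimal sketch: submitting a clinical dictation to Amazon Transcribe Medical.
# Assumes an AWS account covered by AWS's BAA. Bucket names, the job name, and the
# region are hypothetical placeholders.
import boto3  # pip install boto3

transcribe = boto3.client("transcribe", region_name="us-east-1")

transcribe.start_medical_transcription_job(
    MedicalTranscriptionJobName="encounter-2025-11-03-room2",       # unique per dictation
    LanguageCode="en-US",
    MediaFormat="wav",
    Media={"MediaFileUri": "s3://practice-dictation-intake/encounter.wav"},
    OutputBucketName="practice-dictation-output",                   # this bucket needs encryption and access controls too
    Specialty="PRIMARYCARE",
    Type="DICTATION",
)

# Poll for completion, then pull the transcript from the output bucket.
status = transcribe.get_medical_transcription_job(
    MedicalTranscriptionJobName="encounter-2025-11-03-room2"
)
print(status["MedicalTranscriptionJob"]["TranscriptionJobStatus"])
```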
Is de-identified data subject to HIPAA when used with AI?
No. Properly de-identified data under HIPAA’s Safe Harbor or Expert Determination methods is not PHI and can be used with any AI tool. The catch is that de-identification must be thorough: all 18 HIPAA identifiers have to be removed. Simply removing names isn’t enough. Dates, geographic detail smaller than a state (including most ZIP code digits), ages over 89, and many other data points must also be removed or generalized.
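As an illustration of how much more than name removal is involved, here is a sketch of a few Safe Harbor generalizations in Python. It is not a complete de-identification pipeline; the field names are hypothetical, and a real implementation must cover all 18 identifier categories and be validated before data leaves a BAA-covered environment.

```python
# Illustrative sketch of a few Safe Harbor generalizations. This is NOT a complete
# de-identification pipeline: a real implementation must handle all 18 identifier
# categories and be validated before data leaves a BAA-covered environment. Field
# names are hypothetical placeholders.
def generalize_record(record: dict) -> dict:
    cleaned = dict(record)
    cleaned.pop("patient_name", None)                  # names removed entirely
    cleaned.pop("phone", None)                         # contact details removed
    cleaned["zip"] = record["zip"][:3] + "00"          # keep at most 3 ZIP digits (stricter rules apply to small areas)
    cleaned["age"] = min(record["age"], 90)            # ages over 89 collapse into a single 90+ bucket
    cleaned["visit_date"] = record["visit_date"][:4]   # dates generalized to year only
    return cleaned


record = {
    "patient_name": "...",
    "phone": "...",
    "zip": "94110",
    "age": 93,
    "visit_date": "2025-11-03",
    "procedure": "periodic oral evaluation",
}
print(generalize_record(record))
# {'zip': '94100', 'age': 90, 'visit_date': '2025', 'procedure': 'periodic oral evaluation'}
```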