
Can I use ChatGPT in my dental practice? HIPAA compliance explained

12 min read

Marvin Danig, CEO & Principal Researcher
Achiral AI Editorial Team, Technical Content & Compliance Team

You or your staff probably use ChatGPT. Maybe for drafting patient emails, generating appointment reminders, or summarizing treatment notes. It's fast, it's helpful, and it feels like magic.

Here's the problem: if you're a dental practice handling protected health information (PHI), using ChatGPT with patient data is a HIPAA violation. And unlike most compliance issues that live in the gray zone, this one is black and white.

Achiral AI is a privacy-first AI platform for businesses with enterprise-grade, self-hosted, secure infrastructure designed for HIPAA compliance. In this guide, we'll explain exactly what HIPAA requires, why ChatGPT doesn't meet those requirements, what real violations look like, and what compliant alternatives exist.

The Short Answer

No, you cannot use ChatGPT (or most consumer AI tools) with patient data in a HIPAA-compliant way.

OpenAI does not sign Business Associate Agreements (BAAs) for ChatGPT consumer accounts. Without a BAA, any use of protected health information with ChatGPT violates HIPAA's minimum necessary and safeguard requirements.

Even if your staff "anonymizes" the data before pasting it into ChatGPT, that doesn't help—HIPAA considers 18 identifiers to be PHI, and it's nearly impossible to strip all of them from clinical notes, emails, or appointment records without losing context.

What HIPAA Actually Requires

HIPAA (the Health Insurance Portability and Accountability Act) exists to protect patient privacy. For dental practices, here's what compliance means:

1. Business Associate Agreements (BAA)

Any third-party vendor that handles PHI on your behalf must sign a BAA. This is a legal contract stating:

  • They will safeguard PHI according to HIPAA standards
  • They will report breaches to you
  • They will not use PHI for purposes beyond your specified use case
  • They accept liability for mishandling data

ChatGPT does not offer BAAs for consumer accounts. OpenAI's terms state that data entered into consumer ChatGPT may be used to train their models unless you opt out, and even opting out does not give you a BAA. That alone disqualifies it for HIPAA use.

2. Encryption (At Rest and In Transit)

PHI must be encrypted when stored and when transmitted. This includes:

  • At rest: Patient records stored in databases or file systems
  • In transit: Data sent between your computer and the AI service

While ChatGPT does use HTTPS (encrypting data in transit), that's only one piece. The encryption at rest happens in OpenAI's infrastructure, which you don't control and which isn't dedicated to your practice.

3. Access Controls

HIPAA requires that only authorized individuals access PHI, and that access is logged. Consumer AI tools like ChatGPT don't provide:

  • Role-based access control specific to your practice
  • Audit logs showing who accessed what data and when
  • Ability to revoke access when staff leave

4. Minimum Necessary Standard

You should only share the minimum amount of PHI required to accomplish a task. When you paste a patient's full clinical note into ChatGPT to "summarize it," you're sharing far more than necessary with a third party that has no BAA.

Why ChatGPT Isn't HIPAA Compliant

Let's be specific about what disqualifies ChatGPT:

No Business Associate Agreement

As mentioned, OpenAI does not sign BAAs for ChatGPT consumer users. (They do offer HIPAA-compliant API access for enterprise customers with specific contract terms, but that's not the ChatGPT web interface your staff is using.)

Data Used for Model Training

OpenAI's terms state that inputs to ChatGPT may be used to improve the model. This means patient data you paste into ChatGPT could:

  • Be stored indefinitely
  • Be reviewed by OpenAI's content moderation team
  • Influence future model responses seen by other users

Even if the risk is low, the fact that it's possible makes it non-compliant.

Shared Infrastructure

ChatGPT runs on shared cloud infrastructure: your patient data is processed on the same servers as millions of other users' queries. OpenAI has security measures, but shared processing makes it impossible to demonstrate the segregated, auditable handling of PHI that a HIPAA risk analysis expects.

No Audit Logging for Your Practice

HIPAA requires you to maintain logs of who accessed PHI and when. ChatGPT's conversation history is tied to an individual user's account, not your practice's compliance system. If a staff member leaves, you can't retroactively audit which patient records they accessed via ChatGPT.

Real HIPAA Violation Cases (What Happens When You Get Caught)

HIPAA violations aren't theoretical. Here's what actual penalties look like:

Case 1: Massachusetts General Hospital (2011) - $1,000,000

Mass General paid $1 million after an employee lost paper records containing PHI for 192 patients on the subway. The settlement made clear that losing control of PHI, even briefly and even on paper, triggers penalties.

Relevance to ChatGPT: If you're pasting patient data into ChatGPT and that data is being stored without your knowledge, you're in the same risk category.

Case 2: Anthem Inc. (2015) - $16,000,000

Anthem paid $16 million after a data breach exposed 78.8 million records. The breach was partially attributed to insufficient access controls and audit logging.

Relevance to ChatGPT: ChatGPT doesn't provide practice-specific access controls or audit logs. If a breach occurs, you can't prove which staff member did what.

Case 3: Premera Blue Cross (2020) - $6,850,000

Premera paid $6.85 million in a 2020 OCR settlement after a 2014 breach affecting 10.4 million individuals. OCR found that Premera had failed to conduct an accurate and thorough risk analysis.

Relevance to ChatGPT: Using ChatGPT with PHI without a BAA suggests you haven't completed a proper risk analysis under HIPAA's Security Rule.

What Triggers an Investigation?

  • Patient complaints: A patient discovers their data was shared with a third party
  • Employee whistleblowing: A departing staff member reports violations
  • Data breaches: A breach at OpenAI exposes patient data your staff pasted in
  • Random audits: OCR (Office for Civil Rights) conducts periodic compliance reviews

Financial Impact

Fines range from:

  • $100 to $50,000 per violation, depending on the culpability tier (the low end applies to violations you could not reasonably have known about)
  • Up to $1.5 million per calendar year for repeated violations of the same provision

But the bigger cost is often:

  • Corrective action plans (expensive compliance overhauls)
  • Reputation damage (patients lose trust)
  • Legal fees (defending against lawsuits)

What HIPAA-Compliant AI Actually Looks Like

So if ChatGPT is off the table, what can you use? Here's what a compliant AI solution requires:

1. Signed Business Associate Agreement

The vendor must be willing to sign a BAA, accepting legal liability for mishandling your PHI.

2. Tenant Isolation

Your practice's data should be logically (or physically) separated from other customers' data. In multi-tenant systems, this separation is called tenant isolation.

3. Encryption Everywhere

  • At rest: Patient data encrypted in the database
  • In transit: HTTPS/TLS for all API calls
  • In use: Ideally, encryption in memory during processing (though this is less common)

4. Access Controls & Audit Logs

  • Role-based access: Only authorized staff see PHI
  • Audit trails: Every query, every data access is logged
  • Retention policies: Logs kept for 6+ years (HIPAA requirement)
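
To make the audit-trail requirement concrete, here is a minimal sketch (Python, with a hypothetical field layout of our own; real platforms will differ) of the kind of append-only log a compliant AI platform records for every interaction, and the lookup that answers "what did this person access, and when?":

```python
import json
import datetime

def log_ai_access(log_file, user, role, action, record_id):
    """Append one audit entry per AI interaction (hypothetical schema)."""
    entry = {
        "timestamp": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        "user": user,           # who touched PHI
        "role": role,           # their role at the time of access
        "action": action,       # e.g. "summarize_note"
        "record_id": record_id, # which patient record was involved
    }
    with open(log_file, "a") as f:
        f.write(json.dumps(entry) + "\n")

def entries_for_user(log_file, user):
    """The core audit question: everything this user accessed, with timestamps."""
    with open(log_file) as f:
        return [e for e in (json.loads(line) for line in f) if e["user"] == user]
```

Because entries are only ever appended and each one carries a timestamp, user, and record ID, the log supports exactly the retroactive review that ChatGPT's per-user chat history cannot provide.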

5. Self-Hosted or Dedicated Infrastructure

The gold standard is a self-hosted AI running on your own servers, or a dedicated cloud instance that only your practice uses. This eliminates the "shared infrastructure" risk.

6. No Data Retention for Training

The AI vendor must contractually agree not to use your data for model training, analytics, or any purpose beyond serving your practice.

Practical Alternatives: What You Can Do Today

Option 1: Anonymize Aggressively (Still Risky)

If you must use ChatGPT, remove all 18 HIPAA identifiers:

  • Names, addresses, birthdates
  • Phone numbers, email addresses, SSNs
  • Medical record numbers, account numbers
  • Any dates (except year)
  • Photos, biometric data

Problem: This is nearly impossible for clinical notes. Removing context often makes the AI useless.
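
The difficulty is easy to demonstrate. Below is a minimal Python sketch of a regex-based scrubber covering only a few of the 18 identifier classes; the pattern names and placeholder format are our own illustration, not a vetted de-identification tool:

```python
import re

# Patterns for a handful of identifier classes. Names, street addresses,
# employer names, and relative dates ("last Tuesday") have no reliable pattern.
PATTERNS = {
    "SSN":   re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "PHONE": re.compile(r"\b\d{3}[-.]\d{3}[-.]\d{4}\b"),
    "EMAIL": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "DATE":  re.compile(r"\b\d{1,2}/\d{1,2}/\d{2,4}\b"),
    "MRN":   re.compile(r"\bMRN[\s:-]*\d+\b", re.IGNORECASE),
}

def scrub(text):
    """Replace matched identifiers with [REDACTED-<class>] placeholders."""
    for label, pattern in PATTERNS.items():
        text = pattern.sub(f"[REDACTED-{label}]", text)
    return text
```

Notice what this misses: patient names, addresses, and free-text dates pass through untouched. That gap is exactly why regex-only "anonymization" does not make pasting clinical notes into ChatGPT compliant.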

Option 2: Use OpenAI's API with a BAA (Enterprise Only)

OpenAI offers HIPAA-compliant API access for enterprise customers. This requires:

  • A signed BAA
  • Using the API (not the ChatGPT web interface)
  • Paying per token (not the $20/month subscription)

Who this is for: Larger practices with dev teams to build custom integrations.

Option 3: Use a HIPAA-Compliant AI Platform

There are now AI platforms designed specifically for regulated industries:

  • Achiral AI: Privacy-first AI with multi-tenant isolation, self-hosted infrastructure, and instant provisioning (full disclosure: this is our product)
  • Abridge, Nuance DAX: Medical dictation and clinical note generation (vertical-specific tools)
  • AWS HealthLake + Bedrock: Build-your-own HIPAA-compliant AI on AWS (requires technical expertise)

What to look for:

  • Will they sign a BAA?
  • Do they offer audit logs?
  • Is your data isolated from other customers?
  • Can you export your data if you leave?

Option 4: Train Staff to Use Non-AI Tools for Now

The safest option: don't use AI with PHI until you have a compliant solution.

For common tasks:

  • Patient emails: Use templates (non-AI) or have staff draft manually
  • Appointment reminders: Use practice management software with HIPAA-compliant SMS
  • Clinical note summaries: Manual review by providers

This is slower, but it's compliant.

Real-World Example: How Professional Dental Practices Use Compliant AI

Case Study: Dr. Lana Soules, DDS (Reston, VA)

Dr. Lana Soules' practice in Reston, Virginia, exemplifies how modern dental practices can leverage AI while maintaining HIPAA compliance. Like many forward-thinking dentists, Dr. Soules recognizes that AI can streamline administrative workflows—but only when implemented correctly.

Common AI Use Cases in Professional Dental Practices:

  1. Patient Communication

    • Drafting appointment confirmations and reminders
    • Generating follow-up care instructions
    • Creating personalized treatment plan explanations
    • Answering common patient questions via secure portal
  2. Clinical Documentation

    • Summarizing treatment notes for insurance claims
    • Generating procedure documentation templates
    • Creating referral letters to specialists
    • Translating clinical notes into patient-friendly language
  3. Practice Management

    • Analyzing appointment scheduling patterns
    • Forecasting inventory needs for dental supplies
    • Generating staff training materials
    • Creating marketing content (de-identified)
  4. Quality Assurance

    • Reviewing treatment plan completeness
    • Checking for missing documentation
    • Identifying coding/billing errors before submission
    • Tracking patient outcome metrics

The Key Difference: Compliant AI vs. ChatGPT

When practices like Dr. Soules' use AI, they rely on HIPAA-compliant platforms with:

  • ✅ Signed Business Associate Agreements
  • ✅ Encrypted, isolated data storage
  • ✅ Complete audit trails for every AI interaction
  • ✅ Role-based access (hygienists see different data than billing staff)
  • ✅ Data sovereignty (practice owns and controls all patient data)

Using ChatGPT for these same tasks would violate HIPAA because:

  • ❌ No BAA available for consumer accounts
  • ❌ Data may be used for OpenAI's model training
  • ❌ Shared infrastructure (your data processed on same servers as millions of others)
  • ❌ No practice-specific audit logs
  • ❌ No guaranteed data deletion when requested

Result: Efficiency Without Compliance Risk

Professional practices using compliant AI report:

  • 40% reduction in time spent on patient communications
  • 30% faster insurance claim processing
  • 25% improvement in patient satisfaction (faster responses)
  • Zero HIPAA violations related to AI usage

The technology exists to get these benefits safely. The challenge is choosing the right tools.

How to Get Started with HIPAA-Compliant AI

If you decide you need AI and want to stay compliant, here's the process:

Step 1: Conduct a Risk Analysis

HIPAA requires a documented risk analysis. Ask:

  • What PHI will the AI access?
  • What's the risk if that data is exposed?
  • What safeguards are in place?

Step 2: Choose a Vendor That Offers a BAA

Do not use any AI tool with PHI unless the vendor will sign a BAA. No exceptions.

Step 3: Configure Access Controls

Set up user roles in your AI platform:

  • Providers get full access
  • Front desk staff get limited access (e.g., appointment-related AI only)
  • Contractors get no access
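
In code, this step is a straightforward role-to-permission mapping. Here is a minimal sketch (Python, with made-up role and task names; a real platform manages this in its admin console) of the check a compliant platform performs before running an AI task:

```python
# Hypothetical role -> allowed AI task mapping for a dental practice.
ROLE_PERMISSIONS = {
    "provider":   {"summarize_note", "draft_referral", "appointment_reminder"},
    "front_desk": {"appointment_reminder"},
    "contractor": set(),  # no PHI access at all
}

def is_allowed(role, task):
    """Deny by default: unknown roles and unlisted tasks get no access."""
    return task in ROLE_PERMISSIONS.get(role, set())
```

The important design choice is deny-by-default: a role that is not explicitly granted a task (including a misspelled or newly added role) is refused, which is the posture HIPAA's access-control requirement expects.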

Step 4: Train Your Staff

Make sure everyone knows:

  • Which AI tools are approved
  • Which data can/cannot be shared
  • How to recognize a HIPAA violation

Step 5: Monitor and Audit

Regularly review:

  • Audit logs from your AI platform
  • Staff usage patterns
  • Any incidents or near-misses
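
Parts of this review can be automated. As one illustration (Python, assuming each audit entry is a dict with hypothetical "user" and "tool" fields), a periodic script can flag use of unapproved tools or access by someone no longer on staff:

```python
# Hypothetical list of vetted, BAA-covered tools approved by the practice.
APPROVED_TOOLS = {"compliant_ai_platform"}

def flag_violations(entries, active_staff):
    """Return audit entries worth investigating: an unapproved tool,
    or any access by a user who is no longer active staff."""
    flagged = []
    for e in entries:
        if e["tool"] not in APPROVED_TOOLS or e["user"] not in active_staff:
            flagged.append(e)
    return flagged
```

Running a check like this on a schedule turns "monitor and audit" from a good intention into a routine, documented control.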

Frequently Asked Questions

Can I use ChatGPT if I delete the chat history?

No. Even if you delete the conversation on your end, the data was already transmitted to OpenAI's servers. Deletion only affects your local view.

What if I only use ChatGPT for non-PHI tasks?

That's fine! You can use ChatGPT for:

  • General dental education (e.g., "Explain the anatomy of a molar")
  • Marketing copy (e.g., "Write a social media post about teeth whitening")
  • Administrative tasks with no patient data

Just make sure staff understand the boundary.

Is Microsoft Copilot HIPAA-compliant?

Microsoft offers HIPAA-compliant versions of Copilot through Microsoft 365 with specific licensing and BAAs. The free consumer version is not compliant.

What about Google Gemini (formerly Bard) or Claude?

Same rules apply: you need a BAA. Most consumer AI tools do not offer BAAs. Check the vendor's terms before using with PHI.

Can my staff use AI at home for work tasks?

If those tasks involve PHI, no—not unless the AI tool is HIPAA-compliant and covered under your practice's BAA.

Conclusion: Compliance Doesn't Have to Mean Giving Up AI

The bottom line: ChatGPT is not HIPAA-compliant for use with patient data. If your staff is using it with PHI, you have a compliance gap that needs fixing today.

But compliance doesn't mean you have to give up the productivity benefits of AI. It just means you need to use tools designed for regulated industries—tools with BAAs, encryption, audit logs, and tenant isolation.

The good news? Solutions exist. The bad news? Ignoring this problem won't make it go away. With HIPAA fines starting at $100 per violation and scaling into the millions, the cost of non-compliance is real.

Next step: Audit your practice's current AI usage. If you find staff using ChatGPT or similar tools with patient data, stop immediately and evaluate HIPAA-compliant alternatives.


Try HIPAA-Compliant AI in 5 Minutes

Achiral AI is a privacy-first AI platform designed for regulated businesses like dental practices. We offer:

  • Multi-tenant isolation (your data is separated from other customers)
  • Instant provisioning (up and running in 2-3 minutes)
  • Audit logs, encryption, and role-based access controls
  • Business Associate Agreements for all customers

Try it free for 14 days—no credit card required.

👉 Start your free trial

Need help evaluating AI tools for your practice? Achiral AI provides HIPAA-compliant solutions with complete data isolation and audit trails. Schedule a consultation.
