

Is ChatGPT HIPAA Compliant for Medical Clinics?

What small clinics must know about ChatGPT's BAA availability, consumer versus enterprise tiers, training data use, and the compliance risk of staff using AI tools with patient information.

Short answer

ChatGPT is not HIPAA compliant on consumer plans. Consumer accounts (ChatGPT Free, Plus, and Team) have no BAA available and no HIPAA coverage. OpenAI offers a BAA through ChatGPT Enterprise and qualifying API enterprise agreements, both negotiated directly with OpenAI's sales team. Any clinic staff member entering PHI into a consumer account creates an unprotected, reportable disclosure, one of the most common unacknowledged compliance risks in small clinic operations today. Even with an Enterprise BAA, the clinic must configure data controls and understand what the agreement does and does not cover.

BAA availability

OpenAI provides a HIPAA-eligible Business Associate Agreement through ChatGPT Enterprise and through qualifying API enterprise agreements. The following plans have no BAA path:

  • ChatGPT Free
  • ChatGPT Plus
  • ChatGPT Team

The Enterprise plan requires direct engagement with OpenAI’s sales team. Pricing is not published on OpenAI’s website. The BAA covers the ChatGPT Enterprise product and the specific API usage covered under the enterprise agreement; it does not automatically extend to all OpenAI products or to consumer API usage.

The training data risk on consumer tiers

Consumer ChatGPT accounts (Free and Plus) include a setting that allows users to opt out of model training. However, the default behavior, and therefore the behavior for staff who have not reviewed their account settings, is that prompts may be used for model training. A patient's name, diagnosis, or treatment detail entered into a free ChatGPT session is potentially being processed by OpenAI's systems in ways the clinic cannot audit or retrieve.

ChatGPT Enterprise’s data terms are different: OpenAI states that Enterprise prompt data is not used for training by default. Confirm the current terms in OpenAI’s Enterprise Privacy documentation before relying on this for compliance purposes.

What the Enterprise BAA covers and does not cover

Assuming the clinic has executed a ChatGPT Enterprise BAA, the agreement covers the ChatGPT Enterprise service. It does not:

  • Cover personal OpenAI accounts staff may use at home or on personal devices
  • Cover third-party applications built on the OpenAI API unless those vendors have their own BAA with you
  • Eliminate the clinic's responsibility to provide workforce training and maintain an AI use policy
  • Remove the need for a risk assessment of AI use in patient-adjacent workflows

Staff use of consumer AI is an active risk

The most common real-world compliance problem with ChatGPT at small clinics is not enterprise deployment — it is staff members using their personal or free-tier ChatGPT accounts for work tasks. Drafting patient correspondence, summarizing visit notes, or generating prior authorization letters through a consumer account exposes PHI without any contractual protection.

Addressing this requires:

  1. A written workforce policy that prohibits use of non-approved AI tools for any task involving patient information
  2. Training at onboarding and annually thereafter
  3. A process for approving new AI tools before staff adoption

What not to enter into ChatGPT even with an Enterprise BAA

Even under a compliant Enterprise deployment, certain practices carry risk:

  • Do not enter patient names combined with diagnoses, treatment plans, or test results unless the workflow requires it and access controls are in place
  • Do not store ChatGPT outputs containing PHI outside of a HIPAA-covered system
  • Do not allow staff to copy ChatGPT-generated text into external systems without verifying those systems are also BAA-covered
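One way clinics operationalize rules like these is a lightweight pre-screen that flags obvious identifier patterns before text is pasted into any AI tool. The sketch below is illustrative only and is not a substitute for dedicated DLP or de-identification tooling; the pattern names and regexes are assumptions chosen for demonstration, and real PHI detection is far broader than this.

```python
import re

# Minimal illustrative pre-screen: flag text containing obvious
# identifier patterns before it reaches an AI tool. The patterns
# below are simplified assumptions, not a complete PHI detector.
PHI_PATTERNS = {
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "date": re.compile(r"\b\d{1,2}/\d{1,2}/\d{2,4}\b"),
    "mrn": re.compile(r"\bMRN[:#]?\s*\d{5,}\b", re.IGNORECASE),
    "phone": re.compile(r"\b\d{3}[-.]\d{3}[-.]\d{4}\b"),
}

def flag_phi(text: str) -> list[str]:
    """Return the names of identifier patterns found in the text."""
    return [name for name, pattern in PHI_PATTERNS.items()
            if pattern.search(text)]

# Example: a draft containing an MRN and a date of birth is flagged.
draft = "Patient MRN: 4821930, DOB 04/12/1987, follow-up on biopsy."
hits = flag_phi(draft)
if hits:
    print(f"Blocked: possible PHI detected ({', '.join(hits)})")
```

A screen like this catches only formatted identifiers (numbers, dates); it cannot recognize a bare patient name or a diagnosis, which is why it supplements, rather than replaces, workforce policy and training.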

When AI tools require a broader compliance program

For similar analyses of competing AI tools, see Is Claude HIPAA Compliant?, Is Anthropic HIPAA Compliant?, Is Perplexity HIPAA Compliant?, and Is DeepSeek HIPAA Compliant?

PHIGuard commercial baseline

PHIGuard uses flat per-clinic pricing rather than per-user fees. A Business Associate Agreement is included on every public plan. The primary trial path is a 30-day free trial with no credit card required. See current PHIGuard pricing for plan names, monthly list prices, annual totals, and current launch details.

FAQ

Questions clinics ask before using this software with PHI

Can a clinic staff member use free ChatGPT to draft a patient letter?

Not if the letter contains any PHI. Free ChatGPT has no BAA. Entering patient names, dates of service, diagnoses, or any other PHI into a free ChatGPT prompt is a potential breach.

Does OpenAI use my prompts to train its models?

For consumer accounts (Free, Plus), prompts may be used for model training unless the user opts out in settings. ChatGPT Enterprise has separate data terms that exclude prompt data from training by default — verify with the current OpenAI Enterprise Privacy documentation.

Is the ChatGPT Enterprise BAA sufficient for clinical use?

A signed BAA is necessary but not sufficient. The clinic must also conduct a risk assessment, implement a workforce AI use policy, and ensure that only authorized staff access PHI-adjacent AI tools.

What about OpenAI's API used in a third-party health app?

If a vendor builds a product on OpenAI's API and that product processes PHI, the vendor is a business associate and must provide you with a BAA. OpenAI's API has its own BAA terms — confirm with the application vendor what agreement is in place.

Operational assurance

Turn vendor research into a system your clinic can actually run.

PHIGuard gives small clinics a BAA-ready operating layer, recurring compliance work, and a safer home for patient-adjacent tasks.

  • BAA included: legal baseline available on every plan.
  • Audit history: compliance actions stay reviewable later.
  • No card upfront: start evaluation before billing setup.

No credit card required. Add billing details later if you want service to continue after the trial.