

Is Microsoft Copilot HIPAA Compliant for Medical Clinics?

Microsoft 365 Copilot can be covered under a BAA for enterprise customers with the right configuration. Consumer Copilot and Copilot in Bing are not HIPAA compliant. The distinction between product variants matters significantly for clinics.

Short answer

Microsoft Copilot spans multiple products with different compliance profiles. Microsoft 365 Copilot, integrated into the enterprise Microsoft 365 suite, can be covered under Microsoft's HIPAA BAA for Microsoft 365 services when the organization has the appropriate enterprise configuration and has disabled data-training settings. Consumer-facing Copilot products — Copilot in Bing, Copilot.microsoft.com in its consumer mode, and the free Copilot app — are not covered under a healthcare BAA and must not be used with PHI.

Verdict: Conditional for M365 Enterprise; No for consumer products

Microsoft Copilot is not a single product — it is a brand applied to several AI experiences with very different compliance profiles. Getting the answer right requires knowing which Copilot the clinic is using.

Microsoft 365 Copilot, the enterprise product integrated into M365 apps, can be in scope under Microsoft’s BAA for healthcare organizations on qualifying enterprise plans. The free Copilot in Bing, Copilot.microsoft.com in its consumer mode, and the standalone Copilot app are consumer products with no healthcare BAA coverage.

Consumer Copilot: a hard no

The free or consumer-tier Copilot experiences — accessed through Bing, Windows, or a personal Microsoft account — operate under consumer terms of service. Microsoft does not offer a BAA for these products. Submitting any patient information through a consumer Copilot surface is a HIPAA violation regardless of intent.

GitHub Copilot: not a PHI tool

GitHub Copilot is a separate Microsoft product designed for software development assistance. It is not intended for clinical or administrative use and has no healthcare BAA pathway for PHI. Clinic staff should not use GitHub Copilot to process patient data. This guide does not address GitHub Copilot further because it operates in a code context, not a healthcare workflow context.

Microsoft 365 Copilot in an enterprise environment

Microsoft 365 Copilot is the AI layer that integrates with Word, Excel, Outlook, Teams, and other M365 apps. For organizations on qualifying enterprise M365 plans with an active Microsoft healthcare BAA, Microsoft 365 Copilot can be in scope.

Microsoft’s compliance documentation confirms that for covered healthcare customers, Microsoft 365 services — including certain Copilot features — are covered under the Microsoft Online Services Terms and the associated BAA. Clinics must:

  1. Confirm they are on an eligible M365 enterprise plan (typically E3 or E5, or equivalent with Copilot add-on)
  2. Have an executed Business Associate Agreement with Microsoft
  3. Ensure the tenant is configured to prevent data from being used for model training
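The three requirements above amount to a simple all-or-nothing gate. A minimal sketch of that gate, assuming a hypothetical internal record of tenant state (the field names and plan labels are illustrative for this sketch and do not correspond to any Microsoft API):

```python
from dataclasses import dataclass

# Hypothetical model of a clinic's M365 tenant state.
# Field names are illustrative; they do not map to a Microsoft API.
@dataclass
class TenantState:
    plan: str                # e.g. "E3", "E5" (qualifying) vs. consumer/lower tiers
    baa_executed: bool       # signed Microsoft BAA on file
    training_disabled: bool  # tenant verified not to feed AI model training

# Qualifying enterprise plans per the checklist above (illustrative set).
ELIGIBLE_PLANS = {"E3", "E5"}

def copilot_phi_eligible(t: TenantState) -> bool:
    """All three conditions must hold before staff may use Copilot with PHI."""
    return (
        t.plan in ELIGIBLE_PLANS
        and t.baa_executed
        and t.training_disabled
    )
```

The point of the conjunction is that no single item is sufficient: an E5 plan without an executed BAA, or a BAA on a non-qualifying plan, still leaves Copilot out of scope for PHI.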

AI training, data use, and PHI coverage

Three questions a clinic must answer before any staff member uses a Copilot feature:

(a) Is AI training on your data enabled by default? For consumer Copilot (Copilot in Bing, the free Copilot app, personal Microsoft accounts), Microsoft’s consumer terms permit use of interactions to improve AI services — and there is no BAA path to make this acceptable for PHI. For enterprise M365 commercial tenants, Microsoft’s terms commit that customer data is not used to train foundation AI models. This is a default protection for commercial accounts that distinguishes them from consumer accounts.

(b) How do you verify this? In the Microsoft 365 Admin Center, confirm that the tenant is a commercial enterprise account (not a consumer or education tenant). Review the Microsoft Online Services Data Protection Addendum, which documents the no-training commitment for commercial data. If the organization has customized data-processing settings, confirm through Microsoft’s compliance portal that AI model training is not enabled. Unlike some vendors, Microsoft does not require a separate admin toggle for the training-off protection in standard commercial M365 — but hybrid or mixed-mode tenants should verify their specific configuration.

(c) Are prompts containing PHI covered by the BAA? Prompts submitted through Microsoft 365 Copilot in Word, Excel, Outlook, Teams, and other covered M365 services are within the scope of the Microsoft healthcare BAA for organizations on qualifying enterprise plans with an active BAA. Prompts submitted through consumer Copilot surfaces — including Bing, the personal Copilot app, or copilot.microsoft.com accessed with a personal Microsoft account — are not covered, even if the user’s device is otherwise managed by the organization.
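The coverage boundary in (c) can be sketched as an allow/deny lookup with a deny-by-default rule. The surface labels below are illustrative tags invented for this sketch, not official product identifiers:

```python
# Whether a given Copilot surface can fall under a healthcare BAA,
# assuming a qualifying enterprise tenant with an executed BAA already in place.
# Surface labels are illustrative, not official Microsoft product identifiers.
BAA_ELIGIBLE_SURFACES = {
    "m365_word": True,
    "m365_excel": True,
    "m365_outlook": True,
    "m365_teams": True,
    "bing_copilot": False,           # consumer surface: never BAA-covered
    "copilot_app_personal": False,   # personal-account app: never BAA-covered
    "copilot_web_personal": False,   # copilot.microsoft.com, personal account
}

def phi_allowed(surface: str) -> bool:
    """Deny by default: any unrecognized surface is treated as not covered."""
    return BAA_ELIGIBLE_SURFACES.get(surface, False)
```

Deny-by-default matters here because new Copilot surfaces appear regularly; a policy that treats unknown surfaces as covered would fail in exactly the direction HIPAA penalizes.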

What staff must understand

Even in a compliant M365 environment with Copilot in scope:

  • Staff cannot paste patient records into a Copilot prompt and assume the output is subject to the same access controls as the source record
  • Copilot-generated summaries stored in M365 apps become ePHI and carry the same access requirements
  • Consumer Copilot on personal devices — including personal Windows laptops — is not covered and must be addressed in policy

Parallel AI tool guidance

The Microsoft Copilot question parallels the Google Gemini question for Workspace customers. Both are enterprise AI tools with BAA availability in specific configurations and consumer versions that are off-limits. See Is Google Gemini HIPAA compliant? for the Gemini side of this comparison.

PHIGuard commercial baseline

PHIGuard uses flat per-clinic pricing rather than per-user fees. A Business Associate Agreement is included on every public plan. The primary trial path is a 30-day free trial with no credit card required. See current PHIGuard pricing for plan names, monthly list prices, annual totals, and current launch details.

FAQ

Questions clinics ask before using this software with PHI

Is Copilot in Microsoft Teams HIPAA compliant?

Copilot in Teams is part of Microsoft 365 Copilot. If the organization has a qualifying M365 enterprise plan with an executed Microsoft BAA and correct compliance configuration, Copilot in Teams may be covered. Organizations on consumer or lower-tier plans are not covered.

What is the difference between Microsoft 365 Copilot and Copilot in Bing?

Microsoft 365 Copilot is an enterprise product integrated into the M365 productivity suite. Copilot in Bing is a consumer-facing AI assistant with no healthcare contractual protections. Do not submit PHI to Bing-based or consumer Copilot surfaces under any circumstances.

Does Microsoft's BAA cover Copilot's ability to read documents and emails?

Microsoft 365 Copilot's features — including reading emails, meeting transcripts, and documents from M365 apps — can be in scope under the BAA for qualifying enterprise plans. The key requirement is that the M365 tenant is configured for HIPAA compliance and the BAA is active.

What if a staff member uses their personal Microsoft account for Copilot?

Personal Microsoft accounts are consumer accounts and are not covered under any enterprise BAA. A staff member using personal-account Copilot to process work-related PHI creates an unauthorized disclosure. Clinics need a policy prohibiting PHI use on consumer AI tools.

Operational assurance

Turn vendor research into a system your clinic can actually run.

PHIGuard gives small clinics a BAA-ready operating layer, recurring compliance work, and a safer home for patient-adjacent tasks.

  • BAA included: legal baseline available on every plan
  • Audit history: compliance actions stay reviewable later
  • No card upfront: start evaluation before billing setup

No credit card required. Add billing details later if you want service to continue after the trial.