Is Microsoft Copilot HIPAA Compliant for Medical Clinics?
Microsoft 365 Copilot can be covered under a BAA for enterprise customers with the right configuration. Consumer Copilot and Copilot in Bing are not HIPAA compliant. Which product variant a clinic uses therefore determines the compliance answer.
Short answer
Microsoft Copilot spans multiple products with different compliance profiles. Microsoft 365 Copilot, integrated into the enterprise Microsoft 365 suite, can be covered under Microsoft's HIPAA BAA for Microsoft 365 services when the organization has the appropriate enterprise configuration and has disabled data-training settings. Consumer-facing Copilot products — Copilot in Bing, Copilot.microsoft.com in its consumer mode, and the free Copilot app — are not covered under a healthcare BAA and must not be used with PHI.
Verdict: Conditional for M365 Enterprise; No for consumer products
Microsoft Copilot is not a single product — it is a brand applied to several AI experiences with very different compliance profiles. Getting the answer right requires knowing which Copilot the clinic is using.
Microsoft 365 Copilot, the enterprise product integrated into M365 apps, can be in scope under Microsoft’s BAA for healthcare organizations on qualifying enterprise plans. The free Copilot in Bing, Copilot.microsoft.com in its consumer mode, and the standalone Copilot app are consumer products with no healthcare BAA coverage.
Consumer Copilot: a hard no
The free or consumer-tier Copilot experiences — accessed through Bing, Windows, or a personal Microsoft account — operate under consumer terms of service. Microsoft does not offer a BAA for these products. Submitting any patient information through a consumer Copilot surface is a HIPAA violation regardless of intent.
GitHub Copilot: not a PHI tool
GitHub Copilot is a separate Microsoft product designed for software development assistance. It is not intended for clinical or administrative use and has no healthcare BAA pathway for PHI. Clinic staff should not use GitHub Copilot to process patient data. This guide does not address GitHub Copilot further because it operates in a code context, not a healthcare workflow context.
Microsoft 365 Copilot in an enterprise environment
Microsoft 365 Copilot is the AI layer that integrates with Word, Excel, Outlook, Teams, and other M365 apps. For organizations on qualifying enterprise M365 plans with an active Microsoft healthcare BAA, Microsoft 365 Copilot can be in scope.
Microsoft’s compliance documentation confirms that for covered healthcare customers, Microsoft 365 services — including certain Copilot features — are covered under the Microsoft Online Services Terms and the associated BAA. Clinics must:
- Confirm they are on an eligible M365 enterprise plan (typically E3 or E5, or equivalent with Copilot add-on)
- Have an executed Business Associate Agreement with Microsoft
- Ensure the tenant is configured to prevent data from being used for model training
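As a rough illustration only, the three prerequisites above can be encoded as a pre-flight check. This is not official Microsoft tooling; the plan names and the `Tenant` record fields are assumptions for the sketch, and real verification happens in the M365 Admin Center and the executed contract documents.

```python
from dataclasses import dataclass

# Assumed example plan names; confirm actual eligibility with Microsoft.
ELIGIBLE_PLANS = {"Microsoft 365 E3", "Microsoft 365 E5"}

@dataclass
class Tenant:
    plan: str
    copilot_addon: bool      # Copilot license assigned on top of the plan
    baa_executed: bool       # signed Microsoft healthcare BAA on file
    training_disabled: bool  # tenant not opted into AI model training

def copilot_phi_prerequisites_met(t: Tenant) -> bool:
    """True only when all three documented prerequisites hold."""
    return (
        t.plan in ELIGIBLE_PLANS
        and t.copilot_addon
        and t.baa_executed
        and t.training_disabled
    )

clinic = Tenant("Microsoft 365 E5", True, True, True)
print(copilot_phi_prerequisites_met(clinic))  # True
```

The point of the all-or-nothing boolean is that the prerequisites are conjunctive: an eligible plan without an executed BAA, or a BAA without the training protection confirmed, still means no PHI in Copilot.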
AI training, data use, and PHI coverage
Three questions a clinic must answer before any staff member uses a Copilot feature:
(a) Is AI training on your data on by default? For consumer Copilot (Copilot in Bing, the free Copilot app, personal Microsoft accounts), Microsoft’s consumer terms permit use of interactions to improve AI services — and there is no BAA path to make this acceptable for PHI. For enterprise M365 commercial tenants, Microsoft’s terms commit that customer data is not used to train foundation AI models. This is a default protection for commercial accounts that distinguishes them from consumer accounts.
(b) How do you verify it? In the Microsoft 365 Admin Center, confirm that the tenant is a commercial enterprise account, not a consumer or education tenant. Review the Microsoft Online Services Data Protection Addendum, which documents the no-training commitment for commercial data. If the organization has customized data-processing settings, confirm through Microsoft's compliance portal that AI model training is not enabled. Unlike some vendors, Microsoft does not require a separate admin toggle for the training-off protection in standard commercial M365, but hybrid or mixed-mode tenants should still verify their specific configuration.
(c) Are prompts containing PHI covered by the BAA? Prompts submitted through Microsoft 365 Copilot in Word, Excel, Outlook, Teams, and other covered M365 services are within the scope of the Microsoft healthcare BAA for organizations on qualifying enterprise plans with an active BAA. Prompts submitted through consumer Copilot surfaces — including Bing, the personal Copilot app, or copilot.microsoft.com accessed with a personal Microsoft account — are not covered, even if the user’s device is otherwise managed by the organization.
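The covered/not-covered split in (c) can be sketched as a lookup table of the kind a clinic policy document might formalize. The surface and account-type labels below are this sketch's own shorthand, not a Microsoft taxonomy, and the table assumes an enterprise tenant that already meets the BAA prerequisites.

```python
# Illustrative mapping of Copilot surfaces to BAA coverage, assuming an
# enterprise tenant with an executed Microsoft healthcare BAA. Labels are
# invented for this sketch.
BAA_COVERED_SURFACES = {
    ("m365_copilot", "work_account"): True,     # Word/Excel/Outlook/Teams Copilot
    ("bing_copilot", "personal_account"): False,
    ("copilot_app", "personal_account"): False,
    ("copilot_web", "personal_account"): False,  # copilot.microsoft.com, consumer mode
}

def phi_allowed(surface: str, account: str) -> bool:
    """PHI may be entered only on surfaces covered by the BAA.
    Unknown combinations default to False (fail closed)."""
    return BAA_COVERED_SURFACES.get((surface, account), False)

print(phi_allowed("m365_copilot", "work_account"))      # True
print(phi_allowed("bing_copilot", "personal_account"))  # False
print(phi_allowed("unknown_tool", "work_account"))      # False (fail closed)
```

Failing closed mirrors the article's rule: a managed device does not make a consumer surface covered, so anything not explicitly in scope is treated as off-limits for PHI.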
What staff must understand
Even in a compliant M365 environment with Copilot in scope:
- Staff cannot paste patient records into a Copilot prompt and assume the output is subject to the same access controls as the source record
- Copilot-generated summaries stored in M365 apps become ePHI and carry the same access requirements
- Consumer Copilot on personal devices — including personal Windows laptops — is not covered and must be addressed in policy
Parallel AI tool guidance
The Microsoft Copilot question parallels the Google Gemini question for Workspace customers. Both are enterprise AI tools with BAA availability in specific configurations and consumer versions that are off-limits. See Is Google Gemini HIPAA compliant? for the Gemini side of this comparison.
PHIGuard commercial baseline
PHIGuard uses flat per-clinic pricing rather than per-user fees. A Business Associate Agreement is included on every public plan. The primary trial path is a 30-day free trial with no credit card required. See current PHIGuard pricing for plan names, monthly list prices, annual totals, and current launch details.
Sources
- Microsoft HIPAA Compliance Offering | Microsoft
- Microsoft 365 Copilot Data Privacy | Microsoft
- Business Associate Contracts — HHS Guidance | HHS