
HIPAA Compliance for Software Developers and Healthcare App Teams

Last updated: March 21, 2026

TL;DR

Software developers building healthcare applications, APIs, or tools that access PHI are business associates under HIPAA. As a BA, your team must implement the Security Rule's technical safeguards (encryption, access controls, audit logs), sign BAAs with every covered entity client, and document your compliance program. The most common dev-team violations are shipping PHI in application logs, misconfiguring cloud storage, and skipping BAAs for internal tooling.

When software developers become business associates

Building healthcare software puts your team under HIPAA whether your lawyers have told you that or not. The trigger is straightforward: if your application creates, receives, maintains, or transmits PHI on behalf of a covered entity, you are a business associate (BA). The covered entity is the clinic, health plan, or hospital. You are their BA.

The HIPAA Omnibus Rule (2013) extended direct liability to business associates. Before 2013, covered entities bore primary responsibility and could argue their vendors were independent. That changed. OCR can now open an enforcement action against a dev team without any involvement from the covered entity client. The fine exposure is real and not hypothetical.

For practical purposes, this means healthcare app developers cannot treat HIPAA compliance as the client’s problem. If your team builds, hosts, or maintains the software, your team owns a compliance program.

What HIPAA requires from dev teams

The Security Rule specifies five technical safeguard standards. Each has direct implementation consequences for software teams.

Access controls require that PHI is accessible only to users with an authorized purpose. In practice: role-based access control (RBAC), principle of least privilege on IAM policies, no shared credentials between users or environments, and session management that terminates inactive sessions. Audit access to production PHI at the user level.
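
The role and permission names below are hypothetical, but a minimal RBAC check in Python might look like this: a decorator that refuses to run a PHI-touching handler unless the caller's role grants the required permission.

```python
from functools import wraps

# Hypothetical role model -- replace with your actual roles and permissions.
ROLE_PERMISSIONS = {
    "clinician": {"phi:read", "phi:write"},
    "billing": {"phi:read"},
    "engineer": set(),  # least privilege: no production PHI by default
}

class AccessDenied(Exception):
    pass

def requires_permission(permission):
    """Decorator enforcing role-based access on PHI-touching handlers."""
    def decorator(fn):
        @wraps(fn)
        def wrapper(user, *args, **kwargs):
            granted = ROLE_PERMISSIONS.get(user["role"], set())
            if permission not in granted:
                raise AccessDenied(f"user {user['id']} lacks {permission}")
            return fn(user, *args, **kwargs)
        return wrapper
    return decorator

@requires_permission("phi:read")
def get_patient_record(user, patient_id):
    # Stand-in for a real data-layer fetch.
    return {"patient_id": patient_id}
```

In a real system the role lookup would come from your identity provider rather than an in-memory dict, but the enforcement point — deny by default, grant by explicit permission — is the same.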

Audit controls require that you log PHI access and modification. Every read, write, and delete operation touching PHI should generate a log entry with user identity, timestamp, and resource identifier. Those logs must be stored securely, retained per your policies, and reviewed for anomalies. This is where many teams underinvest until an incident forces a reconstruction of what happened.
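
As a sketch (field names are illustrative, not mandated by the rule), each PHI operation can emit one structured entry carrying user identity, action, resource, and a UTC timestamp:

```python
import json
import logging
from datetime import datetime, timezone

audit_log = logging.getLogger("phi.audit")

def audit_phi_access(user_id: str, action: str, resource_id: str) -> dict:
    """Emit one structured audit entry per PHI read/write/delete.

    resource_id should be an internal identifier, never a name or SSN,
    so the audit trail itself does not become a PHI leak.
    """
    entry = {
        "user": user_id,      # unique user identity
        "action": action,     # "read" | "write" | "delete"
        "resource": resource_id,
        "at": datetime.now(timezone.utc).isoformat(),
    }
    audit_log.info(json.dumps(entry))
    return entry
```

Structured (JSON) entries make the later anomaly review tractable: you can query by user, resource, or time window instead of grepping free text.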

Integrity controls require that PHI cannot be altered or destroyed without detection. Checksums, versioning, and audit trails on data modifications address this. For most teams, this means database row versioning or an append-only event log for PHI-touching operations.
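
A minimal sketch of the checksum-plus-append-only idea, assuming records are JSON-serializable dicts: each appended event stores its own SHA-256, so any later alteration is detectable on verification.

```python
import hashlib
import json

def record_checksum(record: dict) -> str:
    """Deterministic SHA-256 over a canonical JSON encoding of the record."""
    canonical = json.dumps(record, sort_keys=True, separators=(",", ":"))
    return hashlib.sha256(canonical.encode()).hexdigest()

class AppendOnlyLog:
    """Minimal append-only event log: each entry carries its own checksum."""

    def __init__(self):
        self._events = []

    def append(self, event: dict) -> None:
        self._events.append({"event": event, "sha256": record_checksum(event)})

    def verify(self) -> bool:
        """Detect any after-the-fact alteration of stored events."""
        return all(e["sha256"] == record_checksum(e["event"]) for e in self._events)
```

A production version would persist entries to durable storage and could chain each checksum over the previous one (hash-chaining) so deletions are detectable too, not just edits.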

Transmission security requires encryption of PHI in transit. TLS 1.2 is the current floor; TLS 1.3 is preferred. This applies to internal service-to-service communication too, not just external API calls. PHI moving between microservices over an internal network still requires encryption under HIPAA.
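
Using Python's standard ssl module as an example, the TLS 1.2 floor can be enforced in code rather than left to library defaults:

```python
import ssl

def tls_client_context() -> ssl.SSLContext:
    """Client-side TLS context that refuses anything below TLS 1.2."""
    ctx = ssl.create_default_context()
    # Floor at 1.2; 1.3 is negotiated automatically when both sides support it.
    ctx.minimum_version = ssl.TLSVersion.TLSv1_2
    ctx.check_hostname = True
    ctx.verify_mode = ssl.CERT_REQUIRED
    return ctx
```

The same principle applies to service-to-service calls: configure your HTTP clients, gRPC channels, and database drivers with an explicit TLS minimum so an internal hop cannot silently fall back to plaintext or an obsolete protocol version.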

Person or entity authentication underpins all of these: users must be uniquely identified and verified before accessing PHI. MFA is not explicitly required by the HIPAA text, but its absence is increasingly difficult to justify in a risk assessment.

The PHI-in-logs problem

This is the violation that bites teams who have done everything else correctly.

Standard application logging captures request bodies, error traces, query strings, and debug output. If any of those fields contain patient names, diagnoses, dates of birth, or other PHI identifiers, the log file is now PHI. Log aggregation services like Datadog, Splunk, or CloudWatch may not have signed BAAs on their standard plans. Log storage buckets may lack access controls appropriate for PHI. Retention periods may be longer than needed, compounding exposure.

The fix is not complicated, but it requires intent. Scrub PHI from log output at the application layer before it hits any log sink. Use patient identifiers (internal IDs, not names or SSNs) in log fields where you need traceability. Review log configurations specifically for PHI exposure before shipping to production.
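
One way to implement application-layer scrubbing, sketched with Python's logging module: a Filter that redacts PHI-shaped substrings before any record reaches a handler. The patterns here are illustrative; a real deployment would cover whatever identifiers your schema actually carries.

```python
import logging
import re

# Illustrative patterns only -- extend for your actual data shapes.
PHI_PATTERNS = [
    re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),   # SSN-shaped strings
    re.compile(r"\b\d{2}/\d{2}/\d{4}\b"),   # date-of-birth-shaped strings
]

class PHIScrubFilter(logging.Filter):
    """Redact PHI-shaped substrings before a record reaches any log sink."""

    def filter(self, record: logging.LogRecord) -> bool:
        msg = record.getMessage()
        for pattern in PHI_PATTERNS:
            msg = pattern.sub("[REDACTED]", msg)
        record.msg, record.args = msg, None
        return True  # keep the (now scrubbed) record
```

Attach the filter to every handler (or to the root logger) so a newly added log sink cannot bypass the scrubbing. Pattern-based redaction is a backstop, not a substitute for keeping PHI out of log statements in the first place.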

Misconfigured cloud storage is the second most common gap. S3 buckets, Azure Blob containers, and GCS buckets that store documents, images, or exports containing PHI must be private, access-logged, encrypted at rest, and explicitly reviewed for public access settings. AWS provides an S3 Block Public Access feature; keep all four of its settings enabled for any bucket that might receive PHI.

BAAs for every client and every subprocessor

Before your application processes PHI from a covered entity client, both parties must sign a BAA. This is a contract requirement, not a best practice. Operating without a BAA is a violation for both the covered entity and the business associate, regardless of whether any breach occurs.

A BAA defines what the BA may do with PHI, specifies safeguard requirements, requires breach reporting to the covered entity within defined timeframes, and governs data return or destruction at contract termination. Your legal team should have a standard BAA template; your sales process should include BAA execution before technical onboarding of a healthcare client.

The chain of BAAs must extend through your subprocessors. If you use AWS, Google Cloud, or Azure to host PHI, obtain their BAA (all three offer one). If you use a logging service, analytics platform, or third-party API that touches PHI, those vendors need BAAs too. A common gap: development teams use monitoring tools under the same account as production without checking whether those tools have signed BAAs for PHI environments.

Compliance documentation dev teams often skip

Three documents are required and routinely absent:

A risk assessment (Security Rule 164.308(a)(1)) requires documented analysis of threats to PHI confidentiality, integrity, and availability, with risk ratings and mitigation plans. It does not have to be elaborate, but it must exist and be updated when your architecture changes. Adding a new data store, a new third-party API, or a new deployment environment should trigger a risk assessment update.

A workforce training log documents that every team member with PHI access received HIPAA training. This applies to engineers, product managers, QA testers with access to production data, and DevOps staff. Training content and dates must be recorded.

A sanctions policy documents what your organization does when a workforce member violates HIPAA policies. It can be brief, but OCR expects to see one.

None of this requires a compliance officer. A single engineer or operations lead can own these documents. The organizations that face the largest enforcement exposure are those with no documentation at all, not those with imperfect documentation.

How PHIGuard helps healthcare dev teams

Development teams building healthcare software face a compliance problem in their own internal workflows: how do you coordinate tasks, track issues, and manage project work without creating PHI exposure in your tooling?

PHIGuard gives healthcare dev teams a task management environment with a signed BAA at every plan level. Bug tracking, feature coordination, and sprint planning involving PHI-sensitive context can happen in a compliant environment rather than defaulting to standard project tools that lack BAAs.

At $20/month for the Practice plan, PHIGuard covers teams of up to 10 staff at a flat rate. The Clinic plan at $49/month covers teams of up to 25. There are no per-seat fees that make compliance tooling expensive as the team grows.

PHIGuard does not replace a security review of your application architecture or a formal compliance program. It addresses the administrative layer of team coordination that most healthcare dev teams handle in non-compliant tools by default.

Manage your practice tasks in one place.

Try PHIGuard free — no credit card required.

Software developers and IT vendors that create, receive, maintain, or transmit PHI on behalf of covered entities are business associates under HIPAA and are directly liable for compliance violations.

Source: HHS.gov — Business Associates

The HIPAA Security Rule requires covered entities and business associates to implement technical safeguards including access controls, audit controls, integrity controls, and transmission security.

Source: HHS.gov — Security Rule

HIPAA Technical Safeguard Requirements for Software Teams

Requirement             What It Means for Developers              Implementation Examples
Access controls         Limit PHI access by role and user         RBAC, MFA, least-privilege IAM
Audit controls          Log all PHI access and modification       Application logs, SIEM tools
Integrity controls      Prevent unauthorized PHI alteration       Checksums, versioning, change tracking
Transmission security   Encrypt PHI in transit                    TLS 1.2+, end-to-end encryption
Authentication          Verify user identity before PHI access    SSO, MFA, session management

Top Software Development Segments by Establishment Count

Segment                              Establishments
Healthcare SaaS vendors              20,000
Healthcare IT consulting firms       15,000
EHR/EMR development teams            10,000
Custom healthcare app developers      5,000
Total                                50,000+

Key Compliance Considerations — Software Development

Software developers handling PHI are business associates under HIPAA — not covered entities. The covered entity is the healthcare organization that uses the software. As a BA, developers must: (1) sign BAAs with each covered entity client, (2) implement HIPAA Security Rule technical safeguards, (3) conduct and document risk assessments, (4) train staff who access PHI, and (5) report breaches to covered entity clients within required timeframes. No federal license is required to develop healthcare software, but state-level health IT regulations may apply.

Common Workflows — Software Development

Healthcare software development activity typically spikes in Q1 following healthcare organization budget cycles. Year-end HIPAA compliance deadlines and OCR audit activity can drive urgency for security assessments and documentation updates in Q4. Meaningful Use and interoperability rule changes create periodic development sprints across the industry.

Ready to manage your software development practice tasks in one place?

Are software developers subject to HIPAA?
Yes, if they create, receive, maintain, or transmit PHI on behalf of a covered entity. The covered entity is the healthcare organization — the developer is a business associate. Direct liability for HIPAA violations applies to business associates under the Omnibus Rule (2013), meaning OCR can investigate and fine a dev team independently of its healthcare client.
What technical safeguards does HIPAA require for healthcare apps?
The HIPAA Security Rule requires access controls (limit PHI access by user and role), audit controls (log who accessed what and when), integrity controls (prevent unauthorized modification of PHI), transmission security (encrypt PHI in transit, typically TLS 1.2+), and authentication (verify user identity before granting PHI access).
Does a developer need a BAA for every healthcare client?
Yes. Every covered entity client whose PHI you access, store, or process requires a signed BAA before any PHI changes hands. This includes clients who use your software to manage patient data even if your team never sees individual records directly. It also includes subcontractors you use who touch that PHI.
Can dev teams use standard project management tools for healthcare projects?
Only for tasks that don't involve PHI. If tickets, task notes, or documentation reference actual patient names, diagnoses, or other identifiers, those tools become unauthorized PHI handlers. Standard Jira, Asana, Linear, and GitHub Issues do not offer HIPAA BAAs on standard plans. Healthcare project coordination involving PHI requires a compliant tool with a signed BAA.
