HIPAA Compliance for AI-Built Apps
Healthtech apps built with Lovable, Bolt.new, or v0 almost always fail a basic HIPAA review. Typical gaps: PHI stored in browser localStorage, no audit log on patient record access, and Supabase RLS disabled. Afterbuild Labs delivers a HIPAA-aware audit and fix in 5–10 days from $499.
By Hyder Shah · Founder, Afterbuild Labs · Last updated 2026-04-18
Why AI-built apps fail HIPAA reviews
HIPAA-compliant AI-built apps share one property: the founder stopped treating the AI generator as the final source of truth before any real patient data moved through the app. Every healthtech app we have rescued from Lovable, Bolt.new, or v0 started in the same place. The generator produced a clean-looking patient intake form, a visit scheduler, and a Supabase schema with a patients table. The founder demoed it to a clinic, the clinic asked about HIPAA posture, and the review came back with the same seven or eight findings every time.
The first finding is always hosting. Supabase Free and Vercel Hobby do not sign a Business Associate Agreement, and the app has been handling PHI on infrastructure that cannot legally hold it. The AI generator never prompts you to upgrade because HIPAA posture is not in its training objective. A healthtech HIPAA audit begins with the legal migration: Supabase Team plan, Vercel Enterprise, BAAs executed with both, and a survey of every third-party vendor in the PHI path to either get a BAA or remove the vendor from the path.
The second finding is the audit trail. HIPAA §164.312(b) requires a record of every access to electronic protected health information. AI-generated scaffolds have no such table and no middleware that writes to one. We install a hash-chained audit log with middleware on every authenticated PHI endpoint, a nightly chain verifier, and a retention job that honors the six-year window. The security hardening expert engagement installs this pattern end-to-end.
The 8 PHI-handling failures we see in Lovable / Bolt / v0
A PHI-handling audit of an AI-built app converges on the same eight findings regardless of which AI builder produced it. The order below is the order in which we ship the remediation.
- PHI in browser localStorage. AI generators cache patient records client-side for snappy UX. Any device compromise leaks every cached patient row. We remove client-side caching for PHI-bearing queries and force fresh server fetches with short-lived session tokens.
- PHI in URL query parameters. Patient IDs in the URL end up in browser history, referer headers, and every analytics pipeline on the page. We move identifiers to path segments that are never logged or switch to opaque session tokens.
- Supabase RLS disabled on patient tables. The anon key can read any row in `patients`, `visits`, and `messages`. A HIPAA-aware Supabase configuration enables RLS on every PHI-bearing table, writes per-patient policies tied to the authenticated provider, and adds a CI test that fails the build on an unprotected PHI table.
- No audit log on patient record access. We install a hash-chained append-only table and middleware on every authenticated PHI endpoint. Every read and write is logged with actor, subject, timestamp, and request metadata.
- PHI in Sentry, PostHog, or Google Analytics. Autocapture ships patient names and diagnoses to third-party analytics vendors with no BAA. We install a redaction layer, disable autocapture on PHI routes, and move any PHI-touching code path to server actions.
- No MFA on provider accounts. We add TOTP MFA, session rotation on sign-in and privilege elevation, and a device-binding step for sensitive actions.
- Hard DELETE on patient rows. The AI generator wires a hard delete. HIPAA demands retention; we swap to soft-delete with tombstones, a scheduled purge job, and documentation of the retention schedule.
- No encryption-at-rest or weak key management. We upgrade the app to managed Postgres with disk encryption, move keys into a dedicated key management service, and document the rotation schedule.
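The analytics-leak finding is the one most often reintroduced after a rescue, because autocapture re-enables itself with every SDK upgrade. A redaction layer makes the scrub structural rather than per-call. The sketch below is a minimal, key-name-based scrubber assuming a generic `beforeSend`-style hook; the key list is illustrative, not a complete PHI taxonomy.

```typescript
// Hypothetical PHI scrubber for an error-tracking beforeSend hook.
// The key pattern is an illustrative assumption, not a full PHI taxonomy.
const PHI_KEYS = /^(name|dob|diagnosis|ssn|email|phone|address|mrn)$/i;

type Json = string | number | boolean | null | Json[] | { [k: string]: Json };

function scrubPhi(value: Json): Json {
  if (Array.isArray(value)) return value.map(scrubPhi);
  if (value && typeof value === "object") {
    const out: { [k: string]: Json } = {};
    for (const [k, v] of Object.entries(value)) {
      // Redact any field whose key looks PHI-bearing; recurse into the rest.
      out[k] = PHI_KEYS.test(k) ? "[REDACTED]" : scrubPhi(v);
    }
    return out;
  }
  return value;
}
```

Wired as the event hook (for Sentry, `beforeSend: (event) => scrubPhi(event)` in the SDK init), every outbound event passes through the scrubber before it leaves the process, regardless of which code path captured it.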
HIPAA-aware Supabase: RLS, audit logs, encryption
A HIPAA-aware Supabase configuration starts on the Team plan or higher. The Pro plan does not sign a BAA and does not expose the full audit-logging features the framework expects. On the Team plan, the first steps are to enable at-rest encryption with a customer-managed key, turn on the Postgres audit extension, and raise log retention to at least 90 days. The database optimization expert engagement handles the database-layer posture end-to-end.
RLS is the single highest-impact control. Every PHI-bearing table gets a policy that checks auth.uid() against a provider-to-patient assignment table, and every admin query runs through a server-side service role with its own row of the audit log. The CI pipeline includes a test that enumerates tables and fails if any table reachable from a PHI join graph lacks RLS. This single test catches more future regressions than any other control in the stack.
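The CI gate described above reduces to a small pure check once the catalog data is in hand. A minimal sketch, assuming the pipeline has already queried the Postgres catalogs (e.g. `pg_tables` joined with `pg_class.relrowsecurity`) and derived the PHI table set from the join graph; the hard-coded set here is for illustration only.

```typescript
// Sketch of the RLS CI gate. In a real pipeline, `rows` comes from a
// catalog query and `phiTables` is derived from the PHI join graph —
// both are illustrative assumptions here.
interface TableRow {
  name: string;
  rlsEnabled: boolean;
}

function unprotectedPhiTables(
  rows: TableRow[],
  phiTables: Set<string>
): string[] {
  // Any PHI-bearing table without RLS enabled is a build-failing finding.
  return rows
    .filter((r) => phiTables.has(r.name) && !r.rlsEnabled)
    .map((r) => r.name);
}
```

The CI step fails the build whenever the returned list is non-empty, which is what turns a one-time remediation into a durable control.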
Audit logging at the database level is necessary but not sufficient. The app also needs application-level middleware that captures the business-level event — "provider X viewed patient Y at time T for reason R" — because raw SQL logs cannot answer that question. We install the middleware, the hash-chained table, and the nightly verifier together.
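The hash-chaining idea is simple enough to show in a few lines. A minimal in-memory sketch: field names are illustrative (the production table also carries request metadata), and the real implementation appends to Postgres rather than an array.

```typescript
import { createHash } from "crypto";

// Illustrative audit entry; the real table carries request metadata too.
interface AuditEntry {
  actor: string;
  subject: string;
  action: string;
  at: string;
  prevHash: string;
  hash: string;
}

function entryHash(e: Omit<AuditEntry, "hash">): string {
  return createHash("sha256")
    .update([e.actor, e.subject, e.action, e.at, e.prevHash].join("|"))
    .digest("hex");
}

// Each new entry commits to the previous entry's hash.
function append(
  chain: AuditEntry[],
  e: Omit<AuditEntry, "prevHash" | "hash">
): AuditEntry[] {
  const prevHash = chain.length ? chain[chain.length - 1].hash : "GENESIS";
  const withPrev = { ...e, prevHash };
  return [...chain, { ...withPrev, hash: entryHash(withPrev) }];
}

// The nightly verifier recomputes every link; editing any past row
// breaks the chain from that row onward.
function verify(chain: AuditEntry[]): boolean {
  let prev = "GENESIS";
  for (const e of chain) {
    if (e.prevHash !== prev || e.hash !== entryHash(e)) return false;
    prev = e.hash;
  }
  return true;
}
```

The property that matters for an auditor: a database admin can still delete or alter a row, but cannot do so without the nightly verifier flagging it.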
BAAs and vendor review
A BAA-ready AI app has a BAA executed with every vendor in the PHI path. The review starts by enumerating that path: the frontend host, the backend host, the database, the email provider, the SMS provider, the analytics vendor, the error tracker, any AI model provider, any video vendor. Each one either signs a BAA or is removed from the path. Most AI-built healthtech apps start with five or six vendors that do not sign a BAA; we finish with one or two and a documented legal posture for each.
The common traps: OpenAI signs a BAA on its Enterprise API tier, not the standard API. Anthropic signs a BAA on a per-customer basis. Resend and Postmark sign BAAs on enterprise plans. Twilio signs for voice and SMS but only with explicit configuration. Sentry signs on its Business tier. If the AI generator wired any of these on their free or standard tiers, you either upgrade or replace them. We handle both paths in a healthtech AI app rescue engagement.
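The vendor review above reduces to maintaining a small register and flagging the gaps. A minimal sketch, with hypothetical vendor entries; the real register also records the tier, the signing date, and the removal plan for vendors that will not sign.

```typescript
// Sketch of the BAA register check. Entries are illustrative; the real
// register also tracks plan tier, signing date, and removal plans.
interface Vendor {
  name: string;
  inPhiPath: boolean;
  baaExecuted: boolean;
}

// Every vendor in the PHI path must have an executed BAA;
// anything else is a gap to upgrade, replace, or remove.
function baaGaps(vendors: Vendor[]): string[] {
  return vendors
    .filter((v) => v.inPhiPath && !v.baaExecuted)
    .map((v) => v.name);
}
```

Running the check as part of the documentation phase keeps the BAA register from drifting as new vendors are wired in.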
Patient-facing auth patterns for HIPAA
A HIPAA telemedicine app or patient portal demands stronger auth than a consumer app. Provider accounts require TOTP MFA, session rotation on sign-in and on privilege elevation, and a short idle timeout. Patient accounts require at minimum email-based magic links with short TTLs, forced password reset on first login if a password is used, and lockout after repeated failed attempts. The auth specialist engagement ships all of this in a single week.
Session management is where AI-generated code most often fails. The generator wires a long-lived JWT in localStorage and never rotates it. For HIPAA, the session token must rotate on every privilege change and on a configurable idle window, and it must never live in a place a cross-site script could read. We move sessions to httpOnly cookies, add a rotation middleware, and instrument every auth event into the audit log.
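The cookie side of that fix is small. A minimal sketch of minting a rotated session token into an httpOnly cookie; the cookie name and idle window are illustrative, and the real middleware also persists the token server-side and writes the rotation event to the audit log.

```typescript
import { randomBytes } from "crypto";

// Sketch of session rotation into an httpOnly cookie. Cookie name and
// idle window are illustrative assumptions.
function rotatedSessionCookie(idleSeconds = 900): {
  token: string;
  header: string;
} {
  const token = randomBytes(32).toString("hex");
  const header = [
    `session=${token}`,
    `Max-Age=${idleSeconds}`, // enforces the idle window at the cookie layer
    "Path=/",
    "HttpOnly", // unreadable from any script, cross-site or not
    "Secure",
    "SameSite=Strict",
  ].join("; ");
  return { token, header };
}
```

Calling this on every sign-in and privilege change, and setting the result as the `Set-Cookie` header, replaces the long-lived localStorage JWT the generator wired.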
E-PHI access logs, retention, deletion
Retention rules are often the most surprising part of a HIPAA rescue. A patient cannot ask to have their clinical record deleted. They can ask for it to be amended, and they can ask for an accounting of disclosures. The AI-generated scaffold ships a hard DELETE on the patient row because the generator learned from consumer apps, where hard delete is the right default. We swap to a soft-delete pattern: a tombstone flag, a retention timer that starts when the record closes, and a scheduled purge job that runs after the retention window expires.
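The purge job's decision logic is worth making explicit, because it encodes the legal rule. A minimal sketch: the window length is a parameter (six years federally, longer under some state laws), and the real job also records each purge in the audit log.

```typescript
// Sketch of the purge-eligibility rule behind the scheduled job.
// The window is a parameter, not legal advice — state law may extend it.
const SIX_YEARS_MS = 6 * 365 * 24 * 60 * 60 * 1000;

function isPurgeEligible(
  tombstonedAt: Date | null,
  now: Date,
  retentionMs: number = SIX_YEARS_MS
): boolean {
  if (!tombstonedAt) return false; // live clinical rows are never hard-deleted
  return now.getTime() - tombstonedAt.getTime() >= retentionMs;
}
```

The scheduled job selects tombstoned rows, applies this predicate, and hard-deletes only the rows that pass, so the hard DELETE path exists exactly once in the codebase.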
The accounting-of-disclosures requirement is best handled by the same audit log that satisfies §164.312(b). A query against the audit table by patient ID returns every disclosure to every provider, admin, or integration, which is exactly what the regulation asks for. We ship the query as a canned report the compliance officer can run on demand. Related reading: our healthtech Cursor case study documents the full playbook.
Our HIPAA audit and fix playbook (4 phases)
A healthtech HIPAA audit runs four phases over 5–10 business days. Every phase has a written deliverable and a Loom walkthrough.
- Discovery (day 1–2). Enumerate every PHI-bearing table, every endpoint that touches PHI, every vendor in the path, and every existing control. Output: an inventory document and a risk-ranked findings list.
- Legal posture (day 2–3). Upgrade Supabase and Vercel to BAA-eligible plans, execute BAAs with every vendor in the path, and replace any vendor that does not sign. Output: a BAA register.
- Technical remediation (day 3–8). Enable RLS on every PHI table, install the audit log middleware and hash-chained table, scrub PHI from analytics and error tracking, add MFA and session rotation, swap hard delete for soft delete with retention. Output: a merged remediation branch and a Playwright test suite.
- Documentation and handoff (day 8–10). Write the runbook, the incident response plan, and the policies a compliance officer or auditor will ask to see. Output: a documentation bundle and a Loom walkthrough.
When you need a HIPAA compliance consultant versus a developer
A HIPAA compliance consultant writes the Security Risk Assessment, the policies, and the training program. A HIPAA-aware developer writes the code that implements the controls those documents describe. You need both, but in different phases. Before a pilot with a clinic, the developer work dominates: RLS, audit logs, BAAs, encryption, MFA. Before a SOC 2 audit overlay or a larger health system contract, the consultant work dominates: the written risk assessment, the policies, the annual review cycle.
Afterbuild Labs is the developer half. We do not write the Security Risk Assessment and we do not sign attestations. What we do is make sure that when a consultant or auditor walks in, the code and the infrastructure already satisfy the controls they are measuring. Pair us with a HIPAA consultant for the full posture. Related services: security audit, emergency triage. Platform-specific rescues: Lovable developer, Bolt developer. Case studies: Cursor healthtech rescue, fintech Lovable rescue.
DIY vs Afterbuild Labs vs HIPAA consultant
A comparison of the three common paths to HIPAA posture for an AI-built healthtech app.
| Dimension | DIY with AI builder | Afterbuild Labs rescue | Full HIPAA consultant |
|---|---|---|---|
| Time to first pilot | Unknown — review blocks launch | 5–10 days | 6–12 weeks |
| RLS and audit log | Not written | Shipped and CI-tested | Specified, not written |
| BAAs executed | None | Every vendor in the PHI path | Vendor list produced |
| Security Risk Assessment | Not produced | Not included — pair with consultant | Produced |
| Code changes made | Up to the founder | Every Critical finding merged | None, specification only |
| Fixed price | N/A | $499 audit, $3,999+ fix | $15,000–$45,000 |
| Auditor handoff | Not ready | Code ready, consultant signs | Full posture |
HIPAA-compliant AI-built apps — FAQ
Is Supabase HIPAA-ready for an AI-built app?
Supabase can support HIPAA, but not on the Free or Pro tier. You need the Team or Enterprise plan, which signs a Business Associate Agreement and enables the security features a HIPAA-compliant AI-built app requires: at-rest encryption on the Postgres disk, full audit logging on the database, private networking, and extended log retention. The AI generator will not prompt you to upgrade, and most healthtech Lovable rescues begin with the realization that PHI has been handled on a plan that cannot legally hold it. We migrate the project to the correct plan, execute the BAA, and rewrite RLS so every PHI-bearing table has a policy tied to auth.uid().
Do Vercel and Supabase sign a BAA for healthtech apps?
Vercel signs a BAA on its Enterprise plan, not on Pro. Supabase signs on its Team plan and above. Most AI-built healthtech apps ship on Vercel Hobby and Supabase Free, neither of which signs a BAA. Before any real PHI moves through the app, the legal posture has to change: upgrade to Vercel Enterprise and Supabase Team, execute both BAAs, then rescope the infrastructure to keep PHI inside the covered-entity boundary. Third-party vendors in the PHI path — OpenAI, Anthropic, Resend, Twilio, Sentry — each need their own BAA or they must be removed from the PHI path entirely.
What are the HIPAA audit log requirements for an AI-built app?
HIPAA §164.312(b) requires an audit control that records and examines activity in systems containing ePHI. In practice this means an append-only log that captures who accessed which patient record, when, from where, and what they did. For an AI-built app, this almost always means installing a hash-chained audit table in Postgres, writing middleware that intercepts every authenticated request to a PHI-bearing endpoint, and running a nightly verifier that confirms the chain has not been tampered with. Retention is typically six years. AI generators write neither the table nor the middleware, so every HIPAA audit-log rescue installs the full pattern before launch.
What counts as PHI versus PII in an AI-built healthtech app?
PHI is any health information that can be tied to an individual: name plus diagnosis, email plus appointment history, IP address plus clinic visit. PII is broader and less strictly regulated. The practical rule inside an AI-built healthtech app is that almost every row in a patient-facing database is PHI the moment it joins to a medical record. Email addresses in a newsletter table are PII; the same email address on a row linked to a prescription is PHI. We treat every table that can be joined to a clinical row as PHI-bearing and apply the RLS, encryption, and audit log patterns uniformly.
What does HIPAA require for telemedicine video in an AI-built app?
A HIPAA telemedicine app must use a video vendor that signs a BAA: Zoom for Healthcare, Doxy.me, Twilio Video with the BAA in place, or similar. Standard consumer Zoom, Google Meet, and FaceTime do not qualify. The AI-generated scaffold almost always wires the standard consumer SDK. A rescue swaps the SDK for the HIPAA-eligible one, executes the BAA, and removes any logging of session metadata to observability tools that do not themselves sign a BAA. Session recording is optional, but if enabled it must be stored encrypted in a BAA-covered bucket.
How long do I have to retain PHI, and what are the deletion rules?
Federal HIPAA requires six years of retention for the records subject to the Privacy Rule, and state law frequently extends this to ten years for clinical records. Patients have a right to request amendment, not deletion, of clinical records. Your AI-built app must support soft-delete with a tombstone and a retention timer rather than hard delete, and the patient-facing delete flow must surface what the app can and cannot erase. The AI generator ships a hard DELETE by default, which is both legally wrong and operationally destructive. A rescue swaps to soft-delete with a scheduled purge job and documents the retention schedule.
What are the HIPAA breach reporting rules for an AI-built app?
A breach of unsecured PHI affecting 500 or more individuals triggers notification to HHS, the affected individuals, and the media within 60 days. Smaller breaches are logged and reported annually. Encryption-at-rest with a compliant key management process provides a safe harbor that converts many incidents from reportable breaches into non-reportable events. An AI-built app without encryption-at-rest and without an incident runbook has neither the safe harbor nor the ability to respond. A rescue installs the safe harbor and writes the runbook in the same engagement.
What is included in the Afterbuild Labs HIPAA audit scope?
The $499 audit covers eight findings typical of an AI-built healthtech app: PHI in client storage or URL parameters, Supabase RLS on patient-bearing tables, audit log completeness, encryption-at-rest configuration, BAA posture with every vendor in the PHI path, analytics and observability PII leakage, patient-facing auth with MFA and session rotation, and telemedicine or messaging vendor eligibility. Deliverable is a PDF with each finding ranked by severity, a Loom walkthrough for each Critical, and a fixed-price remediation quote. Remediation typically runs $3,999–$9,999 depending on codebase size.
Ship HIPAA compliance for your AI-built app
Send the repo and a line about the clinic or health system waiting on your HIPAA posture. Within 48 hours we return a written HIPAA audit of your AI-built app and a fixed-price path to close every finding. $499 audit, $3,999+ remediation, no hourly billing.