Physicians Are Already Using AI. We Make It Safe.

Doctors use ChatGPT, Claude, and Gemini every day — for differential diagnosis, note drafting, patient communication. The problem isn't the AI. It's the PHI going into it. MedScrub fixes that.

The Problem We Saw

The typical clinical AI company builds a proprietary model, locks you in, and processes your raw patient data on its servers. You pay a premium for their model choice, their data handling, their cloud.

We asked a different question: what if you could use any AI model safely?

Consumer LLMs are already the best reasoning engines on the planet — and they're getting better every month. The missing piece isn't the model. It's a way to use them with clinical data without violating HIPAA.

Three Components. One Safe Workspace.

MedScrub isn't another AI scribe. It's infrastructure that makes any consumer LLM safe for clinical work.

PHI Proxy

All 18 HIPAA Safe Harbor identifiers are stripped before data reaches any AI model. Names, dates, MRNs, addresses — replaced with deterministic tokens that are fully reversible on the way back.

  • De-identification before AI processing
  • Re-identification in the final output
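The de-identify/re-identify round trip above can be sketched in a few lines. This is an illustrative sketch, not MedScrub's actual implementation — the token format, category names, and the assumption that PHI values are supplied up front are all hypothetical:

```python
# Illustrative sketch of deterministic, reversible PHI tokenization.
# Token format and category names are hypothetical, not MedScrub's API.
import re

class PHIProxy:
    """Replaces PHI values with stable tokens and restores them later."""

    def __init__(self):
        self._forward = {}  # original value -> token
        self._reverse = {}  # token -> original value

    def _token_for(self, category: str, value: str) -> str:
        # Deterministic: the same value always maps to the same token.
        if value not in self._forward:
            token = f"[{category}_{len(self._forward) + 1}]"
            self._forward[value] = token
            self._reverse[token] = value
        return self._forward[value]

    def deidentify(self, text: str, phi_values: dict[str, list[str]]) -> str:
        for category, values in phi_values.items():
            for value in values:
                text = text.replace(value, self._token_for(category, value))
        return text

    def reidentify(self, text: str) -> str:
        # Tokens in the model's output are swapped back to the originals.
        return re.sub(r"\[[A-Z]+_\d+\]",
                      lambda m: self._reverse.get(m.group(0), m.group(0)),
                      text)

proxy = PHIProxy()
note = "Jane Doe (MRN 445-22) presented on 2024-03-01 with chest pain."
safe = proxy.deidentify(note, {"NAME": ["Jane Doe"],
                               "MRN": ["445-22"],
                               "DATE": ["2024-03-01"]})
# safe: "[NAME_1] (MRN [MRN_2]) presented on [DATE_3] with chest pain."
restored = proxy.reidentify(safe)
# restored == note
```

Because the mapping is deterministic, the same patient always maps to the same token within a session, so the model can still reason about "the same person" without ever seeing who that person is.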

Clinical Data Repository

Patient data syncs from your EHR into a local CDR that your practice owns. Labs, vitals, conditions, medications, encounters — structured FHIR data that powers every AI workflow.

  • Your data, your infrastructure
  • Automatic sync from Epic & athenahealth
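To make the CDR concrete, here is a minimal sketch of a local store keyed by FHIR resource type. The class name, storage shape, and the sample Observation are illustrative assumptions — real FHIR resources carry many more fields:

```python
# Minimal sketch of a local CDR keyed by FHIR resource type.
# Class and method names are illustrative, not MedScrub's actual API.
from collections import defaultdict

class ClinicalDataRepository:
    """Stores synced FHIR resources locally, grouped by type."""

    def __init__(self):
        self._store = defaultdict(list)

    def ingest(self, resource: dict) -> None:
        # Every FHIR resource declares its type in "resourceType".
        self._store[resource["resourceType"]].append(resource)

    def for_patient(self, resource_type: str, patient_ref: str) -> list[dict]:
        # FHIR links clinical resources to a patient via subject.reference.
        return [r for r in self._store[resource_type]
                if r.get("subject", {}).get("reference") == patient_ref]

cdr = ClinicalDataRepository()
cdr.ingest({"resourceType": "Observation",
            "code": {"text": "Hemoglobin A1c"},
            "valueQuantity": {"value": 6.8, "unit": "%"},
            "subject": {"reference": "Patient/123"}})

labs = cdr.for_patient("Observation", "Patient/123")
```

Because the data lands as structured FHIR rather than free text, every downstream workflow — SOAP notes, summaries, prior auth — queries the same repository instead of re-parsing the chart.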

Any Consumer LLM

You choose the model — OpenAI, Claude, Gemini, Mistral, or local models via Ollama. API keys stay on your machine. As models improve, MedScrub improves with them — no vendor lock-in.

  • API keys stored locally, never sent to MedScrub
  • Switch models anytime — no code changes

EHR → CDR → PHI Proxy → Consumer LLM → Re-identified Output → EHR Write-back

Patient data never reaches an AI model with identifiers attached. The physician sees the final output with all context restored.
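The pipeline above can be sketched end to end in a few functions: de-identify, call the model, re-identify. The model call here is a stub and the function names are illustrative — the point is only that the LLM step sits between the two proxy steps and never sees an identifier:

```python
# End-to-end sketch of the flow: de-identify -> model -> re-identify.
# The model call is stubbed; all names here are illustrative.

def deidentify(text: str, phi: dict[str, str]) -> tuple[str, dict[str, str]]:
    mapping = {}
    for category, value in phi.items():
        token = f"[{category}]"
        text = text.replace(value, token)
        mapping[token] = value
    return text, mapping

def model_call(prompt: str) -> str:
    # Stand-in for any consumer LLM; it only ever sees tokens.
    return f"Assessment for {prompt.split()[0]}: stable."

def reidentify(text: str, mapping: dict[str, str]) -> str:
    for token, value in mapping.items():
        text = text.replace(token, value)
    return text

note = "Jane Doe reports improved glucose control."
safe, mapping = deidentify(note, {"NAME": "Jane Doe"})
draft = model_call(safe)            # "Assessment for [NAME]: stable."
final = reidentify(draft, mapping)  # "Assessment for Jane Doe: stable."
```

Everything between `deidentify` and `reidentify` — including the network hop to the model provider — operates on tokens only, which is what makes the guarantee architectural rather than contractual.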

Why This Approach Wins

1. Capital efficient

We don’t train models. We don’t run GPU clusters. Consumer LLMs do the heavy lifting — we make them safe. As models get cheaper and better, so does MedScrub.

2. No vendor lock-in

When a better model comes out — and one always does — you switch in seconds. No migration, no retraining, no contract renegotiation.

3. Data gravity compounds

The CDR accumulates structured clinical data over time. Every sync makes SOAP notes more accurate, pre-visit summaries more complete, and prior auth letters more evidence-based.

4. Trust by architecture

PHI de-identification isn’t a policy — it’s a technical guarantee. The proxy strips identifiers deterministically, so an AI model never receives your patient’s name in the first place.

Built by 1PuttHealth

1PuttHealth helps companies build on healthcare interoperability standards — FHIR, EHR integrations, clinical data infrastructure. MedScrub is our flagship product: the physician sidekick we kept seeing the market ask for.

See It Yourself

If you're a physician spending hours on documentation, or a developer building clinical AI features — we built this for you.