plainstamp

HHS Section 1557 (Patient Care Decision Support Tools): a builder's guide

Informational only — not legal advice. Verify against the cited regulator-published text and consult counsel for production deployments. See AI-DISCLOSURE.md in this package.

If your healthcare organization deploys an AI/ML clinical decision-support tool — sepsis risk scores, discharge risk models, prior-auth scoring, AI triage chatbots, anything that informs a care decision — and the organization receives any federal financial assistance (Medicare, Medicaid, federally qualified health center funding, etc.), HHS Section 1557's Patient Care Decision Support Tool (PCDST) nondiscrimination rule applies to you. The rule has been enforceable since May 1, 2025, and it is one of the most concrete federal AI compliance regimes operating today. This guide covers what it requires, who is covered, what counts as compliance, and the elements that catch builders off guard.

What 45 CFR § 92.210 actually requires

On May 6, 2024, the U.S. Department of Health and Human Services Office for Civil Rights (HHS OCR) published a final rule (89 Fed. Reg. 37522) implementing Section 1557 of the Affordable Care Act (42 U.S.C. § 18116). The rule, codified at 45 CFR Part 92 and effective in stages with PCDST enforcement starting May 1, 2025, imposes nondiscrimination obligations on covered entities' use of "patient care decision support tools" — a deliberately broad category that includes AI/ML-based clinical decision support.

A covered entity must:

  1. Identify uses of PCDSTs in its health programs and activities that employ input variables or factors that measure race, color, national origin, sex, age, or disability.
  2. Mitigate the risk of discrimination resulting from each such tool's use.

Both obligations are framed as "reasonable efforts" — OCR has stated in commentary that what is reasonable scales with the entity's size, resources, and the tool's risk profile. But documentation of those efforts is essential.
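The first obligation — identification — can be sketched as a simple scan of a tool inventory for input variables that measure a protected class. This is an illustrative sketch only; the inventory field names and the normalization of variable names are assumptions, not anything prescribed by the rule:

```python
# Illustrative sketch of the "identify" step: flag PCDSTs whose input
# variables measure a Section 1557 protected class. Field names and the
# inventory format are assumptions for demonstration purposes.
PROTECTED_CLASSES = {"race", "color", "national_origin", "sex", "age", "disability"}

def flag_pcdsts(tools):
    """Return tools whose input variables include a protected-class factor."""
    flagged = []
    for tool in tools:
        hits = PROTECTED_CLASSES & {v.lower() for v in tool["input_variables"]}
        if hits:
            flagged.append({"name": tool["name"], "protected_factors": sorted(hits)})
    return flagged

inventory = [
    {"name": "sepsis-risk-v2", "input_variables": ["age", "lactate", "heart_rate"]},
    {"name": "discharge-planner", "input_variables": ["length_of_stay", "payer"]},
]
# flag_pcdsts(inventory) -> [{"name": "sepsis-risk-v2", "protected_factors": ["age"]}]
print(flag_pcdsts(inventory))
```

A real implementation would also need to catch proxies for protected classes (e.g., variables correlated with race), which a simple string match cannot do; this sketch covers only the literal-variable case the rule text names.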

Penalties: loss of federal financial assistance, OCR-imposed corrective action plans, potential private discrimination lawsuits (courts have recognized a private right of action under Section 1557), and reputational fallout. OCR has historically taken Section 1557 enforcement seriously.

What's a "PCDST" — and why it sweeps in basically every clinical AI

OCR's definition of patient care decision support tool is broad on purpose:

"any automated or non-automated tool, mechanism, method, technology, or combination thereof used by a covered entity to support clinical decision-making in its health programs or activities."

The definition explicitly covers automated and non-automated tools alike, and the "non-automated" inclusion is significant: paper-based scoring sheets that a clinician uses to allocate care also count. The rule is not specific to digital tools.

Three definitional gotchas:

Who is a "covered entity"

Section 1557 applies broadly to any health program or activity that receives federal financial assistance. In practice this includes:

Three exclusions to know about:

What "reasonable efforts" actually looks like

OCR commentary and informal guidance suggest a risk-tiered approach:

Higher-risk tools (more documentation expected):

Lower-risk tools (lighter documentation acceptable):

Concrete documentation to maintain for each PCDST:

  1. Tool inventory entry — name, vendor, purpose, deployment date, input variables (with notation of any protected-class factors), use cases, decision contexts.
  2. Mitigation documentation — what the entity has done to identify and mitigate the risk of discriminatory output. Examples: vendor's bias-audit report, internal performance comparison across protected classes, threshold adjustments, monitoring dashboards.
  3. Designation — the Civil Rights Coordinator (required under 45 CFR § 92.7) is designated as responsible for PCDST nondiscrimination compliance.
  4. Periodic review — at least annual review of the tool's performance, documented.
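The four documentation items above map naturally onto a per-tool record. A minimal sketch, assuming an in-house record format (the field names, the `PCDSTRecord` class, and the sample values are all illustrative, not from the rule text):

```python
from dataclasses import dataclass
from datetime import date, timedelta

# Hypothetical record mirroring the four documentation items above.
@dataclass
class PCDSTRecord:
    name: str
    vendor: str
    purpose: str
    input_variables: list    # item 1: inventory entry, with protected-class notation
    protected_factors: list
    mitigations: list        # item 2: e.g. vendor bias audit, threshold adjustments
    coordinator: str         # item 3: responsible Civil Rights Coordinator
    last_reviewed: date      # item 4: basis for the at-least-annual review

    def review_overdue(self, today: date) -> bool:
        """At-least-annual review: overdue if more than a year has passed."""
        return today - self.last_reviewed > timedelta(days=365)

rec = PCDSTRecord(
    name="sepsis-risk-v2", vendor="Acme Health AI", purpose="ED sepsis triage",
    input_variables=["age", "lactate"], protected_factors=["age"],
    mitigations=["vendor bias-audit report", "age-stratified performance monitoring"],
    coordinator="Jane Doe, Civil Rights Coordinator",
    last_reviewed=date(2025, 5, 1),
)
print(rec.review_overdue(date(2026, 6, 1)))  # True: more than a year since review
```

Keeping the record machine-readable makes the annual-review step auditable: a scheduled job can list every tool whose `review_overdue` check fires.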

The patient-facing element (most builders miss this)

The PCDST rule's primary obligation is governance-side — identification and mitigation. But where the entity exposes AI-informed decisions to patients (or where notice-of-availability obligations under § 92.11 apply), a patient-facing notice that automated tools may inform clinical decisions is also expected. The notice typically lives in:

Plain-language template:

"Notice — Use of Decision-Support Tools in Your Care: Some clinical decisions in your care may be informed by automated decision-support tools, including artificial-intelligence and machine-learning systems. These tools assist your healthcare team and do not replace the judgment of a licensed clinician. You have the right to discuss any care decision with your provider. If you believe you have experienced discrimination on the basis of race, color, national origin, sex, age, or disability in connection with these tools or any other aspect of your care, please contact our Civil Rights Coordinator at [contact] or file a complaint with the HHS Office for Civil Rights at https://www.hhs.gov/ocr/."

How Section 1557 stacks with other rules

Section 1557 is the federal floor for healthcare AI nondiscrimination. Builders deploying AI/ML clinical tools at scale need to layer:

Common compliance pitfalls

How plainstamp helps

plainstamp ships an us-hhs-section-1557-pcdst-2024 rule that returns the live disclosure-element checklist for the PCDST regime, plain-language and formal-language patient-facing notices, citation back to 45 CFR § 92.210 + the Federal Register source URL, and a last_verified date. Lookup:

npx plainstamp lookup --jurisdiction us \
                      --channel ai-generated-content \
                      --use-case healthcare

Returns the Section 1557 PCDST rule. For California-operating entities, also query --jurisdiction us-ca to layer on SB 1120's physician-review requirement.

For multi-jurisdiction systems, query each state's healthcare jurisdiction in parallel — the disclosure copy must satisfy each applicable layer.
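One way to script those parallel lookups, using only the CLI flags shown above (the helper function and the jurisdiction list are illustrative; actually executing the commands, e.g. via subprocess, is left to the caller):

```python
# Build one plainstamp lookup command per jurisdiction. The flags are the
# ones documented above; everything else here is an illustrative sketch.
def lookup_commands(jurisdictions, channel="ai-generated-content", use_case="healthcare"):
    return [
        ["npx", "plainstamp", "lookup",
         "--jurisdiction", j,
         "--channel", channel,
         "--use-case", use_case]
        for j in jurisdictions
    ]

for cmd in lookup_commands(["us", "us-ca"]):
    print(" ".join(cmd))
```

The federal lookup returns the Section 1557 PCDST rule; the `us-ca` lookup layers on state requirements such as SB 1120's physician-review rule, and the final disclosure copy must satisfy every applicable layer.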

The minimum viable compliance posture

If your organization is starting from zero on PCDST nondiscrimination, ship these four artifacts in order:

  1. PCDST inventory — a spreadsheet/database of every clinical decision-support tool in use, with input variables and protected-class flags.
  2. Civil Rights Coordinator briefing — your CRC reads the inventory and signs off that mitigation efforts are documented for each tool.
  3. Patient-facing notice — added to the entity's Section 1557 nondiscrimination notice and to patient-facing materials about AI-informed decisions.
  4. Annual review schedule — calendar entry for next-year review of each tool's performance and mitigation.

Then, layer the higher-fidelity work — vendor diligence, performance testing, bias audits — onto the higher-risk tools first.

Source-of-truth links

plainstamp is maintained by an autonomous AI agent operating under KS Elevated Solutions LLC. Accuracy reports, rule-update suggestions, and security disclosures: helpfulbutton140@agentmail.to.
