HHS Section 1557 (Patient Care Decision Support Tools): a builder's guide
Informational only — not legal advice. Verify against the cited regulator-published text and consult counsel for production deployments. See
AI-DISCLOSURE.md in this package.
If your healthcare organization deploys an AI/ML clinical decision-support tool — sepsis risk scores, discharge risk models, prior-auth scoring, AI triage chatbots, anything that informs a care decision — and the organization receives any federal financial assistance (Medicare, Medicaid, federally-qualified health center funding, etc.), HHS Section 1557's Patient Care Decision Support Tool (PCDST) nondiscrimination rule applies to you. The rule has been enforceable since May 1, 2025 and is one of the most concrete federal AI compliance regimes operating today. This guide covers what it requires, who is covered, what counts as compliance, and the elements that catch builders off guard.
What 45 CFR § 92.210 actually requires
On May 6, 2024, the U.S. Department of Health and Human Services Office for Civil Rights (HHS OCR) published a final rule (89 Fed. Reg. 37522) implementing Section 1557 of the Affordable Care Act (42 U.S.C. § 18116). The rule, codified at 45 CFR Part 92 and effective in stages with PCDST enforcement starting May 1, 2025, imposes nondiscrimination obligations on covered entities' use of "patient care decision support tools" — a deliberately broad category that includes AI/ML-based clinical decision support.
A covered entity must:
- Identify uses of PCDSTs in its health programs and activities that employ input variables or factors that measure race, color, national origin, sex, age, or disability.
- Mitigate the risk of discrimination resulting from each such tool's use.
Both obligations are framed as "reasonable efforts" — OCR has stated in commentary that what is reasonable scales with the entity's size, resources, and the tool's risk profile. But documentation of those efforts is essential.
Penalties: loss of federal financial assistance, OCR-imposed corrective action plans, potential private right-of-action discrimination claims, and reputational fallout. OCR has historically taken Section 1557 seriously in enforcement.
What's a "PCDST" — and why it sweeps in basically every clinical AI
OCR's definition of patient care decision support tool is broad on purpose:
"any automated or non-automated tool, mechanism, method, technology, or combination thereof used by a covered entity to support clinical decision-making in its health programs or activities."
This explicitly includes:
- AI/ML clinical decision support tools (the central focus of OCR's commentary).
- Rule-based scoring algorithms (e.g., MEWS, NEWS, qSOFA, CHA₂DS₂-VASc).
- Tools that consume race, sex, age, etc. as input variables — the Epic Sepsis Model, the Beth Israel Deaconess Discharge Risk Score, and many commonly-deployed risk scores.
- Tools that produce clinical scores even where the underlying computation is non-AI. Statistical models count.
- Triage and routing tools that affect access to clinical resources.
The "non-automated" inclusion is significant: paper-based scoring sheets that a clinician uses to allocate care also count. The rule is not specific to digital tools.
Three definitional gotchas:
- "Used by a covered entity to support clinical decision-making." A tool merely available in the EHR but never actually consulted in a clinical decision is arguably not in scope. A tool that's part of any routine workflow — even informally — is in scope.
- "Input variables or factors that measure race, color, national origin, sex, age, or disability." This is broader than just having a literal "race" field. Tools that use ZIP code (proxy for race), insurance type (proxy for income/national origin), or chronic condition counts (proxy for disability) are within the identification obligation.
- OCR's "reasonable efforts" standard scales but doesn't disappear. A small rural clinic doesn't have the same compliance burden as a major academic medical center, but both must do something.
Who is a "covered entity"
Section 1557 applies broadly to any health program or activity that receives federal financial assistance. In practice this includes:
- Most healthcare providers that participate in Medicare or Medicaid (effectively almost all hospitals, most physician practices, most long-term care facilities).
- Federally-qualified health centers and their clinical operations.
- Health insurers in HHS-administered marketplaces (and many Medicaid managed care organizations).
- HHS-administered programs themselves (Indian Health Service, etc.).
- Any program receiving federal grants with health-related components.
Three exclusions to know about:
- ERISA self-funded employer health plans that don't otherwise receive federal financial assistance are typically outside Section 1557 (though they may have parallel obligations under ERISA + state law).
- Cash-only or fully-private practices that decline all federal funding may be outside Section 1557 (rare; most providers participate in Medicare).
- Federal contractors providing non-health services — not in scope even if the contractor receives federal money.
What "reasonable efforts" actually looks like
OCR commentary and informal guidance suggest a risk-tiered approach (a short encoding sketch follows the two lists below):
Higher-risk tools (more documentation expected):
- Tools that explicitly use race, sex, age, or disability as input variables.
- Tools used in life-threatening contexts (sepsis prediction, organ transplant risk).
- Tools that affect resource allocation across patients (ICU bed triage, transplant priority).
Lower-risk tools (lighter documentation acceptable):
- Tools that don't use protected-class variables.
- Tools that surface information without producing a ranking or score.
- Tools whose output is one of many factors in a clinician's unstructured judgment.
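One way to encode that tiering in an internal governance tracker, as a minimal sketch; the field names and the tier boundary are assumptions for illustration, not a regulatory schema.

```typescript
// Illustrative tiering rule mirroring the two lists above. The field
// names are assumptions for this sketch, not a regulatory schema.
interface ToolProfile {
  usesProtectedClassVariables: boolean; // direct fields or suspected proxies
  lifeThreateningContext: boolean;      // e.g., sepsis prediction, transplant risk
  allocatesResources: boolean;          // e.g., ICU bed triage, transplant priority
}

type RiskTier = "higher" | "lower";

function riskTier(t: ToolProfile): RiskTier {
  // Any one higher-risk signal is enough to expect fuller documentation.
  return t.usesProtectedClassVariables || t.lifeThreateningContext || t.allocatesResources
    ? "higher"
    : "lower";
}
```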
Concrete documentation to maintain for each PCDST (a record-shape sketch follows this list):
- Tool inventory entry — name, vendor, purpose, deployment date, input variables (with notation of any protected-class factors), use cases, decision contexts.
- Mitigation documentation — what the entity has done to identify and mitigate the risk of discriminatory output. Examples: vendor's bias-audit report, internal performance comparison across protected classes, threshold adjustments, monitoring dashboards.
- Designation — documentation that the Civil Rights Coordinator (required under § 92.7) is responsible for PCDST nondiscrimination.
- Periodic review — at least annual review of the tool's performance, documented.
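A minimal sketch of what one inventory record covering those four items might look like; every field name here is an illustrative assumption.

```typescript
// Sketch of a single PCDST inventory record covering the four items
// above. Every field name is an illustrative assumption.
interface PcdstInventoryEntry {
  name: string;
  vendor: string;
  purpose: string;
  deploymentDate: string;               // ISO 8601
  inputVariables: string[];
  protectedClassFactors: string[];      // direct fields and suspected proxies
  useCases: string[];
  decisionContexts: string[];
  mitigation: {
    vendorBiasAuditOnFile: boolean;
    subgroupPerformanceComparison: boolean; // internal comparison across protected classes
    thresholdAdjustments?: string;
    monitoringDashboardUrl?: string;
  };
  responsibleCoordinator: string;       // the § 92.7 Civil Rights Coordinator
  lastReviewed: string;                 // ISO 8601; reviewed at least annually
}
```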
The patient-facing element (most builders miss this)
The PCDST rule's primary obligation is governance-side — identification and mitigation. But where the entity exposes AI-informed decisions to patients (or where notice-of-availability obligations under § 92.11 apply), a patient-facing notice that automated tools may inform clinical decisions is also expected. The notice typically lives in:
- The Section 1557 notice of nondiscrimination (printed and posted per § 92.10).
- Patient-facing materials about specific decisions (denial letters, triage explanations).
Plain-language template:
"Notice — Use of Decision-Support Tools in Your Care: Some clinical decisions in your care may be informed by automated decision-support tools, including artificial-intelligence and machine-learning systems. These tools assist your healthcare team and do not replace the judgment of a licensed clinician. You have the right to discuss any care decision with your provider. If you believe you have experienced discrimination on the basis of race, color, national origin, sex, age, or disability in connection with these tools or any other aspect of your care, please contact our Civil Rights Coordinator at [contact] or file a complaint with the HHS Office for Civil Rights at https://www.hhs.gov/ocr/."
How Section 1557 stacks with other rules
Section 1557 is the federal floor for healthcare AI nondiscrimination. Builders deploying AI/ML clinical tools at scale need to layer:
- California SB 1120 (Physicians Make Decisions Act) — adds a procedural requirement: AI used in utilization review for medical necessity must be reviewed by a licensed physician considering the enrollee's clinical circumstances. Effective January 1, 2025. California-specific.
- FDA Predetermined Change Control Plans (FD&C Act § 515C, December 2024 final guidance) — applies to AI/ML medical devices that have been cleared/authorized by FDA. Adds device-labeling AI/ML disclosure obligations.
- HIPAA Privacy Rule (45 CFR Part 164) — separate privacy regime. Section 1557 doesn't change HIPAA; both apply.
- State medical-board rules on AI in scope of practice (Texas and several other states have AI-specific scope-of-practice rules).
- EU AI Act + GDPR Art. 22 — for any care delivered to EU residents, including telehealth.
- NIST AI RMF + healthcare-sector profile — voluntary but widely treated as a benchmark by hospital AI committees.
Common compliance pitfalls
- Treating "vendor said the model was bias-audited" as enough. OCR commentary expects the covered entity to do its own due diligence; vendor audits are an input, not a substitute.
- Forgetting non-AI tools. The rule covers any decision-support tool, not just AI. Rule-based scoring tools that use protected-class variables are within scope.
- Documenting only the highest-risk tools. OCR expects an identification process for all PCDSTs, even if most have minimal mitigation documentation.
- No designated Civil Rights Coordinator with PCDST awareness. The Coordinator role under § 92.7 needs to know about the entity's AI/ML tools; siloing AI governance from CRC creates compliance gaps.
- Skipping the patient-facing language under the assumption that PCDST obligations are purely governance. Where AI informs decisions exposed to patients, notice is expected.
- Annual review never happens. Documentation that the tool's performance is monitored and reviewed is part of the "reasonable efforts" standard.
How plainstamp helps
plainstamp ships a us-hhs-section-1557-pcdst-2024 rule that returns the live disclosure-element checklist for the PCDST regime, plain-language and formal-language patient-facing notices, a citation back to 45 CFR § 92.210 plus the Federal Register source URL, and a last_verified date. Lookup:
```
npx plainstamp lookup --jurisdiction us \
  --channel ai-generated-content \
  --use-case healthcare
```
This returns the Section 1557 PCDST rule. For California-operating entities, also query --jurisdiction us-ca to layer on SB 1120's physician-review requirement.
For multi-jurisdiction systems, query each state's healthcare jurisdiction in parallel — the disclosure copy must satisfy each applicable layer.
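A sketch of that layering by shelling out to the CLI shown above; the loop and result handling are assumptions, so check the plainstamp docs for the actual output format before parsing.

```typescript
import { execSync } from "node:child_process";

// Sketch: run the documented lookup once per applicable jurisdiction and
// collect the raw output. How the CLI formats its output is an assumption
// to verify against the plainstamp docs for your version.
const jurisdictions = ["us", "us-ca"]; // federal floor plus the California layer

const layers = jurisdictions.map((jurisdiction) => ({
  jurisdiction,
  result: execSync(
    `npx plainstamp lookup --jurisdiction ${jurisdiction} ` +
      `--channel ai-generated-content --use-case healthcare`,
    { encoding: "utf8" },
  ),
}));

// The disclosure copy you ship must satisfy every returned layer,
// not just the federal rule.
```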
The minimum viable compliance posture
If your organization is starting from zero on PCDST nondiscrimination, ship these four artifacts in order:
- PCDST inventory — a spreadsheet/database of every clinical decision-support tool in use, with input variables and protected-class flags.
- Civil Rights Coordinator briefing — your CRC reads the inventory and signs off that mitigation efforts are documented for each tool.
- Patient-facing notice — added to the entity's Section 1557 nondiscrimination notice and to patient-facing materials about AI-informed decisions.
- Annual review schedule — calendar entry for next-year review of each tool's performance and mitigation.
Then, layer the higher-fidelity work — vendor diligence, performance testing, bias audits — onto the higher-risk tools first.
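Under the same illustrative record shape as the inventory sketch above, a quick check for tools whose annual review is overdue:

```typescript
// Sketch: flag tools whose last documented review is more than a year
// old, using the illustrative lastReviewed field from the inventory
// record shape sketched earlier.
function overdueForReview(
  entries: { name: string; lastReviewed: string }[],
  today: Date = new Date(),
): string[] {
  const oneYearMs = 365 * 24 * 60 * 60 * 1000;
  return entries
    .filter((e) => today.getTime() - new Date(e.lastReviewed).getTime() > oneYearMs)
    .map((e) => e.name);
}

// overdueForReview([{ name: "Sepsis Model", lastReviewed: "2024-01-15" }])
// returns ["Sepsis Model"] once that review date is over a year old.
```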
Source-of-truth links
- 45 CFR § 92.210 (PCDST nondiscrimination) (ecfr.gov)
- HHS OCR final rule, May 6, 2024 (89 Fed. Reg. 37522) (federalregister.gov)
- Section 1557 of the ACA (42 U.S.C. § 18116) (uscode.house.gov)
- HHS OCR Section 1557 program page (hhs.gov/ocr)
plainstamp is maintained by an autonomous AI agent operating under KS Elevated Solutions LLC. Accuracy reports, rule-update suggestions, and security disclosures: helpfulbutton140@agentmail.to.