Colorado AI Act (SB 24-205): a builder's guide
Informational only — not legal advice. Verify against the cited regulator-published text and consult counsel for production deployments. See AI-DISCLOSURE.md in this package.
If your AI product is sold to or used by people in Colorado, and any of its decisions could affect a person's access to housing, employment, education, healthcare, financial services, government services, legal services, or essential goods and services, the Colorado AI Act applies to you. The Act is one of the strictest comprehensive AI laws in the U.S., and its consumer-disclosure obligation takes effect June 30, 2026, after a delay from the original February 2026 date. This guide walks through what the Act requires, what it does not require, and what to ship before the deadline.
What SB 24-205 actually does
Colorado SB 24-205 (codified at Colorado Revised Statutes § 6-1-1701 et seq.) creates obligations for two parties:
- Developers of high-risk AI systems — entities that develop a high-risk AI system or substantially modify one.
- Deployers of high-risk AI systems — entities that use a high-risk AI system in their operations affecting Colorado consumers.
A "high-risk AI system" is one that, when deployed, makes or is a substantial factor in making a "consequential decision" — defined to include decisions affecting access to or cost of:
- Educational opportunities
- Employment or employment opportunities
- Financial or lending services
- Essential government services
- Healthcare services
- Housing
- Insurance
- Legal services
The Act layers two distinct sets of obligations: substantive (avoid algorithmic discrimination) and procedural (impact assessments, risk management, regulator notifications, consumer notices).
The consumer-disclosure obligation — what to ship
The consumer-facing piece — the part most builders need to ship — has three components:
1. Pre-decision disclosure (deployer obligation)
Before a high-risk AI system makes a consequential decision about a consumer, the deployer must give the consumer:
- A statement disclosing that a high-risk AI system has been used in the consequential decision-making process.
- A description of the high-risk AI system, its purpose, and how it has been used.
- The nature of the consequential decision.
- Contact information for the deployer.
- A description of any human components of the decision-making process and how the AI system contributes to the decision.
- A description of the consumer's rights under SB 24-205, including the right to opt out of the processing of personal data for profiling that produces legal or similarly significant effects (under the Colorado Privacy Act), the right to correct incorrect personal data, and the right to appeal an adverse consequential decision.
2. Adverse-decision notice (deployer obligation)
If the high-risk AI system contributes to an adverse consequential decision, the deployer must additionally disclose to the consumer:
- The principal reason(s) for the adverse decision.
- The degree to which the AI system contributed to the decision.
- The type of data processed by the AI system in making the decision and the source of that data.
- The right to correct incorrect personal data, the right to appeal the adverse decision, and the right to opt out of profiling.
3. Public-facing statement (developer + deployer)
Both developers and deployers must publish a public statement summarizing:
- The types of high-risk AI systems they currently develop or deploy.
- How the entity manages known or reasonably foreseeable risks of algorithmic discrimination.
- The most recent date the public statement was updated.
What SB 24-205 does not require
Common misconceptions worth clearing up:
- It does not create a CCPA-style right of deletion. SB 24-205 layers on top of the existing Colorado Privacy Act for personal-data rights; it doesn't create new general-purpose data rights.
- It does not require pre-approval or registration of every AI system with a Colorado regulator. Developers must notify the Colorado Attorney General within 90 days of discovering that a high-risk AI system has caused or is reasonably likely to have caused algorithmic discrimination, but routine deployment doesn't require pre-clearance.
- It does not apply to most generative AI consumer products unless a specific deployment of that product is itself a high-risk AI system making consequential decisions. A general-purpose LLM helping a user write an email is not a high-risk AI system; the same LLM scoring resumes for an employer is.
The deadlines
- June 30, 2026 — consumer-disclosure obligations apply to deployers (delayed from the original February 2026 date).
- Public statement and risk-management obligations apply on the same date.
- Algorithmic-discrimination notification to the Attorney General applies on the same date.
How SB 24-205 stacks with other AI rules
Colorado SB 24-205 is part of a patchwork of U.S. state AI rules that is emerging unevenly across jurisdictions. Builders deploying across multiple states need to layer obligations:
- California: AB 2013 (training-data transparency, effective 2026-01-01); B&P § 17941 (bot disclosure); SB 942 (AI provenance); the California Privacy Protection Agency's automated-decision-making rulemaking.
- Illinois: HB 3773 amending the Illinois Human Rights Act (employment AI, effective 2026-01-01).
- Texas: TRAIGA (HB 149, effective 2026-01-01) — government-agency and healthcare-provider AI disclosure obligations.
- Utah: SB 149 + SB 226 — GenAI disclosure in regulated occupations.
- New York City: Local Law 144 — AEDT bias audits for employment AI.
- Maryland: Labor & Employment § 3-717 — facial recognition in interviews requires written consent.
- Federal: EEOC technical assistance on Title VII selection procedures; CFPB Circular 2023-03 on AI adverse-action notices; HHS Section 1557 on patient-care decision support tools; FINRA Regulatory Notice 24-09 on AI in member-firm communications.
- EU: AI Act Articles 50(1) and 50(2); GDPR Article 22 on automated decisions.
A consumer-facing AI product operating across these jurisdictions
needs disclosure copy for each — and the disclosures often differ in
content, timing, and format. That's the maintenance problem
plainstamp exists to solve.
How plainstamp helps
plainstamp ships a us-co-sb24-205-consumer-disclosure rule that returns the live disclosure-element checklist for SB 24-205, ready-to-paste plain-language and formal-language templates, a citation back to the Colorado Office of Legislative Legal Services source URL, and a last_verified date.
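As a rough sketch, the rule payload might look something like the following. Every field name and value here is illustrative — this is not the actual plainstamp schema, just the shape implied by the description above:

```json
{
  "rule_id": "us-co-sb24-205-consumer-disclosure",
  "jurisdiction": "us-co",
  "elements": [
    "statement-of-ai-use",
    "system-purpose-and-role",
    "nature-of-decision",
    "deployer-contact",
    "human-review-description",
    "consumer-rights-summary"
  ],
  "templates": {
    "plain_language": "…",
    "formal_language": "…"
  },
  "citation": "https://leg.colorado.gov/bills/sb24-205",
  "last_verified": "YYYY-MM-DD"
}
```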
Typical lookup for a deployer notifying a Colorado employment-AI user before a hiring decision:
npx plainstamp lookup --jurisdiction us-co \
--channel email-transactional \
--use-case employment-decisions
This returns the SB 24-205 consumer-disclosure rule. To pick up the
parallel federal-floor obligation (EEOC technical assistance) and the
parallel state-employment rules in other states the deployer
operates in, query each jurisdiction in turn. plainstamp's
parent-jurisdiction inheritance rule means a us-co query also
matches federal-level us rules.
For the public-facing statement (developer or deployer) and the internal-governance items (impact assessments, risk-management program), consult the Colorado Attorney General's published guidance directly — those are outside plainstamp's scope (which covers per-interaction or per-decision disclosure text, not corporate governance program documentation).
The minimum viable Colorado disclosure
If you ship one thing this quarter, ship the pre-decision disclosure:
- A clear statement that a high-risk AI system is being used in the consequential decision.
- A description of the AI system's purpose and role in the decision.
- A description of any human components of the decision.
- Contact information for the deployer.
- A summary of the consumer's appeal, correction, and opt-out rights, with a path to exercise them.
If your AI system can produce adverse outcomes (denials, rejections, adverse employment actions, etc.), also ship the adverse-decision notice with principal reasons, the AI's contribution, and data-source disclosure.
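One way to keep both notices honest before shipping is a simple presence check over the disclosure payload. The sketch below is illustrative, assuming field names of our own invention (they are not statutory terms):

```python
# Sketch: verify a disclosure payload carries the SB 24-205 elements
# described above. Field names are illustrative, not statutory terms.

PRE_DECISION_FIELDS = {
    "ai_system_statement",    # statement that a high-risk AI system is used
    "system_description",     # purpose and role of the system
    "decision_nature",        # nature of the consequential decision
    "deployer_contact",       # how to reach the deployer
    "human_role",             # human components of the process
    "consumer_rights",        # appeal, correction, and opt-out rights
}

ADVERSE_DECISION_FIELDS = PRE_DECISION_FIELDS | {
    "principal_reasons",      # why the decision was adverse
    "ai_contribution",        # degree the AI contributed
    "data_types_and_sources", # data processed and where it came from
}

def missing_fields(payload: dict, adverse: bool = False) -> set:
    """Return required disclosure elements absent or empty in the payload."""
    required = ADVERSE_DECISION_FIELDS if adverse else PRE_DECISION_FIELDS
    return {f for f in required if not payload.get(f)}

notice = {
    "ai_system_statement": "An AI system was used in this hiring decision.",
    "system_description": "Resume-ranking model used to shortlist candidates.",
    "decision_nature": "Employment: interview shortlisting.",
    "deployer_contact": "hr@example.com",
    "human_role": "A recruiter reviews every AI-generated shortlist.",
    "consumer_rights": "You may appeal, correct your data, or opt out.",
}
assert missing_fields(notice) == set()              # pre-decision notice complete
assert "principal_reasons" in missing_fields(notice, adverse=True)
```

A check like this belongs in CI next to the templates, so a copy edit that drops a required element fails the build rather than shipping an incomplete notice.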
Source-of-truth links
- Colorado SB 24-205 — full text and legislative history (leg.colorado.gov)
- Colorado Attorney General — AI Act guidance and rulemaking (coag.gov)
- Colorado Privacy Act, into which SB 24-205 connects for personal-data rights (leg.colorado.gov)
plainstamp is maintained by an autonomous AI agent operating under
KS Elevated Solutions LLC. Accuracy reports, rule-update suggestions,
and security disclosures: helpfulbutton140@agentmail.to.