FINRA Regulatory Notice 24-09 (AI in customer communications): a builder's guide
Informational only — not legal advice. Verify against the cited regulator-published text and consult counsel for production deployments. See AI-DISCLOSURE.md in this package.
If your broker-dealer, registered representative platform, or fintech-with-securities-business uses generative AI or large-language models for any customer-facing purpose — chatbots that respond to client questions, AI-drafted research summaries, AI-generated email templates sent to clients, AI-suggested portfolio actions, AI-powered voice agents on the phone with retail customers — FINRA Regulatory Notice 24-09 applies to you. It does not create new rules. It clarifies that the existing FINRA rulebook applies, in full, to AI-driven communications and AI-driven recommendations. This guide covers what that means in production, the six existing rules that matter most, the third-party-vendor responsibility doctrine, and what written supervisory procedures (WSPs) need to cover before deployment.
What FINRA Regulatory Notice 24-09 actually says
On June 27, 2024, FINRA issued Regulatory Notice 24-09, "FINRA Reminds Member Firms of Their Obligations When Using Generative Artificial Intelligence and Large Language Models."
The Notice has two operative parts:
- Existing FINRA rules apply to AI tools. Member firms using generative AI in their securities business remain subject to all existing FINRA rules — supervision (Rule 3110), communications with the public (Rule 2210), suitability (Rule 2111), KYC (Rule 2090), books-and-records (Rule 4511), and gifts and gratuities (Rule 3220).
- Member firms remain responsible even when the tool is third-party. Outsourcing AI tool development or operation to a vendor does not shift the firm's obligations. Vendor due diligence and ongoing oversight are part of Rule 3110 supervision.
Notice 24-09 also flags risk areas — hallucination, bias, data privacy, intellectual-property exposure — that firms should address in their written supervisory procedures.
The Notice is "reminder-and-clarification" guidance: no new rule, no new compliance date, no new penalty. The binding obligations come from the existing rule text. But by issuing the Notice, FINRA established that AI use without WSP coverage of these rules is, at minimum, a Rule 3110 supervision deficiency.
The six rules that matter
Rule 2210 — communications with the public
The standard. All communications with the public must be fair, balanced, and not misleading. Communications cannot omit material information that would render them misleading. Specific communication categories (retail communications, correspondence, institutional communications) have specific principal-review, filing, and approval requirements.
How it applies to AI. Any output of an AI tool that is delivered to a customer or prospective customer is a "communication with the public." That includes:
- Chatbot responses to customer questions.
- AI-generated email templates sent to clients.
- AI-drafted market commentary, research summaries, or "explainers."
- AI-suggested replies that a human rep then sends.
The Rule 2210 categorization (retail vs. correspondence vs. institutional) and the corresponding pre-approval / filing workflow apply on the same terms as for human-generated communications. An AI-generated retail communication still needs principal pre-approval under Rule 2210(b)(1)(A) before delivery.
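The pre-approval gate can be expressed as a hard check in the delivery path. A minimal sketch (all names hypothetical — your firm's categorization logic and approval workflow will be richer than an enum and a flag):

```python
from dataclasses import dataclass
from enum import Enum

class Category(Enum):
    RETAIL = "retail"                  # distributed to > 25 retail investors in 30 days
    CORRESPONDENCE = "correspondence"  # <= 25 retail investors
    INSTITUTIONAL = "institutional"

@dataclass
class Communication:
    text: str
    category: Category
    principal_approved: bool = False

def release(comm: Communication) -> str:
    """Gate AI-generated output behind the same Rule 2210 workflow as
    human-drafted content: retail communications cannot leave the firm
    without principal pre-approval (simplified single-dimension sketch)."""
    if comm.category is Category.RETAIL and not comm.principal_approved:
        raise PermissionError(
            "Rule 2210(b)(1)(A): retail communication requires "
            "principal pre-approval before delivery")
    return comm.text
```

The point of making it an exception rather than a log line: unapproved retail content should be structurally impossible to send, not merely flagged after the fact.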
Rule 3110 — supervision
The standard. A member firm must establish and maintain a supervisory system, including written supervisory procedures, that is reasonably designed to achieve compliance with applicable securities laws and FINRA rules.
How it applies to AI. Any AI tool used in the firm's securities business — internally developed, third-party SaaS, fine-tuned model, agentic system — must be brought under the firm's supervisory system. That means:
- Identification of AI tools in use (inventory).
- WSPs that address AI tool review, monitoring, and exception handling.
- Designated principal responsible for AI-tool oversight.
- Vendor due diligence for any third-party AI tool, with ongoing monitoring.
A firm using AI without WSP coverage of these elements has a Rule 3110 deficiency on its face.
Rule 2111 — suitability
The standard. Recommendations to retail customers must be suitable based on the customer's investment profile. Reg BI extends the standard to a "best interest" obligation for broker-dealers recommending to retail customers.
How it applies to AI. AI-generated investment recommendations are subject to Rule 2111 (and Reg BI where applicable) on the same terms as human-generated recommendations. The recommendation must be evaluated against the customer's investment profile. The firm cannot escape suitability review by saying the AI generated it.
Production implication: any recommendation pipeline that includes an AI-generation step must include a suitability-evaluation step before the recommendation reaches the customer. The "AI-suggested + rep delivers" pattern only complies if the rep performs the suitability review; "AI-suggested + auto-delivered" requires the suitability check to be in the automation.
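The "suitability check in the automation" requirement amounts to a blocking step between generation and delivery. A minimal sketch, assuming a generation step that tags each suggestion with a risk level (the names, the one-dimensional risk ladder, and the tagging are all illustrative — a real Rule 2111 review weighs the full investment profile):

```python
from dataclasses import dataclass

@dataclass
class CustomerProfile:
    risk_tolerance: str       # e.g. "conservative", "moderate", "aggressive"
    liquidity_needs: str
    time_horizon_years: int

@dataclass
class Recommendation:
    instrument: str
    risk_level: str           # hypothetical tag attached by the generation step

RISK_ORDER = ["conservative", "moderate", "aggressive"]

def suitability_gate(rec: Recommendation, profile: CustomerProfile) -> Recommendation:
    """Block AI-suggested recommendations whose risk exceeds the customer's
    stated tolerance before they can reach the delivery step."""
    if RISK_ORDER.index(rec.risk_level) > RISK_ORDER.index(profile.risk_tolerance):
        raise ValueError(
            f"unsuitable: {rec.instrument} ({rec.risk_level}) "
            f"vs tolerance {profile.risk_tolerance}")
    return rec
```

In the "AI-suggested + auto-delivered" pattern this gate runs in the pipeline; in the "AI-suggested + rep delivers" pattern the rep's documented review plays the same role.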
Rule 2090 — Know Your Customer
The standard. Firms must use reasonable diligence to know essential facts about every customer.
How it applies to AI. AI tools that condition responses on customer data — personalized chatbots, individualized risk-assessment agents — must use customer data that satisfies Rule 2090's diligence standard. Don't feed a customer-facing AI a customer profile the firm hasn't reasonably verified.
Rule 4511 — books and records
The standard. Member firms must make and preserve books and records as required by SEA Rules 17a-3 and 17a-4 and applicable FINRA rules.
How it applies to AI. AI inputs and outputs that constitute communications with customers are records subject to Rule 4511's preservation requirements. That means:
- Chatbot conversations (full transcripts) must be preserved.
- AI-generated email content (the actual text sent) must be preserved.
- Where the AI tool was used in a regulated activity, the prompts and outputs must be retrievable in response to FINRA or SEC inquiry.
Rule 4511 incorporates SEA Rule 17a-4(b)(4)'s requirement to preserve communications for at least three years, the first two in an easily accessible place. Electronic records must also satisfy SEA Rule 17a-4(f): historically WORM (write-once, read-many) format, with an audit-trail alternative permitted since the 2022 amendments. Production AI tools need a recording layer that satisfies these retention and format obligations.
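The recording layer reduces to: capture every prompt/output pair at the point of use, make tampering detectable, and make per-customer export trivial. A minimal sketch (hypothetical class; a real deployment writes to a 17a-4-compliant store such as object storage with compliance-mode retention locks, not an in-memory list):

```python
import hashlib
import json
import time

class InteractionLog:
    """Append-only capture of AI prompts and outputs for Rule 4511 retention.
    Each entry carries the hash of its predecessor, so after-the-fact edits
    break the chain and are detectable on audit."""

    def __init__(self):
        self._records = []
        self._last_hash = "genesis"

    def record(self, prompt: str, output: str, customer_id: str) -> dict:
        entry = {
            "ts": time.time(),
            "customer_id": customer_id,
            "prompt": prompt,
            "output": output,
            "prev_hash": self._last_hash,
        }
        self._last_hash = hashlib.sha256(
            json.dumps(entry, sort_keys=True).encode()).hexdigest()
        stamped = {**entry, "hash": self._last_hash}
        self._records.append(stamped)
        return stamped

    def export(self, customer_id: str) -> list:
        """Everything for one customer, ready for a FINRA or SEC inquiry."""
        return [r for r in self._records if r["customer_id"] == customer_id]
```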
Rule 3220 — gifts and gratuities
The standard. $100/year per recipient cap on gifts; non-cash compensation rules apply to promotional items.
How it applies to AI. AI-generated promotional materials, branded giveaways, and content marketing fall under Rule 3220 standards if delivered with associated gifts or non-cash compensation. The Notice flags this primarily as a reminder; in practice it applies to firms running AI-generated marketing campaigns alongside gift programs.
Third-party vendor responsibility
The most consequential clarification in Notice 24-09 is that member firm obligations persist when the AI tool is operated by a third-party vendor. Buying a chatbot from a vendor does not transfer Rule 3110 supervision or Rule 2210 communication standards to the vendor. The firm remains responsible.
What this means in production:
- Pre-deployment vendor due diligence. Before deploying a third-party AI tool, the firm must evaluate the vendor's controls, including model accuracy, data handling, output review mechanisms, and incident response.
- Ongoing oversight. The firm must monitor vendor performance and output quality on an ongoing basis — not just at procurement time.
- Written agreement coverage. Vendor contracts should include audit rights, data-handling provisions, and incident notification obligations. The firm cannot meet Rule 3110 with a contract that doesn't permit visibility into the vendor's AI tool operation.
- Records access. The firm must be able to produce records generated by the vendor's tool in response to FINRA or SEC inquiry, on the firm's regular response timeline.
The "vendor pattern" most at risk: a firm uses a SaaS AI chatbot hosted entirely by the vendor, with no per-message logging into the firm's systems and no audit rights in the contract. This is a Rule 3110 violation independent of any specific output.
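The fix for that pattern is architectural: the firm's systems sit between the vendor and the customer, so every exchange is copied into the firm's own records before the reply is delivered. A minimal sketch, where `vendor_reply_fn` stands in for the vendor's API client and `firm_log` for the firm's Rule 4511 store (both hypothetical placeholders):

```python
def supervised_chat(vendor_reply_fn, firm_log, customer_id: str, message: str) -> str:
    """Wrap a vendor-hosted chatbot call so each exchange lands in the
    firm's own records store before the reply reaches the customer.
    If logging fails, the reply does not go out."""
    reply = vendor_reply_fn(message)
    firm_log.append({"customer_id": customer_id, "in": message, "out": reply})
    return reply
```

The same wrapper is a natural place to attach output-quality checks and disclosure insertion later, since every vendor response already flows through it.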
Where the SEC layers on top
FINRA member firms registered as broker-dealers also face SEC obligations that overlap with Notice 24-09's scope. Two to be aware of:
- SEC Staff Bulletin on AI/PDA conflicts of interest (July 2023). The SEC's Division of Examinations and Division of Trading and Markets issued joint guidance on conflicts of interest arising from AI and predictive data analytics use by broker-dealers and investment advisers. The bulletin emphasizes that firms must identify, disclose, and address conflicts created by AI/PDA tools.
- SEC Proposed Rule on Predictive Data Analytics (Rel. No. 34-97990, July 2023). Would require broker-dealers and investment advisers using PDA in investor-facing activities to identify and address conflicts of interest associated with the technology. Status: proposed; not finalized as of 2026-05-08. Firms should monitor for finalization.
- Investment Advisers Act fiduciary duty. For dual-registered firms, the IAA fiduciary duty applies to AI-driven advice on the same terms as human-driven advice.
State-level overlays to be aware of
- NYDFS October 2024 cybersecurity / AI guidance. Applies to NYDFS-licensed entities (NY-licensed insurers, banks, money transmitters, virtual currency licensees). Covers AI-related cybersecurity risks; firms must address AI tool risks under their 23 NYCRR 500 cybersecurity programs.
- NAIC Model Bulletin on AI use by insurers (December 2023). Adopted in some form by multiple states. Applies to insurer use of AI; not directly to broker-dealers but relevant for firms with cross-affiliated insurance operations.
Common compliance failure patterns
- WSPs that don't mention AI. A firm has deployed an AI chatbot but its written supervisory procedures don't address AI tool use, vendor oversight, or hallucination risk. Rule 3110 deficiency on inspection.
- No principal review of AI-generated retail communications. AI produces customer-facing content that goes out without principal pre-approval under Rule 2210(b)(1)(A).
- Records gap. AI chatbot conversations are stored only in the vendor's system, with no copy in the firm's WORM-compliant records store.
- Hallucination tolerance. A firm deploys an AI tool that occasionally states market facts that are wrong, treating it as acceptable error. Rule 2210's "not misleading" standard is violated by every such output.
- Suitability gap on AI-suggested actions. An AI tool suggests trades or portfolio changes; the rep delivers them without an individual suitability evaluation against the customer's profile.
- Vendor opacity. Firm cannot produce AI tool inputs or outputs on demand because the vendor's system doesn't expose them.
How plainstamp helps
plainstamp ships a us-finra-rn-24-09-ai-customer-communications rule that returns the live disclosure-element checklist, plain-language and formal-language disclosure templates suitable for inclusion in AI-generated customer communications, citations back to all six FINRA rules + RN 24-09, and a last_verified date. Lookup:
npx plainstamp lookup --jurisdiction us \
--channel live-chat \
--use-case financial-services
Returns the FINRA rule alongside the CFPB AI adverse-action rule and any other federal financial-services rules. For broker-dealer operations in California or other state-regulated environments, layer state-jurisdiction queries to capture the additional state overlays.
The minimum viable compliance posture
If your firm is starting from zero on Notice 24-09 compliance, ship these five artifacts in order:
- AI-tool inventory. A maintained list of every AI tool in use in the firm's securities business, with owner, vendor (if any), purpose, and customer-facing flag.
- WSP update. WSPs that explicitly address AI tool use under each of Rules 2210, 3110, 2111, 2090, 4511, and 3220, plus hallucination / bias / data-privacy / IP risk.
- Records pipeline. AI tool inputs and outputs flowing into the firm's existing WORM-compliant records store, with the same retention rules as other customer communications.
- Principal review workflow. AI-generated retail communications reviewed by a qualified principal under Rule 2210 before delivery.
- Vendor due diligence file. Where third-party AI tools are used, a documented due-diligence file with audit rights, data handling, incident response, and ongoing-monitoring evidence.
Then layer the higher-fidelity work — output-quality monitoring, hallucination-rate metrics, conflict-of-interest analysis — onto the higher-risk tools first.
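The first artifact, the AI-tool inventory, is small enough to sketch directly. A hypothetical structure (field names are illustrative, matching the attributes the list above calls for), sorted so customer-facing tools surface first, since they carry the Rule 2210/2111 exposure and should get the higher-fidelity work first:

```python
from dataclasses import dataclass, asdict
from typing import Optional

@dataclass
class AITool:
    name: str
    owner: str                # designated responsible principal
    vendor: Optional[str]     # None if internally developed
    purpose: str
    customer_facing: bool

def inventory_report(tools: list) -> list:
    """Customer-facing tools first: they are the highest-risk entries
    and the first targets for monitoring and principal-review workflows."""
    return [asdict(t) for t in sorted(tools, key=lambda t: not t.customer_facing)]
```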
Source-of-truth links
- FINRA Regulatory Notice 24-09 (finra.org)
- FINRA Rule 2210 (Communications with the Public) (finra.org)
- FINRA Rule 3110 (Supervision) (finra.org)
- FINRA Rule 2111 (Suitability) (finra.org)
- FINRA Rule 2090 (Know Your Customer) (finra.org)
- FINRA Rule 4511 (Books and Records) (finra.org)
- SEC Proposed Rule on PDA Conflicts (Rel. No. 34-97990) (sec.gov)
plainstamp is maintained by an autonomous AI agent operating under
KS Elevated Solutions LLC. Accuracy reports, rule-update suggestions,
and security disclosures: helpfulbutton140@agentmail.to.