New York has moved to tighten rules governing artificial intelligence in consumer-facing media and to extend protections over the commercial use of deceased persons’ likenesses, measures likely to affect national brands and franchise networks that advertise to or operate in the state. According to the governor's office, the AI Transparency in Advertising Act requires clear notice when commercial advertising distributed in New York uses AI-generated “synthetic performers”. The statute takes effect on 9 June 2026, with civil penalties of $1,000 for a first violation and $5,000 for each subsequent violation. (Sources: governor's office, legal briefings).
The new disclosure obligation reaches a wide set of market actors, from national advertisers and creative agencies to local franchisees and influencers producing regionally targeted posts. Industry advisories note that the law covers digitally created, humanlike figures generated or substantially modified by software so as to appear as people performing in commercials, and that exemptions are narrow, applying for example to audio-only messaging and certain expressive works. Businesses that centralise marketing or allow franchisees to localise brand creative will need to ensure disclosures appear wherever an ad is likely to be seen by New York consumers. (Sources: legal analyses, firm advisories).
Franchise systems face particular operational questions: who must flag synthetic-performer use in contracts, how approval workflows will detect AI-generated assets, and which party bears indemnity if a disclosure is omitted. Practitioners recommend updating agency and vendor agreements to require advance notice of synthetic content, adding review steps to ad-approval processes, and clarifying liability allocation between franchisor and franchisee where localised creative is used. Industry counsel argue these steps are prudent ahead of the law’s effective date to avoid regulatory and brand risk. (Sources: law firm guidance, professional commentary).
Separately, New York expanded its right of publicity to address posthumous digital recreations. Effective immediately upon enactment, the Posthumous Right of Publicity Expansion Act requires prior authorisation from an heir, executor or assignee before a deceased person’s name, voice, image or AI-generated likeness can be used commercially. The amendment builds on earlier provisions dealing with digital replicas and aims to close gaps exposed by generative technologies that can synthesise performances or likenesses. (Sources: legal summaries, firm publications).
The practical consequences for brands and franchisors are concrete: licences that historically covered “all media” or legacy footage may not be sufficient for AI-based resurrections, de‑aging or derivative simulations. Advisers urge companies to conduct rights audits, negotiate explicit estate licences addressing AI reproduction, and amend franchise disclosure documents and system policies to assign responsibility for securing publicity rights and to prohibit unauthorised local use. Failure to do so could expose parties to statutory damages, awards of profits, and potential punitive relief. (Sources: legal analyses, firm notes).
Taken together, the statutes reflect a wider state effort to regulate AI beyond technical standards, treating certain commercial AI uses as consumer-protection and publicity issues. Alongside the disclosure and publicity statutes, New York has established oversight structures to study frontier models and their social and labour effects, signalling that the state intends ongoing engagement with AI governance. For franchisors and franchisees, the shift means treating some AI activities as regulated practices that require documented oversight, bespoke contract language and updated compliance programmes. (Sources: governor's announcement, professional commentaries).
For businesses operating multi-state systems, New York's new rules underline the need to harmonise national marketing strategies with state-level mandates: centralised creative may need state-specific disclosures or regional controls, and local operators should be barred from deploying AI-generated performers or deceased-person likenesses without documented approvals. Legal advisers suggest integrating synthetic-performer checks into ad-approval workflows, adding AI-specific clauses to franchise disclosure documents (FDDs) and vendor contracts, and updating training and monitoring to preserve both compliance and consumer trust as AI becomes more embedded in advertising. (Sources: firm advisories, legal newsletters).
Source: Noah Wire Services