Spotlite has opened a closed beta of a new product aimed at helping creators track where their faces are appearing online and challenge uses they have not approved. The company says the tool is meant to extend its existing focus on transparency in the modelling sector into the fast-growing problem of AI-generated and scraped likenesses, where creators often have little visibility over how their image is being reused.

The launch builds on Spotlite’s pitch to reduce the opacity of the booking and payment process for models, a problem co-founder Benjamin Alexander Hori has said he experienced first-hand during his own career. In the company’s telling, the new product lets users upload an image, run a reverse search and generate a takedown report for material flagged as unauthorised. Spotlite frames the release as part of a broader industry shift, describing the platform as moving from fair-pay tools into legal-tech territory.

The timing is notable because lawmakers have begun moving faster on synthetic media and digital replicas. Spotlite points to New York’s Fashion Workers Act, which took effect in June 2025, along with California and Illinois measures that it says strengthen consent-based protections for performers. At the federal level, the company cites the TAKE IT DOWN Act and the still-pending NO FAKES Act as evidence that regulation is catching up with the technology, though the practical burden of enforcement is still likely to fall heavily on creators and agencies themselves.

Source: Noah Wire Services