On Thursday, October 12, a bipartisan group of senators—Chris Coons (D-Del.), Thom Tillis (R-N.C.), Marsha Blackburn (R-Tenn.), and Amy Klobuchar (D-Minn.)—released a discussion draft of the Nurture Originals, Foster Art, and Keep Entertainment Safe Act (the "NO FAKES" Act), which would protect the voice, image, and visual likeness of all individuals from unauthorized AI-generated digital replicas, also referred to as "deepfakes." While the draft bill focuses on protections made necessary by the advancement of AI, it would also establish the first federal right of publicity—the right to protect and control the use of one's voice, image, and visual likeness. The NO FAKES Act could have widespread impacts on the entertainment and media industries, among others.

Generative AI has opened new worlds of creative opportunity, but with that innovation also comes the ability to exploit another's voice, image, or visual likeness by creating nearly indistinguishable digital replicas. This has caused great concern among musicians, celebrities, actors, and politicians as viral AI-created deepfakes circulate on social media and the Internet more broadly. To date, advancements in the AI technology used to create digital replicas have outpaced the legal framework governing unauthorized use. Although existing laws may be used to combat digital replicas, those laws either vary from state to state, creating a patchwork of differing protections based on where one is located, or do not directly address the harms caused by producing and distributing unauthorized digital replicas.

The proposed legislation would create a new digital replication right that squarely addresses this exploitation by allowing all individuals to authorize the use of their image, voice, or visual likeness in a digital replica. The new property right would be exclusive to the rights holder and freely licensable and descendible; it would survive the individual's death and continue to apply for 70 years post-mortem.

Under this draft bill, individuals and companies would be liable if they produce a digital replica without prior consent. Individuals, platforms, and Internet service providers would also be liable if they host or share an unauthorized digital replica with knowledge that it was not authorized. Importantly, the digital replication right created by the NO FAKES Act is an intellectual property right, meaning that platforms that post or share unauthorized digital replicas would not be protected under Section 230 of the Communications Decency Act (which currently shields platforms from liability for content posted by users of their sites, subject to certain exceptions). Additionally, including a disclaimer that the digital replica was unauthorized is not a defense; nor is the fact that an entity did not participate in the creation or sharing of the digital replica. Bottom line, as the draft bill currently stands: If you create or post an unauthorized digital replica, you could face liability.

In an obvious nod to the concerns of media and First Amendment proponents, the discussion draft exempts certain uses from liability, including use as part of (i) a news, public affairs, or sports broadcast; (ii) a documentary, historical, or biographical work; (iii) criticism, satire, parody, or comment; and (iv) advertisements for the above-listed uses.

As the draft bill is at an early stage, much could change before any law is enacted. And given the stakes, lawmakers should expect strong feedback from several industries. Will the digital replication right, and first federal right of publicity, come to fruition? Stay tuned.

For more information or assistance, contact Katie Wright Morrone.