2 Sources
[1]
George Clooney, Tom Hanks, and Meryl Streep back new 'Human Consent Standard' for AI licensing
Hollywood actors and producers are standing behind a new AI licensing standard that will tell AI systems whether they'll need to pay to use a person's likeness, creative work, characters, and designs. With the Human Consent Standard, people can set terms for the use of their work or likeness, including giving AI systems full permission to use their content, allowing access with certain requirements, or restricting access entirely.

The Human Consent Standard builds upon the Really Simple Licensing (RSL) Standard, which launched last year as a way for websites to signal how AI systems can use their work. RSL Media, a nonprofit cofounded by Cate Blanchett, is overseeing the Human Consent Standard. The newly launched standard is backed by talent such as George Clooney, Viola Davis, Tom Hanks, Kristen Stewart, Steven Soderbergh, and Meryl Streep, along with organizations like the Creative Artists Agency and the Music Artists Coalition.

In an email to The Verge, RSL Media cofounder Eckart Walther says that, similar to the RSL Standard, AI systems can discover the Human Consent Standard through a website's robots.txt page, which tells web and AI crawlers whether they can scrape its content. But Walther says that while the "RSL usually applies to content at a specific URL," the Human Consent Standard "applies to the underlying work, identity, character, or mark itself, wherever it appears." AI systems will check this declaration against a registry launching in June, which will allow people to verify their identity and set permissions for the usage of their likeness and creative works. From there, RSL Media will "translate" these terms into signals that AI systems can read. "The purpose of the Registry is to give people and rights holders a trusted place to publish those declarations, so responsible AI systems can check whether a work, likeness, voice, character, or brand is allowed, prohibited, or requires permission," Walther says.
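The discovery step described above can be sketched in a few lines. The RSL convention is a `License:` directive in robots.txt pointing at a machine-readable license file; the snippet below is an illustrative sketch of how a crawler might pull that directive out, not a full robots.txt parser, and the example URL is invented:

```python
# Illustrative sketch: extract RSL-style "License:" directives from a
# robots.txt body before crawling. A real crawler would also honor
# User-agent / Disallow rules and then fetch and evaluate the license file.

def find_license_urls(robots_txt: str) -> list[str]:
    """Return any License: directive values found in a robots.txt body."""
    urls = []
    for line in robots_txt.splitlines():
        line = line.split("#", 1)[0].strip()   # drop trailing comments
        if line.lower().startswith("license:"):
            urls.append(line.split(":", 1)[1].strip())
    return urls

example = """\
User-agent: *
Disallow: /private/
# Machine-readable licensing terms for this site's content
License: https://example.com/license.xml
"""

print(find_license_urls(example))  # ['https://example.com/license.xml']
```

The Human Consent Standard layers on top of this: rather than terms for one URL, the license data would point at declarations covering a work or identity wherever it appears.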
Some artists and actors have already taken steps to combat the unauthorized use of their likenesses. Matthew McConaughey trademarked clips of himself, while Taylor Swift applied for a trademark of a photo of herself and two soundbites, where she says, "Hey, it's Taylor" and "Hey, it's Taylor Swift." With the launch of RSL Media, everyone can attempt to set permissions for their work. "RSL Media is a simple, effective and free solutions-based technology for facilitating and activating consent," Blanchett says in the press release. "It's also the industry's first practical solution where people everywhere, not just public figures, can assert control over how their work is used by AI."
[2]
Actors' new spec aims to defeat attack of the AI clones
RSL Media expands machine-readable licensing rules to cover AI use of identities and creative works

AI models can take your written work, they can take your voice, and they can even take your likeness, using them as training material and to create content that looks exactly like it came from you. Now, some actors are promoting a new licensing spec designed to protect their famous faces, and yours too.

The public-benefit nonprofit behind the Really Simple Licensing (RSL) standard has expanded the project with the draft RSL Media Human Consent Standard (RSL-MEDIA) 1.0, which aims to cover creative works as well as people's names, likenesses, voices, and other identity attributes. The initial launch allows people to sign up and reserve an identifier that will serve as a key to structured data entered into the RSL Media public registry, scheduled to launch next month. The registry will allow people to verify their identities, set permissions governing the use of their works and likenesses, encode those permissions for machine consumption, and verify that AI systems are checking declared permissions.

Whether there will be any legal consequences for AI services that ignore registry settings remains to be seen. The data broker industry in the US hasn't exactly suffered due to the notional existence of "privacy rights." And public concern about non-consensual AI nudification and explicit deepfakes hasn't put an end to that form of technological abuse or punished the social media sites distributing it.

But this time, Hollywood has shown up. "AI technologies are expanding rampantly, essentially unchecked and unregulated," said celebrated actress and RSL Media co-founder Cate Blanchett, in a statement. "In order for humans to remain in front of these technologies, consent must be the first consideration. RSL Media is a simple, effective and free solutions-based technology for facilitating and activating consent.
It's also the industry's first practical solution where people everywhere, not just public figures, can assert control over how their work is used by AI."

Nikki Hexum, co-founder and CEO of RSL Media, said, "AI can't respect rights it can't see, and this means human consent is virtually invisible in this new digital era. The right to decide whether AI can use your work or identity should not be reserved for only those who can afford lawyers or have platforms big enough to be heard, it is a basic human right."

That's not entirely correct. Rights do not need to be seen to be respected; due diligence prior to using material that may be copyrighted is expected. Ignorance of copyright does not excuse infringement, even if it might mitigate potential liability.

AI model makers could have chosen to respect rights by default, by seeking permission to use data for training. They could have chosen to seek permission to crawl websites and could have heeded existing signals to crawlers like the Robots Exclusion Protocol. They could have chosen to abide by the requirements of open source software licenses in harvested code. They did not do so, because Silicon Valley prefers to ask forgiveness rather than seek permission. Permission is expensive; there wouldn't be much of an AI industry if that were the norm. The law may be one of the things broken by those applying Meta's shelved mantra "move fast and break things." So far, industry disinterest in seeking permission has worked well: AI companies have been held to account in only a few of the hundred-plus lawsuits objecting to AI content capture.

The underlying RSL standard is slowly gaining adoption. The RSL Collective says more than 1,500 media organizations, brands, technology companies, and standards groups now support it following the launch of RSL 1.0 last December, and the relevant RSL XML file can be seen at sites like The Guardian.
While it's unclear what impact the RSL has had on AI biz behavior, extending the RSL to cover personal identity with the RSL-MEDIA standard may stir broader interest in AI rules and their enforcement. Or it may just affirm the XKCD comic about specifications and how they proliferate. There are already several similar protocols: TDM AI and TDMRep, Spawning's ai.txt, AI Preferences, not to mention a few that focus solely on images and commercial offerings like Cloudflare's Pay per crawl. But RSL Media may have a leg up thanks to the involvement of high-profile celebrities like Blanchett and endorsements from similarly well-known peers. "Of course artists and cultural creatives will inevitably be involved with AI," said Dame Emma Thompson in a statement. "At the moment, however, AI is merely stealing from us all. This is an urgent and essential initiative. It's also eminently doable, so let's do it without delay." ®
Hollywood's biggest names are backing a new standard that lets people control how AI systems use their likeness and creative work. The Human Consent Standard, overseen by nonprofit RSL Media, builds on Really Simple Licensing to give everyone—not just celebrities—the power to set permissions for AI training. A public registry launches in June to verify identities and translate consent into machine-readable signals.
A coalition of Hollywood's most prominent actors and producers is backing the Human Consent Standard, a new AI licensing framework designed to give people control over how AI systems use their likeness, creative work, characters, and designs [1]. George Clooney, Tom Hanks, Meryl Streep, Viola Davis, Kristen Stewart, and Steven Soderbergh have all endorsed the initiative, alongside organizations like the Creative Artists Agency and the Music Artists Coalition [1]. The standard allows individuals to set specific terms for their content: granting full permission, allowing access with requirements, or restricting access entirely to protect their identities from unauthorized use by AI models [1].
Source: The Verge
The Human Consent Standard builds upon the Really Simple Licensing (RSL) Standard, which launched last year to help websites signal how AI systems can use their content [1]. RSL Media, a nonprofit cofounded by Cate Blanchett, is overseeing this expansion through the draft RSL Media Human Consent Standard (RSL-MEDIA) 1.0. Unlike the original RSL Standard, which applies to content at specific URLs, the Human Consent Standard applies to the underlying work, identity, character, or mark itself, wherever it appears [1]. AI systems can discover these permissions through a website's robots.txt page, which tells web and AI crawlers whether they can scrape content [1].

A public registry scheduled to launch in June will serve as the backbone for consent-based AI training practices. The registry will allow people to verify their identity and set permissions for the usage of their likeness, voice, and creative work [1]. RSL Media will translate these terms into machine-readable signals that AI systems can check before using someone's work or identity [1]. According to RSL Media cofounder Eckart Walther, the purpose is to give people and rights holders a trusted place to publish declarations, so responsible AI systems can determine whether a work, likeness, voice, character, or brand is allowed, prohibited, or requires permission [1].
Source: The Register
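The allowed/prohibited/permission-required decision a responsible crawler would make against the registry can be sketched as follows. The registry has not yet launched and its API is unpublished, so the identifiers, field names, and lookup function here are entirely hypothetical; only the decision logic is the point:

```python
# Hypothetical sketch of a registry permission check. An in-memory dict
# stands in for the (not-yet-launched) RSL Media registry; the identifier
# scheme and status values are invented for illustration.

from enum import Enum

class Consent(Enum):
    ALLOWED = "allowed"
    PROHIBITED = "prohibited"
    PERMISSION_REQUIRED = "permission_required"

# Stand-in for the public registry of declarations.
FAKE_REGISTRY = {
    "likeness:jane-doe": Consent.PROHIBITED,
    "work:jane-doe/novel-1": Consent.PERMISSION_REQUIRED,
}

def check_consent(identifier: str) -> Consent:
    """Look up a declared permission. Unknown identifiers fall back to
    requiring permission rather than defaulting to allowed; that
    conservative default is a design choice of this sketch, not the spec."""
    return FAKE_REGISTRY.get(identifier, Consent.PERMISSION_REQUIRED)

def may_train_on(identifier: str) -> bool:
    """True only when use is explicitly allowed."""
    return check_consent(identifier) is Consent.ALLOWED

print(may_train_on("likeness:jane-doe"))    # False
print(check_consent("work:unknown").value)  # permission_required
```

Whatever shape the real registry takes, the enforcement question the articles raise is exactly whether AI systems will bother to run a check like this at all.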
The initiative comes as AI models increasingly scrape written work, voices, and likenesses for training material without permission. Some artists have already taken individual steps: Matthew McConaughey trademarked clips of himself, while Taylor Swift applied for a trademark of a photo and soundbites [1]. But the Human Consent Standard aims to democratize this protection. "The right to decide whether AI can use your work or identity should not be reserved for only those who can afford lawyers or have platforms big enough to be heard, it is a basic human right," said Nikki Hexum, co-founder and CEO of RSL Media. Blanchett emphasized that the standard is "the industry's first practical solution where people everywhere, not just public figures, can assert control over how their work is used by AI" [1].

Whether AI companies will respect the Human Consent Standard remains to be seen, as the tech industry has historically preferred to ask forgiveness rather than seek permission. The underlying RSL standard has gained adoption from more than 1,500 media organizations, brands, and technology companies since RSL 1.0 launched last December. However, the standard enters a crowded field that includes TDM AI, TDMRep, Spawning's ai.txt, and Cloudflare's Pay per crawl. Emma Thompson framed the urgency bluntly: "At the moment, however, AI is merely stealing from us all. This is an urgent and essential initiative." The involvement of high-profile celebrities may give RSL Media an advantage in drawing attention to consent requirements, though legal consequences for AI services that ignore registry settings remain unclear.
Summarized by Navi
22 Jan 2026•Entertainment and Society
