


[1]

Jake Paul's Deepfake Gambit Sparks Debate over Sora Cameos and Digital Likeness Rights
Jake Paul is all over the Internet. Paul -- social media presence turned actor and professional boxer -- made history last year when he squared off with boxing legend Mike Tyson in the most streamed sporting event ever. Now he's back in the limelight, with even more eyes on him. A wave of viral videos posted last week shows him giving makeup tutorials, shoplifting from Taco Bell and holding up a 7-Eleven. But none of these videos are real -- they're deepfakes minted on OpenAI's Sora app.

The software, which launched September 30, uses artificial intelligence to generate videos and allows people to upload their image as a "cameo" that others can use. Paul willingly uploaded his cameo for others to use, and then on October 8 he posted a TikTok video in which he threatened to sue anyone spreading deepfakes of him doing things he would never do. As he said this, he began clumsily applying makeup -- a gag, because this is what many of the deepfakes portrayed him doing. The next day he announced on X that he was a "proud OpenAI investor" and "the first celebrity NIL cameo user" (NIL is the acronym for name, image and likeness) and that the videos generated with his likeness had, in just six days, received more than a billion views.

Such endeavors may be the start of a new digital economy for deepfakes -- and Sora may be the key driver. The "cameo" feature, which at first seemed a fun add-on to Sora's abilities, now appears to be one of the main attractions, and OpenAI's CEO has announced plans to monetize it. The development presents opportunities for some but big risks for others.

Depending on how the system is implemented, cameo owners could set terms and cost, which could give some control back to those whose images have been used without their consent. Anyone could share their NIL and, in effect, put their digital double in the equivalent of a stock photo library. Other app users could license these likenesses in small, trackable uses, adhering to rules set by rights holders (no nudity, for example) and paying them for each use. Deepfakes, which until now have largely been used to defame or extort people, would at least be generating royalties for their subjects.

Of course, with the technology moving faster than many users realize -- and certainly faster than regulators can keep up with -- the risks are substantial. A person could consent to be duplicated and still be harmed by deepfakes that are selectively edited or made with malicious prompts. Their image could easily be stolen and used elsewhere to defraud or misrepresent, and exposed biometric data can't simply be reset like a leaked password. Establishing a market for NILs doesn't resolve the potential harms, but it may create a commercial incentive to prevent them through regulation to maintain the integrity of the marketplace. And in some cases, using a person's NIL will be impossible for moral reasons, not just legal ones. In the first week of Sora's launch, families protested deepfakes of the dead, and OpenAI signaled that it would add tools to honor those requests.

And a deepfake economy will likely change things in ways no one can foresee. Music streamers such as Spotify and SoundCloud transformed the music industry, changing how songs are shared and even designed. For instance, they moved musicians toward recording shorter songs that listeners are less likely to skip. Will something similar happen with commodified deepfakes? A marketplace may evolve where people's images are sold or traded or even adjusted depending on how in demand they become. Would face image values rise and drop with the fluctuations of one's popularity? It sounds dystopian, but we already live in a culture where image and attention are monetized, and that trend appears likely to keep evolving.

A number of celebrities have already begun exploring commercial deepfakes. In 2023 the musician Grimes offered a 50-50 royalty split to anyone who used her AI voice to create a "successful" song. YouTube's Dream Track project lets creators make soundtracks with the AI singing voices of Charlie Puth, Demi Lovato and John Legend, among others. Deepfakes of sports stars David Beckham and Peyton Manning have appeared in commercials, and musician FKA twigs created a deepfake to handle her social media interactions while she focuses on making music.

Which brings us back to the logic of Paul's decision to let people make videos with his cameo on Sora: if attention is the scarce prized good in a saturated digital market, then letting the world generate you on demand gets you more eyes and increases your value. He is both asset and architect, earning attention today so he can be paid in royalties tomorrow.
[2]

Jake Paul Invites Users to Fake Him on Sora, So They Immediately Use It to Make Him Gay and Obsessed With Makeup
OpenAI's recent launch of its text-to-video AI generator app, Sora 2, has already led to an enormous tidal wave of AI slop hitting internet feeds. The company made the eyebrow-raising decision of putting deepfakes front and center of the app's TikTok-like experience, inviting users to offer themselves up for "cameos" created by other users.

"With cameos, you can drop yourself straight into any Sora scene with remarkable fidelity after a short one-time video-and-audio recording in the app to verify your identity and capture your likeness," the company boasted in its announcement.

And while not everybody was willing to become the butt of the joke like OpenAI CEO Sam Altman -- whose visage quickly adorned AI-generated and unnerving CCTV footage of him shoplifting or his head popping out of a toilet -- some internet personalities were willing to throw themselves at the mercy of meme lords everywhere.

Influencer Jake Paul seemingly was one of them, and the results were as outlandish as you might expect. Apparently inspired by a deepfake video of him passionately kissing upcoming UFC opponent Gervonta Davis last month, users quickly started sharing clips of him coming out of the closet and giving makeup tutorials.

Paul took the clips in stride, initially putting on a grave voice and decrying that "this AI is getting out of hand," only to buy into the trend by acting camp in a response video posted to TikTok on Monday. But his girlfriend, Dutch professional speed skater Jutta Leerdam, wasn't impressed. "I don't like it, it's not funny!" she told him in a video. "People believe it."

Problematic undercurrents of homophobia aside, the trend paints a troubling picture of a future filled with photorealistic and eerily believable AI slop. The Sora app is only the latest demonstration that the tech is continuing to blur the lines between reality and a synthetic parallel universe dreamed up by generative AI.

While Paul is basking in the limelight, many netizens have watched in horror as the internet continues to be overtaken by lowbrow slop. "Please, just stop sending me AI videos of Dad," Zelda Williams, daughter of the late Hollywood comedy icon Robin Williams, wrote in a recent Instagram Stories post. "Stop believing I wanna see it or that I'll understand, I don't and I won't." "AI is just badly recycling and regurgitating the past to be reconsumed," she added. "You are taking in the Human Centipede of content, and from the very, very end of the line, all while the folks at the front laugh and laugh, consume and consume."

Other users are finding that their faces are being used even without having opted in the way Paul did, suggesting OpenAI's "cameos" feature isn't nearly as safe as the company makes it out to be. "It is scary to think what AI is doing to feed my stalker's delusions," journalist Taylor Lorenz tweeted, revealing that her "psychotic stalker" was using Sora to generate videos of her.

Sora has also fanned the flames of a heated debate surrounding the unauthorized use of copyrighted materials, with users generating copious clips of SpongeBob SquarePants and "South Park" characters. OpenAI eventually opted to come down hard on the trend, implementing guardrails that users now say make the app "completely boring and useless."

In the meantime, Paul's own position seems a little ambivalent. "I've had it with the AI stuff," he said in a Wednesday video. "It's affecting my relationships, businesses." "It's really affecting things, and people really need to get a life," he added -- while haphazardly applying foundation to his cheeks using a makeup brush.
[3]

Jake Paul Opts In to Sora 2, and Chaos Follows
As most of mainstream Hollywood looks to opt out of OpenAI's new AI video app Sora, one content creator and boxer has clearly opted in and appears to be enjoying the free publicity.

Since Sora 2 -- an app that allows users to generate hyperrealistic clips of not only themselves but also other permitted users -- launched at the beginning of October, videos have quickly begun to flood social media, specifically TikTok. And there's one face that is notably being used the most: Jake Paul.

In recent days, it's been quite difficult to scroll on the popular app and not see an AI video of Paul -- from him causing a scene on an airplane to being confronted by police over a hit-and-run. But the most common theme has been videos of the boxer as if he were a gay man who loves fashion and makeup. (The real Paul is straight and currently engaged to Olympic speed skater Jutta Leerdam.)

Some people may be annoyed with thousands of fake videos of themselves flooding the internet, but Paul seems to be amused by it. I mean, he did have to opt in to have his likeness used through Sora. The youngest of the Paul brothers (Logan Paul is Jake's older brother) has already taken to his personal social media to respond to the abundance of AI videos, and has a good sense of humor about it all.

He posted a TikTok video on Wednesday, saying in a serious tone, "I've had it with the AI stuff. It's affecting my relationship, businesses. People are hitting me up saying, 'Yo, did you say this? What did you do this for? I can't believe you did this?' It's really affecting things, and honestly, it's like people need to get a life, so it's kind of pissing me off. I'm gonna be suing everybody that is continuing to spread these false narratives of me doing shit that I would literally never, ever do. So be ready for lawsuits." However, as he's saying all of this, he's actually putting on makeup, just like in some of the AI videos.

Paul also shared a Sora-created video on his Instagram Story of him appearing to have a meltdown at a Starbucks after they got his order wrong. He jokingly wrote on the post, "Surprised someone got this on camera this morning -- what happened to privacy?" The Hollywood Reporter has reached out to Paul's rep for comment.

Though Paul is enjoying the new AI video app (and is likely hoping to benefit from the first-user advantage when it comes to new social networks), there are plenty of others not exactly thrilled with the technology, especially in Hollywood. Major studio executives and talent agency chiefs have already raised concerns over Sora 2 and how their intellectual property or likenesses are being used on the app. Charles Rivkin, CEO of the Motion Picture Association, recently called on OpenAI to "prevent infringement" of "our members' films, shows, and characters." WME's head of digital strategy, Chris Jacquemin, has also said the agency was opting all of its clients out of the latest update of the video tool. However, OpenAI CEO Sam Altman has promised "to give rightsholders more granular control" over their IP.

Robin Williams' daughter, Zelda Williams, also recently slammed people who are making AI-generated videos of her late father, calling the clips "gross." She added, "To watch the legacies of real people be condensed down to 'this vaguely looks and sounds like them so that's enough', just so other people can churn out horrible TikTok slop puppeteering them is maddening. You're not making art, you're making disgusting, over-processed hotdogs out of the lives of human beings, out of the history of art and music, and then shoving them down someone else's throat hoping they'll give you a little thumbs up and like it."
Jake Paul's participation in OpenAI's Sora 2 app ignites discussions on digital likeness rights, the potential for a deepfake economy, and the ethical implications of AI-generated content.
Jake Paul, the social media personality turned actor and boxer, has made headlines once again by becoming the first celebrity to participate in OpenAI's Sora 2 app's 'cameo' feature. This move has sparked a viral trend of AI-generated videos featuring Paul in various outlandish scenarios, from makeup tutorials to shoplifting [1][2].
Paul's willingness to have his likeness used in these AI-generated videos has led to an explosion of content, with the influencer claiming that videos featuring his cameo received over a billion views in just six days [1]. While Paul initially appeared to embrace the trend, even creating response videos that played into the memes, he later expressed frustration with the AI-generated content's impact on his relationships and businesses [3].
Paul's participation in Sora 2 has ignited discussions about the potential for a new digital economy centered around deepfakes. OpenAI's CEO has announced plans to monetize the 'cameo' feature, which could allow individuals to set terms and costs for the use of their digital likeness [1].

This development presents both opportunities and risks. On one hand, it could give individuals more control over their digital image and potentially generate royalties from its use. On the other hand, it raises concerns about consent, misuse, and the potential for fraud or misrepresentation [1].

The rapid advancement of AI-generated video technology has outpaced regulatory efforts, leading to calls for better safeguards and clearer guidelines. Major studio executives and talent agency chiefs have raised concerns about intellectual property rights and the unauthorized use of likenesses [3]. Charles Rivkin, CEO of the Motion Picture Association, has called on OpenAI to prevent infringement of copyrighted materials, while WME's head of digital strategy, Chris Jacquemin, announced that the agency was opting all of its clients out of the latest update of the video tool [3].

While some, like Jake Paul, may see potential benefits in the technology, others have expressed serious concerns. Journalist Taylor Lorenz revealed that her stalker was using Sora to generate videos of her, highlighting the potential for abuse [2]. Zelda Williams, daughter of the late Robin Williams, also spoke out against AI-generated videos of her father, calling them 'gross' and emphasizing the emotional toll such content can take on families of deceased celebrities [3].

As the technology continues to evolve, questions remain about how the marketplace for digital likenesses will develop. Will face image values fluctuate based on popularity? How will this impact the entertainment industry and personal branding? While these questions remain unanswered, it's clear that the rise of AI-generated content is reshaping our understanding of digital identity and intellectual property in the age of artificial intelligence.
Summarized by Navi