3 Sources
[1]
Hollywood's A.I. Battle Is Your Battle Too
Every business is now an IP business. Every CEO is now a target to Agent-ify -- like it or not. There's been no shortage of Hollywood A.I. headlines: Scarlett Johansson's voice cloned without permission. Studios scrambling to secure their content before it gets scraped by the next foundation model. Celebrities' likenesses appearing in deepfake videos that spread like wildfire across social platforms. But if you think these challenges are Hollywood's problems, you're dangerously mistaken.

The entertainment industry's A.I. battles are the canary in the coal mine for every business leader who thinks their company operates outside the content economy. Spoiler alert: you don't. Every company is now both a content company and an IP business. Every CEO is now a media brand. And the same forces reshaping Hollywood are already targeting your boardroom, your proprietary data and your quarterly results. Now it's up to you to both avoid the landmines and find the goldmines in the new age of A.I.

Everyone's IP is up for grabs

Consider this: if Google can train Veo 3 on movie trailers and OpenAI can crib from leading publishers, what's stopping powerful A.I. models from ingesting your company's outcomes, business journals, proprietary research and executives' name, image and likeness (NIL)? A lot less than you would hope. Right now, we're witnessing an unprecedented extraction of IP on a global scale. While most people accept that foreign A.I. firms like DeepSeek could strip-mine creative content to fuel their models, domestic competitors and A.I. companies are already eyeing the valuable data exhaust your business generates daily.
Your executive calls, product documentation, customer interactions and strategic communications aren't just internal assets anymore -- they're training data. Just look at the financial industry. Fraudsters have made headlines for using A.I. technology to impersonate leading executives and extract large payments. More broadly, Jamie Dimon's quarterly earnings calls, JPMorgan's research reports and internal risk assessments represent exactly the kind of high-value, domain-specific content that specialized A.I. models crave. These aren't just corporate communications; in the aggregate, they're competitive intelligence worth billions. When an A.I. model can analyze and synthesize decades of your strategic thinking, what's your moat?

Beyond the C-Suite: the hidden vulnerabilities

The risks extend far beyond public executive communications. Pharmaceutical companies are sitting on clinical trial data, regulatory submissions and research methodologies that are potential gold mines for these models. Consumer goods companies have customer insights, market research and brand strategies that competitors would pay millions to access. SaaS companies have usage patterns, customer success playbooks and product development roadmaps that reveal their entire competitive strategy. Yet so many companies don't recognize that their lucrative data is already being harvested. While A.I. discussions are hard to avoid in Hollywood, plenty of industries are sleepwalking into an IP crisis. They're treating A.I. as a productivity tool while their most valuable assets get scraped, trained on and redistributed without their knowledge or consent. It's not just entertainment -- it's every industry.

Playing defense: securing your data footprint

Every CEO needs an immediate A.I. defense strategy that goes beyond basic cybersecurity. This means conducting a comprehensive audit of your data footprint -- not just what you store, but what you share, publish and inadvertently expose.
Your investor presentations, patent filings, conference talks and even job postings are all fair game for A.I. training. Consider your extended digital presence, too. When you partner with influencers or creators, you're not just sharing your brand -- you're potentially feeding A.I. systems. YouTube's terms of service, for example, allow the platform to use content from influencer videos featuring your products or executives for A.I. training purposes (a policy updated in December 2024 to give creators and rights holders the option to opt in or out of allowing third-party companies to use their videos for model training). Start with NIL protection for your executives. If Scarlett Johansson's voice can be cloned, so can your CEO's. Implement content authentication protocols for all external communications. Extend these protocols to influencer partnerships by establishing clear contractual terms about data usage and A.I. training rights. Establish clear data governance policies that specify how your information can and cannot be used for A.I. training. Most importantly, don't rely on legal contracts or new legislation to save the day. The technology exists to monitor and enforce your data usage policies -- you just need to deploy it.

Playing offense: monetizing your data assets

But defense alone isn't enough. The companies that will thrive in the A.I. era are those that recognize the offensive opportunity hiding in their data exhaust. Your customer service transcripts could train industry-specific A.I. assistants. Your decades of market research could power business intelligence models. Your executive insights could become licensed content for leadership development platforms. The key is establishing licensing frameworks before someone else decides your data's value for you. Just as Hollywood studios are now demanding better terms from A.I. companies, every business should explore how to monetize its accumulated data and expertise.
This isn't just about protecting what you have -- it's about recognizing what you've been giving away for free. How many firms could launch A.I.-powered advisory services based on their historical client case studies? How many manufacturers could license their process optimization data to train industrial A.I. systems?

The shifting regulatory landscape

The policy landscape is constantly shifting. Prime Minister Keir Starmer, who wants the U.K. to become an A.I. superpower, has proposed relaxing copyright laws to allow A.I. developers to train their models. A slew of laws will hopefully help hold criminals and negligent platforms accountable for fostering deepfakes. But scattered macro-level protections won't help if you haven't secured your own house first. Companies must proactively fight for favorable licensing terms and protection protocols before the window closes. Once the next generation of A.I. models is trained on your unprotected data, critical leverage is lost.

The new reality

The harsh truth is that every business leader must now think like a media executive. Your company's content, data and IP are valuable assets that require active management and monetization strategies. Your executives' expertise and insights are media properties that need protection and strategic deployment. This isn't a choice -- it's the new reality of doing business in an A.I.-powered economy. The companies that recognize this early and build appropriate defenses and licensing strategies will maintain their optionality and competitive advantages. Those that don't will find their most valuable assets powering their competitors' A.I. systems. Hollywood's A.I. battle is your battle because the forces reshaping entertainment are reshaping everything. The question isn't whether A.I. will transform your industry -- it's whether you'll control that transformation or become its victim.

Dan Neely, CEO of Vermillio, was named to the TIME 100 list of the most influential people in AI.
[2]
Why the Disney AI Lawsuit Will Determine the Future of Studios
A New York Startup Just Threw a Splashy Event to Hail the Future of AI Movies

Those were the two reactions -- seemingly opposite, actually harmonious -- to the news that Disney and Universal had finally bitten the bullet Wednesday and sued an AI company, the startup image-generator Midjourney. It had, after all, been nearly 18 months since The New York Times dropped the first shoe, suing OpenAI and its backer Microsoft in December 2023 over alleged unlawful training of its models on Times journalism. As the months dripped by and the Times lawsuit withstood key court challenges -- "this is just fair use," the AI firms cried -- it made sense that Disney and its studio peers would follow their lead. And then it became uncomfortable when they didn't. What was the other shoe waiting for? Some media companies were cutting deals to license content to AI models -- a Dotdash Meredith here, a Vox Media there. But clearly Disney and the Hollywood crew wouldn't do that -- the stakes were too high, their legal options too varied. So it was with an air of inevitability that Disney and Universal filed, alleging a "bottomless pit of plagiarism" in the Shrek- and Yoda-like creatures that Midjourney spits out because it has been trained on, well, Shrek- and Yoda-like images. "Piracy is piracy," Disney's chief legal officer said in an accompanying statement. And yet, for all the waiting, this was but the first shot -- even an early symbol -- of what will almost certainly be a larger war, with other companies joining up to go after Midjourney, and perhaps Disney and Universal going after other companies. (Midjourney is the least capitalized and weakest of the Gen AI bunch, which is probably why it was targeted.) If 2023 was defined by authors suing AI firms and 2024 was marked by a bunch of media outfits doing the same, 2025 could shape up as the year the studios confront the silicon. Which brings us to 2026. And beyond.
Because this lawsuit isn't your standard intellectual-property dispute. It goes to the heart of what studios are -- and what their owners ultimately want them to be. Let's play this out. There are several ways the lawsuit unfolds. The most obvious is the way of most lawsuits -- with a settlement. In this scenario, Midjourney (and no doubt other AI model operators) pay the studios for their infringement and strike a deal to keep on licensing. (They're never going to yank studio fare from their models -- by the executives' own words, the models would collapse without Big Content.) So AI models keep getting trained on, and spitting out facsimiles of, Hollywood material. Similarly, studios could simply lose. That nets them less money, but it ends in the same place: OpenAI, Google Gemini and the others crank out Hollywood-trained content at will. Then there's the other way: with a studio legal victory. The AI models are deemed prohibited from training on this content -- this "fair use," a judge says, ain't that. In such a scenario we are assured that for the indefinite future, what gets generated in the way of Hollywood images comes from Hollywood and Hollywood alone. What does this lead to? Well, it leads to studios continuing to do what they have always done -- being the main incubators for and generators of so much of the film, television and other entertainment we consume. And what does the first option lead to? Well, it hardly takes an imaginative leap to see where we end up if anyone can go to an AI model and plug in prompts to generate stuff that looks a lot like the movies and television we know. It means the end of studios doing it for us. I know that can seem like a bold statement, but it really isn't. Once content gets automated like that for the masses, there is no need or incentive for studios to do it themselves.
Why would you maintain a whole infrastructure to generate original content - entire deals and offices and hierarchies of development and production - when your audiences can get so much of what they want by going directly to the model? If you thought TikTok creators were challenging studios now, imagine when they can just utter a few words and get the next Yoda. Sure, there'd be some boutique operations to do something new even in such a circumstance; originality gonna originate. But it would be the exception. Plus that stuff would eventually get devoured by the maw too since AI models could just grab it to train on. You see how this game goes. And if this whole vision leads you to say "that doesn't exactly seem like it will produce the next Godfather or Star Wars!" - well no, it won't. Capitalism sends its regrets. It'll lead to some cool stuff, sure; creativity isn't dead. The next MrBeast? Man he's got some tools. And some filmmakers will have some fun; we're already seeing what Harmony Korine and Darren Aronofsky could do with these things. But studios as we know them? Nope. In this scenario, Hollywood studios morph into something else: IP rights managers. They're still here to make money off the property they created. Disney is still running theme parks, for instance, and people are still employed on the lot to work with the AI companies to make sure everything runs smoothly and the checks come in on time. But the idea of a studio as we think of it, as it has existed for a century - the idea of a Dream Factory in any meaningful sense of the term -- it's gone. We've woken up. Disney stops being what Walt Disney founded Disney to be. In fact, you wouldn't really need a lot, come to think of it. I don't think Bob Iger wants to be the guy to do that to his company. That's why I don't see him settling. But the imperatives of the dollar are strong. And if his legal advisors are saying he might lose... 
Plus, some of you cynics out there might say studios have been heading in the IP-management direction for a while now. Incidentally, the studios are in their own interesting position because they want to use AI themselves. They may not be tech companies, eager to rip through content libraries so they can hawk products based on the scavenged. But they're not actors and writers either, trying to protect human endeavor. If they can make the next Avatar at a fraction of the cost? "Sign us up for that AI model!" ("Just as long as you don't feed our stuff into it.") Yeah, the only way these models will be good enough for the studios to use is if they train on the data the studios don't want them to have. Catch-22. A pretty good studio movie, by the way. So here we are, a Los Angeles district court holding the future of Hollywood in its hands. What would you do if you were the judge? What would you do if you were an executive? Take the cash given all these headwinds and pivot your model? Or stand pat and try to preserve the concept of a studio as it's always been constructed, knowing full well you could end up with neither the old way nor the new money? It's a juicy question. And one drama, at least, that ChatGPT couldn't engineer.
Disney and Universal have filed a lawsuit against AI image generator Midjourney, alleging copyright infringement. This legal action could have far-reaching implications for the future of content creation and the role of traditional studios in the entertainment industry.
In a significant move that could shape the future of the entertainment industry, Disney and Universal have filed a lawsuit against Midjourney, an AI image generation company. The studios allege that Midjourney has engaged in a "bottomless pit of plagiarism" by creating images resembling iconic characters like Shrek and Yoda [1].
Source: The Hollywood Reporter
The lawsuit against Midjourney is not just about protecting specific characters or images. It represents a larger battle between traditional content creators and the rapidly evolving AI technology sector. As AI models become more sophisticated in generating content that mimics human-created work, the very foundation of creative industries is being challenged [1].
The resolution of this lawsuit could lead to several scenarios, each with profound implications for the entertainment industry:
Settlement or Studio Loss: If the studios settle or lose the case, it could pave the way for AI models to continue training on and generating content based on copyrighted material. This outcome might lead to a future where anyone can create content that closely resembles popular franchises and characters [1].
Studio Victory: A win for Disney and Universal could prohibit AI models from training on copyrighted content without permission. This would preserve the traditional role of studios as primary content creators and protect their intellectual property [1].
The lawsuit highlights a growing concern across various industries about the protection of intellectual property in the age of AI. Companies in sectors ranging from pharmaceuticals to consumer goods are realizing that their valuable data and proprietary information could be at risk of being scraped and used to train AI models [2].
As the threat of unauthorized AI use of intellectual property grows, companies are being advised to take proactive measures: auditing their full data footprints, protecting executives' name, image and likeness, authenticating external communications, and establishing clear data governance and licensing terms for AI training.
The outcome of this lawsuit could determine whether traditional studios continue to exist as we know them or evolve into something entirely different. In a scenario where AI-generated content becomes prevalent, studios might transform into IP rights managers rather than content creators. This shift would fundamentally alter the century-old concept of Hollywood studios as "Dream Factories" [1].
As the legal battle unfolds, the entertainment industry and beyond are watching closely. The resolution of this case could set a precedent that will shape the future of content creation, intellectual property rights, and the role of AI in creative industries for years to come.