5 Sources
[1]
Longtime NPR host David Greene sues Google over NotebookLM voice | TechCrunch
David Greene, the longtime host of NPR's "Morning Edition," is suing Google, alleging that the male podcast voice in the company's NotebookLM tool is based on Greene, according to The Washington Post. Greene said that after friends, family members, and coworkers began emailing him about the resemblance, he became convinced that the voice was replicating his cadence, intonation, and use of filler words like "uh." "My voice is, like, the most important part of who I am," said Greene, who currently hosts the KCRW show "Left, Right, & Center." Among other features, Google's NotebookLM allows users to generate a podcast with AI hosts. A company spokesperson told the Post that the voice used in this product is unrelated to Greene's: "The sound of the male voice in NotebookLM's Audio Overviews is based on a paid professional actor Google hired." This isn't the first dispute over AI voices resembling real people. In one notable example, OpenAI removed a ChatGPT voice after actress Scarlett Johansson complained that it was an imitation of her own.
[2]
NPR's David Greene is suing Google over its AI podcast voice.
The former host of Morning Edition and current host of Left, Right & Center claims that Google illegally replicated his voice for its male podcast host in NotebookLM. Google denies this, but Greene (and many of his friends and colleagues) say the resemblance is "uncanny." As the Washington Post reports: To Greene, the resemblance of the AI voice to his own is uncanny -- and the harm is deeper and more personal than just a missed chance to capitalize on his most recognizable asset. "My voice is, like, the most important part of who I am," Greene said.
[3]
He spent decades perfecting his voice. Now he says Google stole it.
[Photo caption: In his lawsuit, David Greene alleges Google violated his rights by building a product that replicated his voice without payment or permission. (Stephen Voss)]

David Greene had never heard of NotebookLM, Google's buzzy artificial intelligence tool that spins up podcasts on demand, until a former colleague emailed him to ask if he'd lent it his voice. "So... I'm probably the 148th person to ask this, but did you license your voice to Google?" the former co-worker asked in a fall 2024 email. "It sounds very much like you!"

Greene, a public radio veteran who has hosted NPR's "Morning Edition" and KCRW's political podcast "Left, Right & Center," looked up the tool, listening to the two virtual co-hosts -- one male and one female -- engage in light banter. "I was, like, completely freaked out," Greene said. "It's this eerie moment where you feel like you're listening to yourself."

Greene felt the male voice sounded just like him -- from the cadence and intonation to the occasional "uhhs" and "likes" that Greene had worked over the years to minimize but never eliminated. He said he played it for his wife and her eyes popped. As emails and texts rolled in from friends, family members and co-workers, asking if the AI podcast voice was his, Greene became convinced he'd been ripped off. Now he's suing Google, alleging that it violated his rights by building a product that replicated his voice without payment or permission, giving users the power to make it say things Greene would never say.

Google told The Washington Post in a statement on Thursday that NotebookLM's male podcast voice has nothing to do with Greene. Now a Santa Clara County, California, court may be asked to determine whether the resemblance is uncanny enough that ordinary people hearing the voice would assume it's his -- and if so, what to do about it.

The case is the latest to pit the rights of individual human creators against those of a booming AI industry that promises to transform the economy by allowing people to generate uncannily lifelike speech, prose, images and videos on demand. Behind the artificial voices in NotebookLM and similar tools are language models trained on vast libraries of writing and speech by real humans who were never told their words and voices would be used in that way -- raising profound questions of copyright and ownership.

From political "voicefakes" to OpenAI touting a female voice for ChatGPT that resembled that of actress Scarlett Johansson, to deepfake scam ads that had a virtual Taylor Swift hawking Le Creuset cookware, the issues raised by Greene's lawsuit are "going to come up a lot," said James Grimmelmann, a professor of digital and information law at Cornell University. A key question for the courts to decide, Grimmelmann said, will be just how closely an AI voice or likeness has to resemble the genuine article in order to count as infringing. Another will be whether Greene's voice is famous enough for ordinary people to recognize it when they listen to NotebookLM and whether he's harmed by the resemblance. Those can be thorny questions when it comes to AI voices. There are software tools that can compare people's voices, but they're more commonly used to find or rule out an exact match between the voices of real humans, rather than a synthetic one.

To Greene, the resemblance of the AI voice to his own is uncanny -- and the harm is deeper and more personal than just a missed chance to capitalize on his most recognizable asset. "My voice is, like, the most important part of who I am," Greene said.

"These allegations are baseless," Google spokesperson José Castañeda said. "The sound of the male voice in NotebookLM's Audio Overviews is based on a paid professional actor Google hired."

Greene's lawyer argues the recordings make the resemblance clear. "We have faith in the court and encourage people to listen to the example audio themselves," said Joshua Michelangelo Stein, a partner at the firm Boies Schiller Flexner, which is also representing book authors in a high-profile AI copyright lawsuit against Meta.

[Embedded audio: a clip of David Greene's voice followed by a clip of the Google AI voice.]

NotebookLM's "Audio Overviews" feature made a splash on its 2024 release with AI enthusiasts who shared examples of using it to summarize long documents, replacing dozens of pages of text with a breezy podcast that highlighted the main points. While Google hasn't disclosed how many people use the tool, it emerged as a sleeper hit for the search giant in its race with rivals such as ChatGPT maker OpenAI to capture consumers' imagination. In December 2024, the streaming music leader Spotify used the tool as part of its signature "Spotify Wrapped" feature, offering each user a personalized podcast about their listening habits.

Online, users have ventured numerous guesses as to who the AI podcasters' voices most resemble. Several have named Greene, but others have mentioned former tech podcaster Leo Laporte or the comedy podcast "Armchair Expert" co-hosted by Dax Shepard and Monica Padman.

As a kid growing up in Pittsburgh, Greene idolized Lanny Frattare, the longtime voice of the city's professional baseball team. "I would sit at Pittsburgh Pirates games and act like I was the play-by-play announcer," he recalled. By high school, he and two friends were doing his school's morning announcements, which they turned into a sort of radio show. He wrote a college application essay about his dream of one day becoming a public radio host -- an essay his mom dug up and sent to him when he landed his first job at NPR in 2005.

There, Greene was mentored by Don Gonyea, NPR's longtime national political correspondent. He learned tricks of the trade, like pretending he was addressing a friend in the room, rather than a distant mass audience, so that his voice would sound conversational rather than "broadcastery." Feedback from listeners and interview subjects told Greene his warm baritone had the power to soothe and convey trust and empathy. On "Morning Edition," his was the voice that some 13 million listeners woke up to from 2012 to 2020, according to NPR, making it the most popular news radio show in America. On "Left, Right & Center," he plays the moderate seeking common ground between pundits from the left and right.

"I truly believe that conversations have the power to change our lives and change the world," Greene said. "One of the reasons we're in such a polarized environment right now is because people are forgetting the power of talking to one another."

That's what makes the feeling that Google has appropriated his voice and turned it into a robot so galling to Greene. "I read an article in the Guardian about how this podcast tool can be used to spread conspiracy theories and lend credibility to the nastier stuff in our society," he said. "For something that sounds like me to be used in service of that was really troubling."

Greene's lawsuit, filed last month in Santa Clara County Superior Court, alleges but does not offer proof that Google trained NotebookLM on his voice. The complaint cites an unnamed AI forensic firm that used its software to compare the artificial voice to Greene's. The tool gave a confidence rating of 53 percent to 60 percent that Greene's voice was used to train the model, which it considers "relatively high" confidence for a comparison between a real person's voice and an artificial one. (A confidence score above zero means the voices are similar, while one below zero indicates they're probably different.)

Grimmelmann said Greene doesn't necessarily have to show definitively that Google trained NotebookLM on his voice to have a case, or even that the voice is 100 percent identical to his. He cited a 1988 case in which the singer and actress Bette Midler successfully sued Ford Motor Company over a commercial that used a voice actor to mimic her distinctive mezzo-soprano. But Greene would then have to show that enough listeners assume it's Greene's voice for it to affect either his reputation or his own opportunities to capitalize on it.

Mike Pesca, host of "The Gist" podcast and a former colleague of Greene's at NPR, said he has an ear for voices and a hobby of trying to identify the actors and celebrities behind voice-overs in TV commercials. The first time he heard NotebookLM, Pesca said, "I was immediately like, 'That's David Greene.'" Pesca said he first assumed that Google had intentionally trained the tool on Greene's voice and that Greene had been compensated. "If I was David Greene I would be upset, not just because they stole my voice," Pesca said, but because they used it to make the podcasting equivalent of AI "slop," a term for spammy, commodified content. "They have banter, but it's very surface-level, un-insightful banter, and they're always saying, 'Yeah, that's so interesting.' It's really bad, because what do we as show hosts have except our taste in commentary and pointing our audience to that which is interesting?"

Greene is not the first audio professional to complain that his voice was stolen. Numerous voice actors have been dismayed to hear voices that sound like them in various AI tools. But they face uphill battles in court, in part because they are generally not famous figures, even if their voices are familiar, and because many voice actor contracts license their voices for a wide range of uses.

Bills introduced in several states and in Congress have sought to regulate the use of people's voices in AI tools. Greene, however, is relying on long-standing state laws that give public figures certain rights to control how their own likenesses are monetized. Adam Eisgrau, who directs AI copyright policy for the center-left tech trade group Chamber of Progress, said he thinks those laws are sufficient to address cases like Greene's without passing new AI laws at the national level. "If a California jury finds that the voice of NotebookLM is fully Mr. Greene's, he may win," Eisgrau said via email. "If they find that it's got attributes he also possesses, but is fundamentally an archetypal anchorperson's tone and delivery it learned from a large dataset, he may not."

Greene said he isn't lobbying for new laws that would risk chilling innovation. He just thinks Google should have asked his permission before releasing a product based on a voice that he believes is essentially his. "I'm not some crazy anti-AI activist," he said. "It's just been a very weird experience."
[4]
'Hey, that's my voice!' Veteran broadcaster claims Google stole his voice for AI tool
Former NPR host David Greene is suing Google after accusing the tech giant of stealing his voice for use in one of its AI-powered tools. Greene, who presented NPR's Morning Edition for eight years until 2020 and now hosts the political podcast Left, Right & Center, told the Washington Post he was "completely freaked out" when he heard the voice used by Google's NotebookLM, a tool that summarizes documents and generates spoken audio overviews -- using a voice that sounds very much like his.

When friends and family started getting in touch to ask him if the voice was his, he decided to sue Google, accusing it of violating his rights by copying his voice for NotebookLM, without asking for his permission or offering any kind of compensation.

Google has denied any wrongdoing. "These allegations are baseless," a spokesperson for the company said, adding that the male voice in NotebookLM's audio overviews "is based on a paid professional actor Google hired." It has yet to reveal who that actor is.

Take a listen to the voice generated by NotebookLM in the video below (it runs for about eight seconds) and then listen to David Greene's voice in the video below that, and see what you think.

[Embedded clips: a sample of the NotebookLM voice, followed by a sample of David Greene's voice.]

Greene's case is the latest to highlight how AI is steadily upending the creative industries, and at the same time upsetting many of those working within them. It also brings to mind a similar case in May 2024 when the actor Scarlett Johansson accused OpenAI of replicating her voice for use as one of ChatGPT's voices for the chatbot's voice mode. Johansson said she had twice declined requests from OpenAI CEO Sam Altman to use her voice, and was shocked when the newly released Sky voice sounded "eerily" or "strikingly" similar to hers and that of her AI character in the 2013 movie Her, about a lonely man who falls in love with an advanced AI operating system called Samantha. Lawyers representing the actor demanded explanations about how the voice was created. OpenAI responded by removing the voice, claiming that it came from a different professional actress, not Johansson, and insisting that it was never intended to mimic her.

As for Greene, he also has concerns about how Google's NotebookLM tool -- using a voice that sounds very much like his -- can be used to spread the kind of conspiracy theories that he would never personally give any credence to, with some listeners possibly believing that he's doing just that. Unless some kind of settlement is reached beforehand, it'll be up to a California court to decide if Google has infringed on Greene's rights to his voice or likeness.
[5]
He spent decades perfecting his voice. Now he says Google stole it.
David Greene, the longtime host of NPR's Morning Edition, has filed a lawsuit against Google alleging the company replicated his distinctive voice for NotebookLM's AI podcast feature. Friends and colleagues alerted Greene to the uncanny resemblance, noting matching cadence, intonation, and speech patterns. Google denies the claims, stating the voice comes from a paid professional actor.
David Greene, the veteran public radio broadcaster who spent eight years hosting NPR's Morning Edition and currently hosts KCRW's political podcast Left, Right & Center, has filed a lawsuit against Google alleging the tech giant illegally used his voice for its NotebookLM AI podcast feature [1]. The case emerged after Greene received numerous messages from friends, family members, and colleagues asking whether he had licensed his voice to the company. "I was, like, completely freaked out," Greene told The Washington Post, describing the moment he first listened to NotebookLM's male podcast host [3]. The broadcaster alleges that Google replicated his distinctive voice without permission or compensation, violating his rights by creating a product that could make his voice say things he would never personally endorse.

The resemblance between Greene's natural speaking style and NotebookLM's AI podcast voice extends beyond basic vocal qualities to include specific speech patterns. Greene identified matching cadence and intonation, along with characteristic filler words like "uh" and "like" that he had worked over the years to minimize but never fully eliminated [3]. When he played the AI-generated audio for his wife, her immediate reaction confirmed his concerns. "My voice is, like, the most important part of who I am," Greene stated, emphasizing the deeply personal nature of the alleged infringement [1]. Google has firmly denied the allegations, with spokesperson José Castañeda stating: "These allegations are baseless. The sound of the male voice in NotebookLM's Audio Overviews is based on a paid professional actor Google hired" [3].
Google NotebookLM's Audio Overviews feature launched in 2024 and quickly became a sleeper hit for the search giant, allowing users to generate podcasts that summarize lengthy documents through conversational AI hosts [5]. The tool gained significant traction when Spotify integrated it into its December 2024 Spotify Wrapped feature, offering users personalized podcasts about their listening habits [3]. While Google hasn't disclosed user numbers, the feature emerged as a key product in its competition with rivals like ChatGPT maker OpenAI to capture consumer attention in the generative AI space. Online discussions have produced various theories about the AI podcast voice's origins, with several users naming Greene, though others have suggested resemblances to tech podcaster Leo Laporte or Armchair Expert co-host Dax Shepard [3].
This isn't the first dispute over AI voice replication involving major tech companies. In a notable precedent, OpenAI removed a ChatGPT voice after actress Scarlett Johansson complained it was an imitation of her own [1]. Johansson had twice declined requests from OpenAI CEO Sam Altman to use her voice and was shocked when the Sky voice sounded strikingly similar to her AI character in the 2013 movie Her [4]. OpenAI claimed the voice came from a different professional actress and was never intended to mimic Johansson, but removed it nonetheless. From political voicefakes to deepfake scam ads featuring a virtual Taylor Swift hawking Le Creuset cookware, issues surrounding AI voice replication are "going to come up a lot," according to James Grimmelmann, a professor of digital and information law at Cornell University [3].
The Santa Clara County, California, court handling Greene's case will need to determine whether the resemblance is uncanny enough that ordinary people hearing the voice would assume it's his, and if so, what remedies are appropriate [3]. Grimmelmann identified key questions for courts to resolve: how closely an AI voice or likeness must resemble the genuine article to count as infringing, whether Greene's voice is famous enough for ordinary people to recognize, and whether he's harmed by the resemblance [5]. These questions become particularly complex with AI voices, as software tools that compare voices are typically designed to find exact matches between real humans rather than synthetic ones. Greene's lawyer, Joshua Michelangelo Stein, a partner at Boies Schiller Flexner, which also represents book authors in a high-profile copyright lawsuit against Meta, argues the recordings make the resemblance clear [3].
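The forensic comparison cited in Greene's complaint (sources [3] and [5]) boils down to a speaker-similarity check: convert each recording into a fixed-length voice embedding and score how close the two embeddings are. The vendor and software behind the complaint's 53 to 60 percent confidence figure are not named in the reporting, so the sketch below is only a rough illustration of the general technique; the open-source resemblyzer library and the file names are assumptions, not anything from the lawsuit.

```python
# Rough, hypothetical sketch of a speaker-similarity check. This is NOT the
# unnamed forensic tool cited in Greene's complaint; "resemblyzer" and the
# file names below are assumptions chosen for illustration.
import numpy as np
from resemblyzer import VoiceEncoder, preprocess_wav  # pip install resemblyzer

def voice_similarity(path_a: str, path_b: str) -> float:
    """Return the cosine similarity between two speakers' voice embeddings.

    Scores near 1.0 mean the voices sound very alike; scores near zero or
    below suggest different speakers.
    """
    encoder = VoiceEncoder()  # pretrained speaker-embedding (d-vector) model
    emb_a = encoder.embed_utterance(preprocess_wav(path_a))
    emb_b = encoder.embed_utterance(preprocess_wav(path_b))
    # resemblyzer returns L2-normalized embeddings, so the dot product is
    # the cosine similarity.
    return float(np.dot(emb_a, emb_b))

if __name__ == "__main__":
    # Hypothetical file names standing in for a Greene broadcast clip and a
    # NotebookLM Audio Overview clip.
    score = voice_similarity("greene_broadcast.wav", "notebooklm_male_host.wav")
    print(f"speaker similarity: {score:.2f}")
```

A score like this only measures how alike two voices sound; on its own it cannot show that a particular person's recordings were used to train a model, which is part of why Grimmelmann describes such comparisons as thorny when one of the voices is synthetic.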
The case highlights fundamental tensions between individual human creators and a booming AI industry that promises to transform the economy by generating lifelike speech, prose, images, and videos on demand [3]. Behind the artificial voices in NotebookLM and similar tools are language models trained on vast libraries of writing and speech by real humans who were never told their words and voices would be used in that way, raising profound questions of copyright and ownership [5]. Greene's concerns extend beyond financial compensation to include how the tool could be used to spread conspiracy theories using a voice that sounds like his, potentially misleading listeners into believing he endorses content he would never support [4]. As voice actors and broadcasters watch this case unfold, the outcome could establish critical precedents for how voice rights and likeness protections apply in the age of generative AI, determining whether tech companies can continue training models on human voices without explicit permission or compensation.

Summarized by Navi
Policy and Regulation
Business and Economy
Technology