4 Sources
[1]
A writer is suing Grammarly for turning her and other authors into 'AI editors' without consent | TechCrunch
Grammarly released a controversial feature last week that uses AI to simulate editorial feedback, making it seem like you're getting a critique from novelist Stephen King, the late scientist Carl Sagan, or tech journalist Kara Swisher. But Grammarly did not get permission from the hundreds of experts it included in this feature, called "Expert Review," to use their names.

One of the affected writers, journalist Julia Angwin, has filed a class action lawsuit against Superhuman, the parent company that owns Grammarly, arguing that the company violated the privacy and publicity rights of her and the other writers it impersonated. A class action lawsuit allows writers to join Angwin in her case. "I have worked for decades honing my skills as a writer and editor, and I am distressed to discover that a tech company is selling an imposter version of my hard-earned expertise," Angwin said in a statement. The situation is more than a little ironic -- Angwin has spent her career leading investigations into tech companies' impacts on privacy. Other critics of this kind of technology, like renowned AI ethicist Timnit Gebru, were also included in Grammarly's "expert review."

The "expert review" feature, available only to subscribers paying $144 a year, predictably fails to deliver on the promise of thoughtful feedback. Casey Newton, the founder and editor of the tech newsletter Platformer and another person impersonated by Grammarly, fed one of his articles into the tool and got feedback from Grammarly's approximation of tech journalist Kara Swisher. Grammarly's imitation of Swisher produced "feedback" so generic that it raises the question of why the company would go through the rigmarole of using these writers' likenesses in the first place. Here is what Grammarly's approximation of Kara Swisher told him: "Could you briefly compare how daily AI users versus AI skeptics articulate risk, creating a through-line readers can follow?"
Newton relayed the message from the AI approximation of Kara Swisher to the actual, real human being, Kara Swisher. "You rapacious information and identity thieves better get ready for me to go full McConaughey on you," Swisher texted Newton (referring to Grammarly). "Also, you suck."

Grammarly has since disabled the "expert review" feature, according to a LinkedIn post by Superhuman CEO Shishir Mehrotra. While Mehrotra offered an apology, he continued to defend the idea of the feature. "Imagine your professor sharpening your essay, your sales leader reshaping a customer pitch, a thoughtful critic challenging your arguments, or a leading expert elevating your proposal," he wrote. "For experts, this is a chance to build that same ubiquitous bond with users, much like Grammarly has."
[2]
Grammarly pulls AI tool mimicking Stephen King and other writers
Writing tool Grammarly has disabled an AI feature which mimicked personas of prominent writers, including Stephen King and scientist Carl Sagan, following a backlash from people impersonated, including a multi-million dollar lawsuit. The Expert Review function, which offered writing feedback "inspired by" the styles of famous authors and academics, was taken down this week by Superhuman, the tech firm which runs Grammarly. The feature was met with resistance from writers who found their names and reputations used as "AI personas" without their consent. Shishir Mehrotra, the firm's chief executive, apologised on LinkedIn, acknowledging the tool had "misrepresented" the voices of experts.

Julia Angwin, an investigative journalist whose persona was one of those used in the feature, has filed a class-action federal lawsuit in the US against Superhuman and Grammarly. Writing on social media, Angwin said: "I'm suing Grammarly over its paid AI feature that presented editing suggestions as if they came from me - and many other writers and journalists - without consent." According to legal filings cited by Wired, the action was launched on Wednesday in the Southern District of New York. It states Angwin, on behalf of herself and others in a similar position, "challenges Grammarly's misappropriation of the names and identities of hundreds of journalists, authors, writers, and editors to earn profits for Grammarly and its owner, Superhuman". The lawsuit argues it is "unlawful to appropriate peoples' names and identities for commercial purposes," and seeks to stop the platform from attributing advice to experts that they "never gave". The damages sought exceed $5m (Β£3.7m), Wired reports.

Grammarly was founded in 2009 as a writing-review tool and began integrating a suite of generative-AI tools in August 2025. Part of this was the Expert Review function, which appears to have launched without the famous named personas, which were introduced later.
Although the company began rebranding to Superhuman in October, Grammarly was kept as the name of its main service. As criticism mounted in recent days, Superhuman initially said it would maintain the feature but allow those named to "opt-out", according to The Verge. Wes Fenlon, a gaming journalist whose persona was used in the tool, wrote on BlueSky: "Opt-out via email is a laughably inadequate recourse for selling a product that verges on impersonation and profits on unearned credibility." Mehrotra said in response to the backlash: "Over the past week, we received valid critical feedback from experts who are concerned that the agent misrepresented their voices. This kind of scrutiny improves our products, and we take it seriously." He said the AI agent had drawn on "publicly available information from third-party LLMs to surface writing suggestions inspired by the published work of influential voices". The firm's chief executive apologised, adding: "We hear the feedback and recognize we fell short on this."
[3]
Grammarly removes AI feature which used real authors' identities, faces class action lawsuit
Grammarly has pulled its AI-powered Expert Review feature after being called out for using journalists' and authors' identities without permission. The writing assistant software is now facing a class action lawsuit accusing it of exploiting writers' names for its own profit.

Launched alongside seven other AI agents last August, Expert Review was available on Grammarly's Free and $12 Pro plans at launch, and was promoted as providing users with feedback on the content of their writing. A page on Grammarly's website which has since been taken down stated that Expert Review "[drew] on insights from subject-matter experts and trusted publications," and provided AI-generated feedback "based on publicly available expert content" (via Wayback Machine). Users could even personalise which "expert" sources Grammarly drew from by selecting the names of specific authors. "Expert Review agent offers subject-matter expertise and personalized, topic-specific feedback to elevate writing that meets rigorous academic or professional standards tailored to the user's field," Grammarly wrote in its blog post announcing the feature.

Grammarly's Expert Review came to attention last week after Wired reported that the feature was offering AI-generated edits in the name of real writers and academics, both living and dead. The tool's user guide does provide the disclaimer that its references to experts "are for informational purposes only and do not indicate any affiliation with Grammarly or endorsement by those individuals or entities." However, the same page also claims that Expert Review offers "insights from leading professionals, authors, and subject-matter experts." Many such subject-matter experts have not taken kindly to Grammarly using their identities without their knowledge or consent.
"[Grammarly] curated a list of real people, gave its models free rein to hallucinate plausible-sounding advice on their behalf, and put it all behind a subscription," wrote Platformer founder Casey Newton, who was among those invoked by Grammarly. "That's a deliberate choice to monetize the identities of real people without involving them, and it sucks."

"This has got to be some kind of defamation or something," historian Mar Hicks posted to Bluesky, having shared a screenshot of their identity being included in Expert Review. "You can't just steal people's IP and then pretend they're saying something they never said."

Responding to the backlash, Grammarly told Platformer on Monday that it would allow writers to email them to opt out of inclusion in its Expert Review feature. This prompted further criticism, as experts were not told that Grammarly was using their identity, nor had they granted it permission in the first place. Impacted authors wouldn't know that they needed to opt out unless a Grammarly user saw their name while using Expert Review and informed them. Further, providing the option to opt out did not address Grammarly's use of dead authors' identities. Deceased writers used by Expert Review reportedly included astronomer Carl Sagan and intersectional academic bell hooks.

"So Grammerly [sic] is violating the memory of bell hooks AND making AI versions of the rest of us before we're even dead," wrote researcher Sarah J. Jackson. "Someone tell me who to sue, not even joking."

Shishir Mehrotra, CEO of Grammarly developer Superhuman, subsequently announced on Wednesday that the company was pulling Expert Review offline. However, he also indicated that the company intends to eventually bring it back in some form. "Over the past week, we received valid critical feedback from experts who are concerned that the agent misrepresented their voices," Mehrotra posted to LinkedIn.
"As context, the agent was designed to help users discover influential perspectives and scholarship relevant to their work, while also providing meaningful ways for experts to build deeper relationships with their fans. We hear the feedback and recognize we fell short on this. I want to apologize and acknowledge that we'll rethink our approach going forward.

"After careful consideration, we have decided to disable Expert Review while we reimagine the feature to make it more useful for users, while giving experts real control over how they want to be represented -- or not represented at all."

"That this even existed in the first place suggests a total disconnect from normal human society," climate writer Ketan Joshi replied to Mehrotra's post. "It should've been immediately obvious that this was exploitative and creepy and cruel."

"With all the talk about how AI 'builds from' (read: 'steals') existent content, creating a tool that actually makes up 'advice' from real people who spend their lives caring about writing and expertise... it's hard to fathom," wrote the New York Times' Dan Saltzstein. "There should be consequences to this beyond 'we're going to reevaluate.' A promise to never do anything like this again, at minimum."

Though Grammarly has made no such pledge at present, it is already facing repercussions for its actions that go beyond reputational damage. New York Times writer Julia Angwin filed a class action lawsuit against Superhuman on Wednesday, having discovered that Grammarly's Expert Review had used her identity without her consent. The law firm representing her, Peter Romer-Friedman Law PLLC, has put out a call for any writers who were impacted to join the class action. Though it isn't clear exactly how many writers' identities Grammarly allegedly misappropriated, it may be a sizable cohort.
Looking at tech journalists alone, The Verge reports that Expert Review named several members of its editorial staff, as well as writers from Wired, Bloomberg, The New York Times, The Atlantic, PC Gamer, Gizmodo, Digital Foundry, Tom's Guide, and Mashable's sister sites IGN and Rock Paper Shotgun. Angwin has claimed that "lots of folks" have already made inquiries about joining the lawsuit. "I'm taking this action on behalf of not just myself, but everyone who spent years and decades refining their skills as a writer and editor, only to find an AI impersonating them," Angwin wrote in a LinkedIn post. "For over 100 years, New York law has prohibited companies from using a person's name for commercial purposes without their consent," said Peter Romer-Friedman of Peter Romer-Friedman Law PLLC. "The law does not provide an exception for technology companies or AI." Filed in a New York District Court, the class action is seeking damages as well as an injunction to prevent Grammarly from using writers' identities without their consent. Mashable has reached out to Superhuman for comment.
[4]
Grammarly Shuts Down 'Expert Review' AI After Backlash Over Fake Author Feedback
Grammarly Pulls AI Tool That Mimicked Feedback From Real Writers After Public Criticism

Grammarly has disabled its 'Expert Review' AI feature after receiving massive backlash from writers and journalists who said the tool generated feedback that appeared to come from real authors without their consent. The feature provided users with writing advice as if it were coming from well-known experts, quickly raising concerns about misrepresentation and identity misuse. Critics argued that the feature could mislead users by making AI-generated feedback appear to have been written by real professionals. The company confirmed that the feature has been taken down, and said it is reviewing the feature's design and considering changes to ensure greater transparency in the future.
Grammarly pulled its Expert Review feature after widespread AI backlash from writers whose identities were used without permission. Journalist Julia Angwin filed a class action lawsuit seeking over $5 million in damages, arguing the company violated publicity rights by selling fake author feedback. The tool simulated editorial advice from Stephen King, Carl Sagan, and hundreds of others.
Grammarly has disabled its controversial Expert Review feature following intense criticism and legal action from writers whose identities were used without permission [1]. The AI tool promised to deliver editorial feedback as if it came from renowned figures like novelist Stephen King, the late scientist Carl Sagan, or tech journalist Kara Swisher. But the company never obtained consent from the hundreds of experts it included in the feature [1].
Journalist Julia Angwin has filed a class action lawsuit against Superhuman, the parent company that owns Grammarly, challenging what she describes as the misappropriation of the names and identities of hundreds of journalists, authors, writers, and editors for commercial purposes [2]. The lawsuit, launched in the Southern District of New York, seeks damages exceeding $5 million and aims to stop the platform from attributing advice to experts that they never gave [2].

"I have worked for decades honing my skills as a writer and editor, and I am distressed to discover that a tech company is selling an imposter version of my hard-earned expertise," Angwin said in a statement [1]. The situation carries particular irony, as Angwin has spent her career investigating tech companies' impacts on privacy. Other critics of this technology, including renowned AI ethicist Timnit Gebru, were also included in the feature [1].
The Expert Review function, available only to subscribers paying $144 a year, drew on "publicly available information from third-party LLMs to surface writing suggestions inspired by the published work of influential voices," according to Shishir Mehrotra, Superhuman CEO [2]. Launched in August 2025 as one of eight AI agents, the feature was promoted on Grammarly's Free and $12 Pro plans [3].

The feature failed to deliver on its promise of thoughtful feedback. Casey Newton, founder and editor of tech newsletter Platformer and another person impersonated by Grammarly, tested the tool and received feedback so generic it raised questions about why the company would use these writers' likenesses at all [1]. When Newton shared the AI-generated feedback with the real Kara Swisher, she responded: "You rapacious information and identity thieves better get ready for me to go full McConaughey on you. Also, you suck" [1].

"[Grammarly] curated a list of real people, gave its models free rein to hallucinate plausible-sounding advice on their behalf, and put it all behind a subscription," Newton wrote. "That's a deliberate choice to monetize the identities of real people without involving them, and it sucks" [3].
As criticism mounted, Superhuman initially said it would maintain the feature but allow those named to opt out via email [2]. This response prompted further backlash, as impacted authors weren't informed that Grammarly was using their identity and hadn't granted permission in the first place. Gaming journalist Wes Fenlon, whose persona was used in the tool, wrote: "Opt-out via email is a laughably inadequate recourse for selling a product that verges on impersonation and profits on unearned credibility" [2].

The opt-out approach also failed to address the misrepresentation of deceased authors, including astronomer Carl Sagan and intersectional academic bell hooks [3]. Researcher Sarah J. Jackson wrote: "So Grammerly [sic] is violating the memory of bell hooks AND making AI versions of the rest of us before we're even dead. Someone tell me who to sue, not even joking" [3].

Mehrotra announced Wednesday that the company was disabling Expert Review while it reimagines the feature "to make it more useful for users, while giving experts real control over how they want to be represented -- or not represented at all" [3]. He apologized and acknowledged the agent had "misrepresented" the voices of experts [2]. However, his statement indicated that Superhuman intends to eventually bring the feature back in some form, raising questions about what safeguards will be implemented [3].

Grammarly was founded in 2009 as a writing-review tool and began integrating generative AI tools in August 2025 [2]. The company began rebranding to Superhuman in October, though Grammarly was kept as the name of its main service [2]. The lawsuit argues it is unlawful to appropriate people's names and identities for commercial purposes without consent [2], setting a potential precedent for how AI companies can use public figures' identities in their products.
Summarized by Navi