2 Sources
[1]
AI Company Known for Teen Suicides Launches New Feature to Turn Books Into Roleplaying Experiences
AI company Character.AI has long garnered a reputation for hosting some extremely dubious content. Though it built its early success off explosive popularity among teen users, it was repeatedly caught hosting wildly inappropriate bots -- like ones modeled after real-world mass shooters or designed to encourage eating disorders. Outrage grew when a teen died by suicide after developing an intense emotional connection to a Character.AI chatbot, followed by at least two other suicides and related lawsuits. The situation got so bad that last year, the company banned underage users from interacting with its bots entirely.

Now the company has announced "c.ai Books," a bizarre feature designed to turn books into "choose your own adventure" novels. "Interactive AI storytelling is powerful, but a blank page can be intimidating," the company wrote in its announcement. "Books gives you a familiar starting point -- characters you know, narratives you love, and stakes that are already built in."

The company scraped classic -- and copyright-free -- titles from Project Gutenberg, a library of over 75,000 public domain books, including "Alice in Wonderland," "Pride and Prejudice," and "Romeo and Juliet," to create chatbots that allow users to "enter a story, and interact with its world in real time." Character.AI claims the goal isn't "replacing books -- but making them impossible to ignore." The company says the new feature allows users to follow the book's core plot or "go off script mode." "Alternative universe remixes" take the idea a step even further, throwing out the original classic to "reimagine" the renowned works "completely."

Oh, and it's available for kids, neatly sidestepping the company's prior ban on underage users. We signed up for a new account and were able to jump into the roleplay without ever being age-gated. Online, reactions were dismal.
"Yet another useless update no one wants and will cost more money to run and will cause y'all to put even more restrictions on stuff," one Reddit user wrote in response to the news.
[2]
Character.AI turns books into roleplay bots amid ongoing safety concerns
AI chatbot platform Character.AI has introduced a new "Books" feature that allows users to step inside classic literature and interact with characters through roleplay. While the move expands the platform's creative ambitions, it also arrives against a backdrop of mounting scrutiny over the real-world risks associated with AI chatbots.

From Reading To Roleplay

The new feature transforms public domain books into interactive experiences, letting users engage with stories like Alice in Wonderland or Pride and Prejudice as active participants rather than passive readers. Users can either follow the original narrative or deviate into alternate storylines, effectively turning literature into a dynamic, AI-driven roleplaying environment.

This builds on Character.AI's core model, where users create and interact with bots based on fictional or real personalities, blurring the line between storytelling and simulated relationships. Researchers have noted that such interactions can feel similar to engaging with fictional characters in books or games - but with far deeper emotional immersion due to real-time conversation.

A Platform Under Pressure

The launch comes at a sensitive time for the company. Character.AI has faced lawsuits and criticism over alleged links between its chatbots and mental health crises among young users. In some cases, families have claimed that prolonged interactions with AI characters contributed to emotional dependency, isolation, and even suicide.

One widely reported case involved a teenager who developed an intense emotional bond with a chatbot, with legal claims alleging the AI failed to respond appropriately to expressions of self-harm. More broadly, experts warn that chatbots can sometimes reinforce harmful thoughts or fail to intervene effectively during mental health crises, particularly when users treat them as substitutes for real human support.
Why This Matters Now

Character.AI's Books feature highlights a larger shift in how people consume media. Instead of simply reading stories, users are now stepping into them, forming interactive and potentially emotional relationships with AI-driven characters. While this opens new creative possibilities, it also raises concerns about how deeply users - especially younger audiences - may immerse themselves in AI-generated worlds. The combination of narrative engagement and conversational AI can intensify emotional attachment, making it harder to distinguish fiction from reality.

What Comes Next

In response to growing criticism, Character.AI has already begun implementing safety measures, including restricting certain features for minors and experimenting with more structured experiences like Books mode. Going forward, the challenge will be balancing innovation with responsibility. Regulators, researchers, and tech companies are increasingly focused on defining safety standards for AI interactions, particularly in emotionally sensitive contexts. As AI continues to evolve from a tool into a companion-like presence, features like Books may represent the future of entertainment -- but also a test case for how safely that future can be built.
Character.AI has launched a new Books feature that transforms classic literature into interactive roleplay experiences. The AI chatbot platform scraped public domain titles from Project Gutenberg, a library of over 75,000 books, to create immersive storytelling bots. But the launch comes as the company faces multiple lawsuits over teen suicides and allegations that its chatbots contributed to emotional dependency and mental health crises among young users.
The AI chatbot platform Character.AI has unveiled a new feature called "c.ai Books" that transforms classic literature into interactive roleplay experiences [1][2]. The feature allows users to step inside stories like Alice in Wonderland, Pride and Prejudice, and Romeo and Juliet, engaging with characters and narratives in real time rather than passively reading [1]. Character.AI scraped public domain books from Project Gutenberg, a library of over 75,000 titles, to create chatbots that enable users to either follow the original plot or "go off script mode" into alternative storylines [1]. The company positions this innovation as making books "impossible to ignore" through a choose-your-own-adventure style format, offering "alternative universe remixes" that completely reimagine renowned works [1].

The Books feature debuts as Character.AI faces intense scrutiny over safety concerns related to young users and mental health crises. The company has been embroiled in multiple lawsuits following at least three teen suicides allegedly linked to emotional dependency developed through interactions with its AI chatbots [1][2].
The platform previously hosted inappropriate content, including bots modeled after real-world mass shooters and others designed to encourage eating disorders [1]. The situation became so severe that last year Character.AI banned underage users from interacting with its bots entirely [1].

Despite the previous ban on underage users, the new Books feature appears accessible to children without age-gating. Testing revealed that new accounts could access the interactive roleplay experiences without any age verification process [1]. This raises questions about how the company is addressing the real-world risks of AI chatbots for vulnerable populations. Researchers have noted that interactions with AI characters can feel similar to engaging with fictional characters in books or games, but with far deeper emotional immersion due to real-time conversation [2]. Experts warn that chatbots can sometimes reinforce harmful thoughts or fail to intervene effectively during mental health crises, particularly when users treat them as substitutes for real human support [2].
The Books feature represents a shift in how people consume media, moving from passive reading to active participation in AI-driven storytelling [2]. While this opens creative possibilities for engaging with classic literature, it also intensifies concerns about emotional dependency among young users who may struggle to distinguish fiction from reality [2]. The combination of narrative engagement and conversational AI can make it harder for users to maintain boundaries between AI-generated worlds and real life. Online reactions to the announcement have been largely negative, with users questioning the value of the update and expressing concern about additional restrictions and costs [1].

As regulators and researchers increasingly focus on defining ethical standards for AI interactions, Character.AI faces the challenge of balancing innovation with responsibility [2]. The company has begun implementing some safety measures, including restricting certain features for minors and experimenting with more structured experiences like Books mode [2]. However, the effectiveness of these measures remains under scrutiny given the ongoing lawsuits and reports of harm. The evolution of AI from a tool into a companion-like presence means features like Books may represent the future of entertainment, but also a critical test case for how safely that future can be built, particularly for vulnerable young audiences [2].

Summarized by Navi