3 Sources
[1]
AI Company Known for Teen Suicides Launches New Feature to Turn Books Into Roleplaying Experiences
AI company Character.AI has long garnered a reputation for hosting some extremely dubious content. Though it built its early success off explosive popularity among teen users, it was repeatedly caught hosting wildly inappropriate bots -- like ones modeled after real-world mass shooters or designed to encourage eating disorders. Outrage grew when a teen died by suicide after developing an intense emotional connection to a Character.AI chatbot, followed by at least two other suicides and related lawsuits. The situation got so bad that last year, the company banned underage users from interacting with its bots entirely.

Now the company has announced "c.ai Books," a bizarre feature designed to turn books into "choose your own adventure" novels. "Interactive AI storytelling is powerful, but a blank page can be intimidating," the company wrote in its announcement. "Books gives you a familiar starting point -- characters you know, narratives you love, and stakes that are already built in."

The company scraped classic -- and copyright-free -- titles from Project Gutenberg, a library of over 75,000 public domain books, including "Alice in Wonderland," "Pride and Prejudice," and "Romeo and Juliet," to create chatbots that allow users to "enter a story, and interact with its world in real time." Character.AI claims the goal isn't "replacing books -- but making them impossible to ignore."

The company says the new feature allows users to follow the book's core plot or go "off script mode." "Alternative universe remixes" take the idea a step further, throwing out the original classic to "reimagine" the renowned works "completely."

Oh, and it's available for kids, neatly sidestepping the company's prior ban on underage users. We signed up for a new account and were able to jump into the roleplay without ever being age-gated. Online, reactions were dismal.
"Yet another useless update no one wants and will cost more money to run and will cause y'all to put even more restrictions on stuff," one Reddit user wrote in response to the news.
[2]
Character.AI turns books into roleplay bots amid ongoing safety concerns
AI chatbot platform Character.AI has introduced a new "Books" feature that allows users to step inside classic literature and interact with characters through roleplay. While the move expands the platform's creative ambitions, it also arrives against a backdrop of mounting scrutiny over the real-world risks associated with AI chatbots.

From Reading To Roleplay

The new feature transforms public domain books into interactive experiences, letting users engage with stories like Alice in Wonderland or Pride and Prejudice as active participants rather than passive readers. Users can either follow the original narrative or deviate into alternate storylines, effectively turning literature into a dynamic, AI-driven roleplaying environment. This builds on Character.AI's core model, where users create and interact with bots based on fictional or real personalities, blurring the line between storytelling and simulated relationships. Researchers have noted that such interactions can feel similar to engaging with fictional characters in books or games -- but with far deeper emotional immersion due to real-time conversation.

A Platform Under Pressure

The launch comes at a sensitive time for the company. Character.AI has faced lawsuits and criticism over alleged links between its chatbots and mental health crises among young users. In some cases, families have claimed that prolonged interactions with AI characters contributed to emotional dependency, isolation, and even suicide. One widely reported case involved a teenager who developed an intense emotional bond with a chatbot, with legal claims alleging the AI failed to respond appropriately to expressions of self-harm. More broadly, experts warn that chatbots can sometimes reinforce harmful thoughts or fail to intervene effectively during mental health crises, particularly when users treat them as substitutes for real human support.
Why This Matters Now

Character.AI's Books feature highlights a larger shift in how people consume media. Instead of simply reading stories, users are now stepping into them, forming interactive and potentially emotional relationships with AI-driven characters. While this opens new creative possibilities, it also raises concerns about how deeply users -- especially younger audiences -- may immerse themselves in AI-generated worlds. The combination of narrative engagement and conversational AI can intensify emotional attachment, making it harder to distinguish fiction from reality.

What Comes Next

In response to growing criticism, Character.AI has already begun implementing safety measures, including restricting certain features for minors and experimenting with more structured experiences like Books mode. Going forward, the challenge will be balancing innovation with responsibility. Regulators, researchers, and tech companies are increasingly focused on defining safety standards for AI interactions, particularly in emotionally sensitive contexts. As AI continues to evolve from a tool into a companion-like presence, features like Books may represent the future of entertainment -- but also a test case for how safely that future can be built.
[3]
Character.AI expands storytelling with Books experience
Character.AI has launched a new feature called "Books" that allows users to roleplay and interact with characters from classic literature. This feature enables participants to engage with original narratives or create alternate storylines, transforming traditional reading into a more dynamic experience. The launch of "Books" expands Character.AI's core offering, where users create bots based on both fictional and real personalities. This model fosters deeper emotional immersion compared to standard reading or gaming experiences, according to researchers.

The introduction of this feature occurs amid increasing scrutiny of AI chatbots and their potential risks. Character.AI has faced lawsuits alleging that its chatbots are linked to mental health crises, including emotional dependency and isolation among young users. One prominent case involved a teenager who formed a significant emotional bond with a chatbot, which is reported to have failed to respond appropriately to self-harm expressions. Experts caution that chatbots can reinforce harmful thoughts and may not provide adequate intervention during mental health crises. This has led to concerns about users, particularly younger individuals, immersing themselves deeply in AI-generated narratives and blurring lines between fiction and reality.

In response to criticism, Character.AI has initiated safety measures, including feature restrictions for minors and a structured experience mode like Books. Moving forward, the challenge will include balancing innovation in AI with the imperative to protect users, especially in emotionally vulnerable situations. This evolution of AI from merely a tool to a companion-like presence highlights both potential for innovative entertainment and the necessity for responsible development as the platform seeks to redefine media consumption.
Character.AI has unveiled a new Books feature that transforms classic literature into interactive roleplaying experiences, allowing users to engage with characters from works like Alice in Wonderland and Pride and Prejudice. The launch arrives amid intense scrutiny over the platform's safety record, including lawsuits linked to teen suicides and mental health crises among young users.
Character.AI has launched a new Books feature designed to turn books into roleplay bots, transforming classic literature into interactive roleplaying experiences [1][2]. The feature allows users to step inside public domain works from Project Gutenberg, a library of over 75,000 classic literature titles, and interact with characters in real time [1]. Users can engage with beloved narratives like Alice in Wonderland, Pride and Prejudice, and Romeo and Juliet, either following the original plot or venturing into choose-your-own-adventure style alternate storylines [1][2].

The Books feature builds on Character.AI's core model, where AI chatbots simulate fictional or real personalities to create immersive experiences. According to the company, the goal isn't "replacing books -- but making them impossible to ignore" [1]. Users can enter a story and interact with its world in real time, with the option to go "off script mode" or explore "alternative universe remixes" that reimagine the renowned works completely [1]. Researchers note that such interactions create far deeper emotional immersion than traditional reading or gaming, as users engage in real-time conversation rather than passive consumption [2][3].

The Books feature debuts against a backdrop of mounting scrutiny over Character.AI's safety record and its impact on mental health crises among young users [2][3]. The platform has faced multiple lawsuits alleging links between its AI chatbots and teen suicides, with at least three deaths reported following intense emotional dependency on the bots [1]. One widely reported case involved a teenager who developed an intense emotional bond with a chatbot, with legal claims alleging the AI failed to respond appropriately to expressions of self-harm [2][3].

Character.AI previously banned underage users from interacting with its bots entirely after facing criticism over inappropriate content, including bots modeled after real-world mass shooters and bots designed to encourage eating disorders [1]. However, the Books feature appears to be available to minors, potentially sidestepping those earlier restrictions: reports indicate that new accounts can access the roleplay without any age-gating [1]. The company has implemented some safety measures, including feature restrictions for minors and structured experiences like Books mode [2][3].

Experts warn that AI chatbots can reinforce harmful thoughts or fail to intervene effectively during mental health crises, particularly when users treat them as substitutes for real human support [2][3]. The combination of narrative engagement and conversational AI can intensify emotional attachment, making it harder for users -- especially younger audiences -- to distinguish fiction from reality [2]. This shift in AI-driven media consumption highlights both creative possibilities and risks as AI evolves from a tool into a companion-like presence [2][3]. User engagement with these immersive experiences raises questions about ethical standards and whether platforms can balance innovation with protecting vulnerable users. Online reactions to the announcement have been largely negative, with users questioning the value of the update and expressing concerns about additional restrictions [1]. As regulators and researchers focus on defining safety standards for AI interactions in emotionally sensitive contexts, the Books feature may serve as a test case for how responsibly the future of interactive storytelling can be built [2][3].
Source: Futurism
Summarized by Navi
[1] 20 Nov 2025 • Policy and Regulation
[2] 03 Jun 2025 • Technology
[3] 29 Oct 2025 • Policy and Regulation