Curated by THEOUTPOST
On Thu, 18 Jul, 12:02 AM UTC
3 Sources
[1]
AI App That Claimed to Detect STDs From Dick Pics Shuts Down After It Turns Out It Doesn't Work at All
An AI-powered app called Calmara promised users a quick and easy way to check for STDs. All you had to do was take a dick pic, send it in, and with the wonders of AI and science, you'd get a diagnosis -- "on the spot." If all that sounds extremely suspect, that's because it is.

HeHealth, the company behind the app, has shut it down after the Federal Trade Commission conducted an inquiry, The Verge reports, finding that the "clear, science-backed answers" the app supposedly offered customers about their partner's "sexual health status" were anything but. "The FTC is so committed to protecting consumers that it is even willing to wade through pages of dick pics to protect Americans from AI scammers," an anonymous source familiar with the matter told The Verge.

One of HeHealth's boldest claims, the one that got it caught by the long schlong of the law, was that Calmara had up to 94.4 percent accuracy at detecting over 10 different sexually transmitted infections, among them syphilis, herpes, and HPV, according to a recent FTC letter. These capabilities were supposedly backed by a study published in a prestigious health journal. But as the FTC discovered, the app makers weren't being honest about its findings.

For one, four out of five of the study's authors either worked for HeHealth or were paid consultants. As for the study itself, not only were some of the images used to train the AI detection model provided by people who never got an actual diagnostic test to confirm their condition, but HeHealth only assessed its app's capabilities on a "small" sample size, according to the FTC. Moreover, the study discloses that the AI was only trained and tested on four STIs -- not ten, as Calmara claimed. Citing these findings, the FTC issued HeHealth a subpoena known as a civil investigative demand last month.
With the writing on the wall -- and the FTC breathing down its neck -- HeHealth decided to shut down Calmara by July 15, and said it would delete all customer data received through the app, dick pics included.

It's been a long time coming, as the app's shortcomings were clear before the FTC got involved. Whatever glowing press it got early in the year was soon met by a deluge of damning reporting. Among the latter, an April investigation by the Los Angeles Times found that the app couldn't reliably distinguish between real penises and phallic objects, including a penis-shaped cake. It also found that Calmara struggled to identify explicit images of STIs in textbooks.

With such a dubious track record, other facets of the app's marketing, like calling it "your intimate bestie for unprotected sex," are even more distressing. As The Verge notes, HeHealth tried to sell the app to women as a way to check their dates -- which is a consent nightmare waiting to happen. No doubt one of the reasons HeHealth got away with it for as long as it did was that it pinned its app's capabilities on the nebulous powers of an AI model -- as good an example as any that we're in an age of AI quackery.
[2]
Dating App That Used AI to Screen for STDs Gets Shut Down
It turns out that artificial intelligence may not be the greatest judge of your penis's character. A dating app that claimed it could use artificial intelligence to screen potential mates for "sexual health" problems (i.e., STDs) has been shut down after the Federal Trade Commission caught wind of the company's dubious claims.

A company called HeHealth previously promoted what it called an "AI-powered sexually transmitted infection ('STI') detection application," the likes of which hilariously asked male users to send in dick pics so that they could be screened for social diseases. The app, dubbed Calmara, would then use the magic of AI to assess whether the male member was healthy. HeHealth claimed that Calmara could detect as many as 10 different sexually transmitted diseases with up to 94 percent accuracy and, on its website, described its application as a "1 min AI-Powered Penis Health Checker."

As you can imagine, there were problems with Calmara. For one thing, most STDs are invisible to the naked eye. For another, the app doesn't appear to have worked very well. Indeed, a previous investigation by the Los Angeles Times found that the app "struggled to distinguish between inanimate objects and human genitals, issuing a cheery 'Clear!' to images of both a novelty penis-shaped vase and a penis-shaped cake." Yeah, that's not great.

After the FTC noticed HeHealth's less-than-optimal test results, it opened an investigation into the company. In a letter sent to HeHealth and subsequently made public, the agency notes that the government "requires companies to have competent and reliable scientific evidence when making health-related claims," and that "substantiation for HeHealth's" claims about its own services appeared to be "problematic for several reasons."
Among other things, the agency found that HeHealth had tested the performance of its AI algorithm using "a relatively small number of images," and that while the company claimed to be able to test for 10 STIs, the study associated with the algorithm covered only four diseases. The FTC subsequently forced HeHealth to shut down Calmara and delete all customer data that had been obtained through it. The app's troubles were first spotted by The Verge. Following the shuttering of the app, the FTC said it would not continue with its investigation into the company. Gizmodo reached out to HeHealth for comment and will update this story if it responds.

It's probably just as well that Calmara is dead. Speaking as someone who has written about cybersecurity for over five years now, it seems somewhat unwise for a company to be a walking dick pic database. That seems like a data breach (and, later, a class action lawsuit) just waiting to happen.
[3]
The most controversial dating app in history powered by AI has just closed its doors - Softonic
Artificial intelligence was supposed to take us to new heights of greatness, they said. The future was here, they said, and all we had to do was grab it and let ourselves be carried down the fantastic highway of conveniences to come. And yet here we are, talking about Calmara, an app where you could upload photos of your penis to find out whether you had a sexually transmitted disease or were completely healthy. Spoiler: everything went to hell very quickly.

Calmara was announced as an app that could instantly reveal the sexual health status of your partner -- or your future partner -- based on photos of their genitals. Yes, just as it sounds. Its system was supposedly capable of detecting, with up to 94 percent reliability, more than ten possible STDs. There was one problem on top of the obvious privacy issue: photos were being uploaded without the other person's permission. Say you want to sleep with Mike tonight, but you're not sure it will be completely safe. Well, take an intimate photo he has sent you and upload it to Calmara -- without your sexual partner's permission. It's not hard to see the underlying problem here.

But it gets worse: of the ten diseases the app claimed to diagnose, the study behind it only covered four, and the data used to train the AI model included images from people who never even had their condition confirmed by a real test. Want even more problematic details? Here you go: Calmara's slogan was, for months, "your best friend for unprotected sex"... and the app couldn't even distinguish between penises and inanimate objects. In the end, after the investigation, its developer HeHealth agreed to shut it down forever and delete all the data. It's not that we weren't prepared for the future: it's just that this was absolutely ridiculous.
All things considered, HeHealth probably didn't mind closing it much either, because very, very few people had paid for an app that asked them to send intimate photos instead of simply going to the doctor. We think we live in the future, and we're still the same old country bumpkins.
An app that aimed to use artificial intelligence to detect STDs from users' intimate photos has been shut down. The app's unsubstantiated health claims and serious privacy concerns led to its closure following a Federal Trade Commission inquiry.
In a surprising development in the world of online dating, a controversial app that claimed to use artificial intelligence (AI) to detect sexually transmitted diseases (STDs) from users' intimate photos has been shut down. The app, which garnered significant attention for its unconventional approach, faced criticism and skepticism from various quarters before its abrupt closure [1].
The app, called Calmara, aimed to change online dating by incorporating AI technology to screen for STDs. The concept involved users submitting intimate photos, colloquially known as "dick pics," which would then be analyzed by an AI algorithm to detect potential signs of STDs [2].
This unique approach was marketed as a way to promote safer dating practices and increase transparency among users. However, it immediately raised concerns about privacy, consent, and the scientific validity of such a screening method.
The app's requirement for users to submit highly sensitive and intimate photos sparked intense debate about digital privacy and data security. Critics argued that storing such personal images, even for purported health screening purposes, posed significant risks in the event of a data breach or misuse [3].
Moreover, ethical questions were raised about the app's approach to sexual health. Many experts in the fields of medicine and public health expressed skepticism about the accuracy and reliability of detecting STDs through AI analysis of photographs, emphasizing that proper medical testing remains the only reliable method for STD diagnosis.
The controversial nature of the app, combined with a Federal Trade Commission inquiry into its unsubstantiated claims, led to its shutdown. Public backlash, the prospect of legal consequences, and persistent questions about the app's effectiveness all contributed to the decision [1].
The app's short-lived existence serves as a cautionary tale about the intersection of technology, privacy, and health in the digital age. It highlights the need for careful consideration of ethical implications and scientific validity when developing AI-powered solutions, especially in sensitive areas such as sexual health and online dating.
The Outpost is a comprehensive collection of curated artificial intelligence software tools that cater to the needs of small business owners, bloggers, artists, musicians, entrepreneurs, marketers, writers, and researchers.
© 2025 TheOutpost.AI All rights reserved