18 Sources
[1]
AI tool uses face photos to estimate biological age and predict cancer outcomes
"We can use artificial intelligence (AI) to estimate a person's biological age from face pictures, and our study shows that information can be clinically meaningful," said co-senior and corresponding author Hugo Aerts, PhD, director of the Artificial Intelligence in Medicine (AIM) program at Mass General Brigham. "This work demonstrates that a photo like a simple selfie contains important information that could help to inform clinical decision-making and care plans for patients and clinicians. How old someone looks compared to their chronological age really matters -- individuals with FaceAges that are younger than their chronological ages do significantly better after cancer therapy." When patients walk into exam rooms, their appearance may give physicians clues about their overall health and vitality. Those intuitive assessments combined with a patient's chronological age, in addition to many other biological measures, may help determine the best course of treatment. However, like anyone, physicians may have biases about a person's age that may influence them, fueling a need for more objective, predictive measures to inform care decisions. With that goal in mind, Mass General Brigham investigators leveraged deep learning and facial recognition technologies to train FaceAge. The tool was trained on 58,851 photos of presumed healthy individuals from public datasets. The team tested the algorithm in a cohort of 6,196 cancer patients from two centers, using photographs routinely taken at the start of radiotherapy treatment. Results showed that cancer patients appear significantly older than those without cancer, and their FaceAge, on average, was about five years older than their chronological age. In the cancer patient cohort, older FaceAge was associated with worse survival outcomes, especially in individuals who appeared older than 85, even after adjusting for chronological age, sex, and cancer type. Estimated survival time at the end of life is difficult to pin down but has important treatment implications in cancer care. The team asked 10 clinicians and researchers to predict short-term life expectancy from 100 photos of patients receiving palliative radiotherapy. While there was a wide range in their performance, overall, the clinicians' predictions were only slightly better than a coin flip, even after they were given clinical context, such as the patient's chronological age and cancer status. Yet when clinicians were also provided with the patient's FaceAge information, their predictions improved significantly. Further research is needed before this technology could be considered for use in a real-world clinical setting. The research team is testing this technology to predict diseases, general health status, and lifespan. Follow-up studies include expanding this work across different hospitals, looking at patients in different stages of cancer, tracking FaceAge estimates over time, and testing its accuracy against plastic surgery and makeup data sets. "This opens the door to a whole new realm of biomarker discovery from photographs, and its potential goes far beyond cancer care or predicting age," said co-senior author Ray Mak, MD, a faculty member in the AIM program at Mass General Brigham. "As we increasingly think of different chronic diseases as diseases of aging, it becomes even more important to be able to accurately predict an individual's aging trajectory. 
I hope we can ultimately use this technology as an early detection system in a variety of applications, within a strong regulatory and ethical framework, to help save lives."
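Illustrative sketch: the press release above describes training a deep learning model on 58,851 age-labelled face photographs and then applying it to photos of cancer patients. These excerpts do not reproduce the published FaceAge architecture, so the following is only a generic example of how an age-regression model of this kind can be set up in PyTorch; the ResNet-50 backbone, 224x224 input, L1 loss, and learning rate are assumptions for illustration, not the study's actual choices.

```python
# Generic age-regression sketch (assumed architecture, not the published FaceAge model):
# a CNN backbone with a single scalar output, trained to predict age in years from a
# face crop using an L1 (mean absolute error) loss.
import torch
import torch.nn as nn
from torchvision import models

class FaceAgeRegressor(nn.Module):
    def __init__(self):
        super().__init__()
        self.backbone = models.resnet50(weights=None)  # backbone choice is an assumption
        # Replace the 1000-class ImageNet head with a single regression output.
        self.backbone.fc = nn.Linear(self.backbone.fc.in_features, 1)

    def forward(self, x):                    # x: (batch, 3, 224, 224) normalized face crops
        return self.backbone(x).squeeze(1)   # (batch,) predicted ages in years

model = FaceAgeRegressor()
loss_fn = nn.L1Loss()                        # mean absolute error in years
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

def train_step(face_batch: torch.Tensor, age_batch: torch.Tensor) -> float:
    """One gradient step on a batch of (face photo, chronological age) pairs."""
    optimizer.zero_grad()
    predicted_age = model(face_batch)
    loss = loss_fn(predicted_age, age_batch.float())
    loss.backward()
    optimizer.step()
    return loss.item()

# Example call with random tensors standing in for a real photo/age batch:
dummy_loss = train_step(torch.randn(4, 3, 224, 224), torch.tensor([72.0, 65.0, 80.0, 61.0]))
```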
[2]
Scientists use AI facial analysis to predict cancer survival outcomes
Scientists have used artificial intelligence analysis of the faces of cancer patients to predict survival outcomes and in some cases outperform clinicians' short-term life expectancy forecasts. The researchers used a deep learning algorithm to measure the biological age of subjects and found that the features of cancer sufferers appeared on average about five years older than their chronological ages. The new technological tool, known as FaceAge, is part of a growing push to use estimates of ageing in bodily organs as so-called biomarkers of potential disease risks. Advances in AI have boosted these efforts because of AI's ability to learn from large health data sets and make risk projections based on them. The research showed the information derived from pictures of faces could be "clinically meaningful", said Hugo Aerts, co-senior author of a paper on the study published in Lancet Digital Health on Thursday. "This work demonstrates that a photo like a simple selfie contains important information that could help to inform clinical decision-making and care plans for patients and clinicians," said Aerts, director of AI in Medicine at Massachusetts-based Mass General Brigham. "How old someone looks compared to their chronological age really matters -- individuals with FaceAges that are younger than their chronological ages do significantly better after cancer therapy", he added. The scientists trained FaceAge on 58,851 photos of presumed healthy people from public data sets. They then tested the algorithm on 6,196 cancer patients, using photos taken at the start of radiotherapy. Among the cancer patients, the older the FaceAge, the worse the survival outcome, even after adjusting for chronological age, sex and cancer type. The effect was especially pronounced for people who appeared over 85. The scientists then asked 10 clinicians and researchers to predict whether patients receiving palliative radiotherapy for advanced cancers would be alive after six months. The human assessors were right about 61 per cent of the time when they had access only to a patient photo, but that improved to 80 per cent when they had FaceAge analysis too. Possible limitations of FaceAge include biases in the data and the potential for readings to reflect errors in the model rather than actual differences between chronological and biological age, the research team said. The scientists are now testing the technology on a wider range of patients, as well as assessing its ability to predict diseases, general health status and lifespan. The study of biomarkers for ageing is a subject of intense research activity. In February, scientists unveiled a simple blood test to detect how fast internal organs age and help flag increased risks for 30 diseases, including lung cancer. Face ageing is an area of growing interest, with scientists exploring various techniques. One is the concept of perceived ageing: in other words, how old a person looks to experienced healthcare professionals rather than how old they are biologically. Perceived ageing has emerged as a potential predictor of mortality and several age-related diseases, researchers say. The drawback is that generating the data by human observation is time-consuming and costly. The evaluation of FaceAge appeared to be "quite thorough", said Jaume Bacardit, a Newcastle University AI specialist who has done work applying the technology to perceived ageing. But there needed to be more explanation of how the AI technique worked, to check for potential distorting factors, he added. 
"That is, which parts of the face are they basing their predictions on?" Bacardit said. "This will help identify potential confounders that may go undetected otherwise."
[3]
AI tool uses facial images to predict biological age and cancer survival
Mass General Brigham | May 8, 2025. Eyes may be the window to the soul, but a person's biological age could be reflected in their facial characteristics. Investigators from Mass General Brigham developed a deep learning algorithm called FaceAge that uses a photo of a person's face to predict biological age and survival outcomes for patients with cancer. They found that patients with cancer, on average, had a higher FaceAge than those without and appeared about five years older than their chronological age. Older FaceAge predictions were associated with worse overall survival outcomes across multiple cancer types. They also found that FaceAge outperformed clinicians in predicting short-term life expectancies of patients receiving palliative radiotherapy. Their results are published in The Lancet Digital Health. "We can use artificial intelligence (AI) to estimate a person's biological age from face pictures, and our study shows that information can be clinically meaningful. This work demonstrates that a photo like a simple selfie contains important information that could help to inform clinical decision-making and care plans for patients and clinicians. How old someone looks compared to their chronological age really matters -- individuals with FaceAges that are younger than their chronological ages do significantly better after cancer therapy." -- Hugo Aerts, PhD, co-senior and corresponding author, director of the Artificial Intelligence in Medicine (AIM) program at Mass General Brigham. When patients walk into exam rooms, their appearance may give physicians clues about their overall health and vitality. Those intuitive assessments combined with a patient's chronological age, in addition to many other biological measures, may help determine the best course of treatment. However, like anyone, physicians may have biases about a person's age that may influence them, fueling a need for more objective, predictive measures to inform care decisions. With that goal in mind, Mass General Brigham investigators leveraged deep learning and facial recognition technologies to train FaceAge. The tool was trained on 58,851 photos of presumed healthy individuals from public datasets. The team tested the algorithm in a cohort of 6,196 cancer patients from two centers, using photographs routinely taken at the start of radiotherapy treatment. Results showed that cancer patients appear significantly older than those without cancer, and their FaceAge, on average, was about five years older than their chronological age. In the cancer patient cohort, older FaceAge was associated with worse survival outcomes, especially in individuals who appeared older than 85, even after adjusting for chronological age, sex, and cancer type. Estimated survival time at the end of life is difficult to pin down but has important treatment implications in cancer care. The team asked 10 clinicians and researchers to predict short-term life expectancy from 100 photos of patients receiving palliative radiotherapy. While there was a wide range in their performance, overall, the clinicians' predictions were only slightly better than a coin flip, even after they were given clinical context, such as the patient's chronological age and cancer status. Yet when clinicians were also provided with the patient's FaceAge information, their predictions improved significantly. Further research is needed before this technology could be considered for use in a real-world clinical setting. 
The research team is testing this technology to predict diseases, general health status, and lifespan. Follow-up studies include expanding this work across different hospitals, looking at patients in different stages of cancer, tracking FaceAge estimates over time, and testing its accuracy against plastic surgery and makeup data sets. "This opens the door to a whole new realm of biomarker discovery from photographs, and its potential goes far beyond cancer care or predicting age," said co-senior author Ray Mak, MD, a faculty member in the AIM program at Mass General Brigham. "As we increasingly think of different chronic diseases as diseases of aging, it becomes even more important to be able to accurately predict an individual's aging trajectory. I hope we can ultimately use this technology as an early detection system in a variety of applications, within a strong regulatory and ethical framework, to help save lives." Source: Mass General Brigham. Journal reference: Bontempi, D., et al. (2025) FaceAge, a deep learning system to estimate biological age from face photographs to improve prognostication: a model development and validation study. The Lancet Digital Health. doi.org/10.1016/j.landig.2025.03.002.
[4]
AI can tell how old your body really is and how quickly you're aging using just a selfie
A new AI model can deduce a person's biological age using a selfie. Could it be used to guide cancer treatment decisions? A new artificial intelligence (AI) model can predict a person's biological age -- the state of their body and how they're aging -- from a selfie. The model, dubbed FaceAge, estimates how old a person looks compared to their chronological age, or the amount of time that's passed since their birth. FaceAge's makers say their tool could help doctors decide on the best course of treatment for diseases like cancer. But one outside expert told Live Science that before it is used that way, follow-up data needs to show it actually improves treatment outcomes or quality of life. When a doctor is treating a cancer patient, "one of the first things they do is they try to assess how well the individual is doing," Hugo Aerts, director of the AI in Medicine Program at Mass General Brigham, said in a news briefing on May 7. "This is often a very subjective assessment, but it can influence a lot of future decisions" about their treatment, including how aggressive or intense their treatment plan should be, he added. For example, doctors may decide a patient who looks younger and more fit for their age may tolerate an aggressive treatment better and eventually live longer than a patient who looks older and more frail, even if the two have the same chronological age. FaceAge could make that decision easier by turning doctors' subjective estimates into a quantitative measure, the study authors wrote in the new study published May 8 in the journal Lancet Digital Health. By quantifying biological age, the model could offer another data point in helping doctors decide which treatment to recommend. Aerts and his colleagues trained the model on more than 58,000 photos of people ages 60 years and older who were assumed to be of average health for their age at the time the photo was taken. In this training set, the researchers had the model estimate chronological ages and assumed that the people's biological ages were similar, though the scientists noted that this assumption is not true in every case. The team then used FaceAge to predict the ages of more than 6,000 people with cancer. Cancer patients looked about five years older, on average, than their chronological ages, the team found. FaceAge's estimates also correlated with survival after treatment: The older a person looked, regardless of their chronological age, the lower their chances of living longer. By contrast, chronological age was not a good predictor of survival in cancer patients, the team found. FaceAge isn't ready for hospitals or physicians' offices yet. For one, the dataset used to train the model was pulled from IMDb and Wikipedia -- which may not represent the general population, and may also not account for factors like plastic surgery, lifestyle differences, or images that have been digitally retouched. Further studies with larger and more representative training sets are needed to understand how those factors impact FaceAge estimations, the authors said. And the researchers are still improving the algorithm with additional training data and testing its efficacy for other conditions besides cancer. They're also investigating what factors the model draws on to make its predictions. But once it's finalized, FaceAge could, for example, help doctors tailor the intensity of cancer treatments like radiation and chemotherapy to specific patients, study co-author Dr. 
Ray Mak, a radiation oncologist at Mass General Brigham, said during the briefing. A clinical trial for cancer patients, comparing FaceAge to more traditional measures of a patient's frailty, is starting soon, Mak added. Ethical guidelines surrounding how FaceAge information can be used, such as whether health insurance or life insurance providers could access FaceAge estimates to make coverage decisions, should be established before rolling out the model, the researchers said. "It is for sure something that needs attention, to assure that these technologies are used only for the benefit of the patient," Aerts said in the briefing. Doctors would also need to carefully consider when and how they use FaceAge in clinical settings, said Nicola White, a palliative care researcher at University College London who was not involved in the study. "When you're dealing with people, it's very different to dealing with statistics," White told Live Science. A long-term study assessing whether involving FaceAge in treatment decisions improved patients' quality of life is needed, she said. The researchers noted the AI tool wouldn't be making calls about treatment on its own. "It's not a replacement for clinician judgement," Mak said. But FaceAge could become part of a physician's toolkit for personalizing a treatment plan, "like having another vital sign data point."
[5]
AI Tool Reads Faces to Predict Health, Aging, and Cancer Outcomes - Neuroscience News
Summary: Researchers have developed an AI tool called FaceAge that uses facial photos to estimate biological age and predict survival outcomes in cancer patients. In a study involving over 6,000 patients, those with cancer had FaceAges about five years older than their chronological age, and higher FaceAges were linked to poorer survival. The tool outperformed clinicians in predicting short-term life expectancy for patients receiving palliative radiotherapy, especially when integrated into their decision-making. These findings suggest that facial features could serve as powerful, non-invasive biomarkers for aging and disease, opening new doors in precision medicine. Eyes may be the window to the soul, but a person's biological age could be reflected in their facial characteristics. Investigators from Mass General Brigham developed a deep learning algorithm called FaceAge that uses a photo of a person's face to predict biological age and survival outcomes for patients with cancer. They found that patients with cancer, on average, had a higher FaceAge than those without and appeared about five years older than their chronological age. Older FaceAge predictions were associated with worse overall survival outcomes across multiple cancer types. They also found that FaceAge outperformed clinicians in predicting short-term life expectancies of patients receiving palliative radiotherapy. Their results are published in The Lancet Digital Health. "We can use artificial intelligence (AI) to estimate a person's biological age from face pictures, and our study shows that information can be clinically meaningful," said co-senior and corresponding author Hugo Aerts, PhD, director of the Artificial Intelligence in Medicine (AIM) program at Mass General Brigham. "This work demonstrates that a photo like a simple selfie contains important information that could help to inform clinical decision-making and care plans for patients and clinicians. "How old someone looks compared to their chronological age really matters -- individuals with FaceAges that are younger than their chronological ages do significantly better after cancer therapy." When patients walk into exam rooms, their appearance may give physicians clues about their overall health and vitality. Those intuitive assessments combined with a patient's chronological age, in addition to many other biological measures, may help determine the best course of treatment. However, like anyone, physicians may have biases about a person's age that may influence them, fueling a need for more objective, predictive measures to inform care decisions. With that goal in mind, Mass General Brigham investigators leveraged deep learning and facial recognition technologies to train FaceAge. The tool was trained on 58,851 photos of presumed healthy individuals from public datasets. The team tested the algorithm in a cohort of 6,196 cancer patients from two centers, using photographs routinely taken at the start of radiotherapy treatment. Results showed that cancer patients appear significantly older than those without cancer, and their FaceAge, on average, was about five years older than their chronological age. In the cancer patient cohort, older FaceAge was associated with worse survival outcomes, especially in individuals who appeared older than 85, even after adjusting for chronological age, sex, and cancer type. Estimated survival time at the end of life is difficult to pin down but has important treatment implications in cancer care. 
The team asked 10 clinicians and researchers to predict short-term life expectancy from 100 photos of patients receiving palliative radiotherapy. While there was a wide range in their performance, overall, the clinicians' predictions were only slightly better than a coin flip, even after they were given clinical context, such as the patient's chronological age and cancer status. Yet when clinicians were also provided with the patient's FaceAge information, their predictions improved significantly. Further research is needed before this technology could be considered for use in a real-world clinical setting. The research team is testing this technology to predict diseases, general health status, and lifespan. Follow-up studies include expanding this work across different hospitals, looking at patients in different stages of cancer, tracking FaceAge estimates over time, and testing its accuracy against plastic surgery and makeup data sets. "This opens the door to a whole new realm of biomarker discovery from photographs, and its potential goes far beyond cancer care or predicting age," said co-senior author Ray Mak, MD, a faculty member in the AIM program at Mass General Brigham. "As we increasingly think of different chronic diseases as diseases of aging, it becomes even more important to be able to accurately predict an individual's aging trajectory. I hope we can ultimately use this technology as an early detection system in a variety of applications, within a strong regulatory and ethical framework, to help save lives." Authorship: Additional Mass General Brigham authors include Dennis Bontempi, Osbert Zalay, Danielle S. Bitterman, Fridolin Haugg, Jack M. Qian, Hannah Roberts, Subha Perni, Vasco Prudente, Suraj Pai, Christian Guthier, Tracy Balboni, Laura Warren, Monica Krishan, and Benjamin H. Kann. Disclosures: Mass General Brigham has filed provisional patents on two next-generation facial health algorithms. Funding: This project received financial support from the National Institutes of Health (HA: NIH-USA U24CA194354, NIH-USA U01CA190234, NIH-USA U01CA209414, and NIH-USA R35CA22052; BHK: NIH-USA K08DE030216-01), and the European Union - European Research Council (HA: 866504). Author: Ryan Jaslow. Source: Mass General. Contact: Ryan Jaslow - Mass General. Image: The image is credited to Neuroscience News. Original Research: Open access. "FaceAge, a deep learning system to estimate biological age from face photographs to improve prognostication: a model development and validation study" by Hugo Aerts et al., Lancet Digital Health. Abstract: FaceAge, a deep learning system to estimate biological age from face photographs to improve prognostication: a model development and validation study. As humans age at different rates, physical appearance can yield insights into biological age and physiological health more reliably than chronological age. In medicine, however, appearance is incorporated into medical judgements in a subjective and non-standardised way. In this study, we aimed to develop and validate FaceAge, a deep learning system to estimate biological age from easily obtainable and low-cost face photographs. FaceAge was trained on data from 58,851 presumed healthy individuals aged 60 years or older: 56,304 individuals from the IMDb-Wiki dataset (training) and 2547 from the UTKFace dataset (initial validation). 
Clinical utility was evaluated on data from 6196 patients with cancer diagnoses from two institutions in the Netherlands and the USA: the MAASTRO, Harvard Thoracic, and Harvard Palliative cohorts. FaceAge estimates in these cancer cohorts were compared with a non-cancerous reference cohort of 535 individuals. To assess the prognostic relevance of FaceAge, we performed Kaplan-Meier survival analysis and Cox modelling, adjusting for several clinical covariates. We also assessed the performance of FaceAge in patients with metastatic cancer receiving palliative treatment at the end of life by incorporating FaceAge into clinical prediction models. To evaluate whether FaceAge has the potential to be a biomarker for molecular ageing, we performed a gene-based analysis to assess its association with senescence genes. FaceAge showed significant independent prognostic performance in various cancer types and stages. Looking older was correlated with worse overall survival (after adjusting for covariates per-decade hazard ratio [HR] 1·151, p=0·013 in a pan-cancer cohort of n=4906; 1·148, p=0·011 in a thoracic cohort of n=573; and 1·117, p=0·021 in a palliative cohort of n=717). We found that, on average, patients with cancer looked older than their chronological age (mean increase of 4·79 years with respect to non-cancerous reference cohort, p<0·0001). We found that FaceAge can improve physicians' survival predictions in patients with incurable cancer receiving palliative treatments (from area under the curve 0·74 [95% CI 0·70-0·78] to 0·8 [0·76-0·83]; p<0·0001), highlighting the clinical use of the algorithm to support end-of-life decision making. FaceAge was also significantly associated with molecular mechanisms of senescence through gene analysis, whereas age was not. Our results suggest that a deep learning model can estimate biological age from face photographs and thereby enhance survival prediction in patients with cancer. Further research, including validation in larger cohorts, is needed to verify these findings in patients with cancer and to establish whether the findings extend to patients with other diseases. Subject to further testing and validation, approaches such as FaceAge could be used to translate a patient's visual appearance into objective, quantitative, and clinically valuable measures. Funding: US National Institutes of Health and EU European Research Council.
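Illustrative sketch: the abstract above describes Cox proportional-hazards modelling in which FaceAge remains prognostic (per-decade hazard ratios around 1.12-1.15) after adjusting for chronological age, sex, and cancer type. The example below shows what that kind of adjusted Cox fit can look like using the lifelines library on synthetic data; the column names, covariate coding, and data are invented for demonstration and are not the study's code or results.

```python
# Synthetic Cox proportional-hazards sketch (invented columns and data, not the study's
# code): testing whether a FaceAge-style covariate stays prognostic after adjusting for
# chronological age, sex, and a cancer-type indicator.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "survival_months": rng.exponential(24.0, n),    # follow-up time per patient
    "event": rng.integers(0, 2, n),                 # 1 = death observed, 0 = censored
    "chronological_age": rng.normal(65.0, 10.0, n),
    "faceage_decades": rng.normal(6.5, 1.0, n),     # FaceAge in decades, so exp(coef)
                                                    # reads as a per-decade hazard ratio
    "sex_male": rng.integers(0, 2, n),
    "cancer_type_thoracic": rng.integers(0, 2, n),  # toy one-hot cancer-type covariate
})

cph = CoxPHFitter()
cph.fit(df, duration_col="survival_months", event_col="event")
# exp(coef) for faceage_decades is the adjusted per-decade hazard ratio
# (the abstract reports values around 1.12-1.15 in the real cohorts).
print(cph.summary[["exp(coef)", "p"]])
```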
[6]
AI tool uses face photos to estimate biological age and predict cancer outcomes
Eyes may be the window to the soul, but a person's biological age could be reflected in their facial characteristics. Investigators from Mass General Brigham developed a deep learning algorithm called "FaceAge" that uses a photo of a person's face to predict biological age and survival outcomes for patients with cancer. They found that patients with cancer, on average, had a higher FaceAge than those without and appeared about five years older than their chronological age. Older FaceAge predictions were associated with worse overall survival outcomes across multiple cancer types. They also found that FaceAge outperformed clinicians in predicting short-term life expectancies of patients receiving palliative radiotherapy. Their results are published in The Lancet Digital Health. "We can use artificial intelligence (AI) to estimate a person's biological age from face pictures, and our study shows that information can be clinically meaningful," said co-senior and corresponding author Hugo Aerts, Ph.D., director of the Artificial Intelligence in Medicine (AIM) program at Mass General Brigham. "This work demonstrates that a photo like a simple selfie contains important information that could help to inform clinical decision-making and care plans for patients and clinicians. How old someone looks compared to their chronological age really matters -- individuals with FaceAges that are younger than their chronological ages do significantly better after cancer therapy." When patients walk into exam rooms, their appearance may give physicians clues about their overall health and vitality. Those intuitive assessments combined with a patient's chronological age, in addition to many other biological measures, may help determine the best course of treatment. However, like anyone, physicians may have biases about a person's age that may influence them, fueling a need for more objective, predictive measures to inform care decisions. With that goal in mind, Mass General Brigham investigators leveraged deep learning and facial recognition technologies to train FaceAge. The tool was trained on 58,851 photos of presumed healthy individuals from public datasets. The team tested the algorithm in a cohort of 6,196 cancer patients from two centers, using photographs routinely taken at the start of radiotherapy treatment. Results showed that cancer patients appear significantly older than those without cancer, and their FaceAge, on average, was about five years older than their chronological age. In the cancer patient cohort, older FaceAge was associated with worse survival outcomes, especially in individuals who appeared older than 85, even after adjusting for chronological age, sex, and cancer type. Estimated survival time at the end of life is difficult to pin down but has important treatment implications in cancer care. The team asked 10 clinicians and researchers to predict short-term life expectancy from 100 photos of patients receiving palliative radiotherapy. While there was a wide range in their performance, overall, the clinicians' predictions were only slightly better than a coin flip, even after they were given clinical context, such as the patient's chronological age and cancer status. Yet when clinicians were also provided with the patient's FaceAge information, their predictions improved significantly. Further research is needed before this technology could be considered for use in a real-world clinical setting. 
The research team is testing this technology to predict diseases, general health status, and lifespan. Follow-up studies include expanding this work across different hospitals, looking at patients in different stages of cancer, tracking FaceAge estimates over time, and testing its accuracy against plastic surgery and makeup data sets. "This opens the door to a whole new realm of biomarker discovery from photographs, and its potential goes far beyond cancer care or predicting age," said co-senior author Ray Mak, MD, a faculty member in the AIM program at Mass General Brigham. "As we increasingly think of different chronic diseases as diseases of aging, it becomes even more important to be able to accurately predict an individual's aging trajectory. I hope we can ultimately use this technology as an early detection system in a variety of applications, within a strong regulatory and ethical framework, to help save lives."
[7]
Scientists Are Developing a Tool to Measure Biological Age With a Photo
It's no secret that some people appear to age faster than others, especially after enduring stressful periods. But some scientists think a person's physical appearance could reveal more about them than meets the eye -- down to the health of their tissues and cells, a concept known as "biological age." In a new study, published Thursday in The Lancet Digital Health, researchers trained artificial intelligence to estimate the biological ages of adults with cancer by analyzing photos of their faces. Study participants with younger estimates tended to fare better after treatment than those deemed older by A.I., researchers at Mass General Brigham found. The findings suggest that people's biological age estimates are closely linked to their physical health, which could reflect their ability to survive certain treatments, the authors of the study said. And in the future, facial age analysis may become more useful than age alone in helping doctors make tough calls about their patients' treatment, they added. Face-based aging tools have "extraordinary potential" to help doctors quickly and inexpensively estimate how healthy their patients are, compared with existing tests, which use blood or saliva to measure chemical and molecular changes associated with aging, said William Mair, a professor of molecular metabolism at the Harvard T.H. Chan School of Public Health who was not involved in the study. While doctors usually visually estimate how healthy their patients are for their age, a tool like this could draw in much more data to make a better estimate, he added. FaceAge, the machine learning tool created by researchers at Mass General Brigham, found that study subjects with cancer appeared five years older than their chronological age. The biological age of people without cancer was typically close to their actual age. And those who were categorized as older were more likely to die, either from cancer or other causes. The researchers are not the first to find a link between facial and biological aging: A study in Denmark found that subjects who looked older than their chronological age tended to die earlier than their twins, and other studies have come to similar conclusions. FaceAge was trained on a database of more than 56,000 images of people age 60 and older, mostly sourced from Wikipedia and the movie database I.M.D.B. The researchers then asked it to assess the age of study participants, most of whom had cancer, using photographs alone. Doctors could one day use FaceAge to decide whether to provide different treatment depending on a patient's estimated biological age, said Dr. Raymond H. Mak, a radiation oncologist at Mass General Brigham who worked on the study. Toni Feather, a 69-year-old hairdresser and a cancer patient under Dr. Mak's care, was one of the study participants who looked younger than her chronological age. Mrs. Feather, who lives in Upton, Mass., said Dr. Mak explained that her appearance -- which was roughly 10 years younger than her age -- could reflect biological resilience, which may have helped her withstand grueling treatments. (Mrs. Feather has undergone several rounds of surgery, chemotherapy and radiation for lung cancer, but she continues to work once a week and regularly cares for her young grandson.) 
Preliminary data suggests that FaceAge goes beyond the visual markers of age we might look to, like wrinkles, gray hair or baldness, and instead flags less obvious factors like hollowing of the temples (which reflects a loss of muscle mass) and the prominence of the skin folds on either side of the mouth, Dr. Mak said. The authors of the study hope to eventually commercialize the technology and create a product that could be used in doctor's offices. They plan to file for a patent once the technology is more developed. The current version of the tool has limitations. It was primarily trained on white faces, Dr. Mak said, so it could work differently for people with different skin tones. And it isn't clear to what extent modifications like plastic surgery, makeup, lighting or the angle of the face could affect the results. And while biological aging can be accelerated by a number of factors, like stress, pregnancy, smoking, drinking alcohol and even extreme heat, some of these changes can be reversible -- and it's not clear if the tool would pick up those changes over time. Experts in medical ethics also have concerns. "I'd be very worried about whether this tool works equally well for all populations, for example women, older adults, racial and ethnic minorities, those with various disabilities, pregnant women and the like," said Jennifer E. Miller, the co-director of the program for biomedical ethics at Yale University. She and others in the field also wondered whether the tool might be used to justify denying insurance coverage or medical treatment. Dr. Mak and other researchers who worked on the study have had reservations, too. "We're really concerned about potential misuse of technology in general," he said. However, he added, the researchers felt the tool would be more helpful than harmful -- and it could be used to support, but not replace, clinicians' judgment. It's unclear whether FaceAge's results will be more accurate, more scalable or cheaper than the results from existing tools for estimating biological age, said Daniel Belsky, a Columbia University epidemiologist and associate professor who co-led the development of DunedinPACE, a widely used epigenetic clock. "There's a long way between where we are today and actually using these tools in a clinical setting," Dr. Belsky said.
[8]
AI tool uses selfies to predict biological age and cancer survival
Doctors often start exams with the so-called "eyeball test" -- a snap judgment about whether the patient appears older or younger than their age, which can influence key medical decisions. That intuitive assessment may soon get an AI upgrade. FaceAge, a deep learning algorithm described Thursday in The Lancet Digital Health, converts a simple headshot into a number that more accurately reflects a person's biological age rather than the birthday on their chart. Trained on tens of thousands of photographs, it pegged cancer patients on average as biologically five years older than healthy peers. The study's authors say it could help doctors decide who can safely tolerate punishing treatments, and who might fare better with a gentler approach. "We hypothesize that FaceAge could be used as a biomarker in cancer care to quantify a patient's biological age and help a doctor make these tough decisions," said co-senior author Raymond Mak, an oncologist at Mass General Brigham, a Harvard-affiliated health system in Boston. Consider two hypothetical patients: a spry 75-year-old whose biological age clocks in at 65, and a frail 60-year-old whose biology reads 70. Aggressive radiation might be appropriate for the former but risky for the latter. Growing evidence shows humans age at different rates, shaped by genes, stress, exercise, and habits like smoking or drinking. While pricey genetic tests can reveal how DNA wears over time, FaceAge promises insight using only a selfie. The model was trained on 58,851 portraits of presumed-healthy adults over 60, culled from public datasets. It was then tested on 6,196 cancer patients treated in the United States and the Netherlands, using photos snapped just before radiotherapy. Patients with malignancies looked on average 4.79 years older biologically than their chronological age. Among cancer patients, a higher FaceAge score strongly predicted worse survival -- even after accounting for actual age, sex, and tumor type -- and the hazard rose steeply for anyone whose biological reading tipped past 85. Intriguingly, FaceAge appears to weigh the signs of aging differently than humans do. For example, being gray-haired or balding matters less than subtle changes in facial muscle tone. FaceAge boosted doctors' accuracy, too. Eight physicians were asked to examine headshots of terminal cancer patients and guess who would die within six months. Their success rate barely beat chance; with FaceAge data in hand, predictions improved sharply. The model even affirmed a favorite internet meme, estimating actor Paul Rudd's biological age as 43 in a photo taken when he was 50.
Bias and ethics guardrails
AI tools have faced scrutiny for under-serving non-white people. Mak said preliminary checks revealed no significant racial bias in FaceAge's predictions, but the group is training a second-generation model on 20,000 patients. They're also probing how factors like makeup, cosmetic surgery or room lighting variations could fool the system. Ethics debates loom large. An AI that can read biological age from a selfie could prove a boon for clinicians, but also tempting for life insurers or employers seeking to gauge risk. "It is for sure something that needs attention, to assure that these technologies are used only in the benefit for the patient," said Hugo Aerts, the study's co-lead who directs MGB's AI in medicine program. Another dilemma: What happens when the mirror talks back? 
Learning that your body is biologically older than you thought may spur healthy changes -- or sow anxiety. The researchers are planning to open a public-facing FaceAge portal where people can upload their own pictures to enroll in a research study to further validate the algorithm. Commercial versions aimed at clinicians may follow, but only after more validation.
[9]
New AI tool predicts your biological age from a selfie
Our faces suggest our true age and even how much time we may have left on Earth. While doctors learn to form a picture of a patient's health from their face, using what they call "the eyeball test," new research in the Lancet Digital Health indicates that this may be a job that artificial intelligence can enhance in the future. Scientists at Mass General Brigham in Boston have developed and carried out some initial testing of an AI tool called FaceAge, an algorithm designed to tell patients' biological age from a photograph as simple as a selfie -- not how old they are in years, but how old they are in health. Biological age is considered crucial in helping doctors determine the most appropriate therapy, such as whether a cancer patient is healthy enough to tolerate an aggressive treatment. FaceAge requires more testing before doctors can begin using it routinely, but scientists said that in the next week or two, they expect to begin enrolling about 50 patients in a pilot study. Researchers said they trained FaceAge using about 59,000 photographs of people ages 60 and older who were presumed to be healthy. Most of the photos were publicly available on Wikipedia and the internet movie database IMDb, while some came from UTKFace, a large-scale dataset with pictures of people from less than a year old to 116 years old. The developers of FaceAge tested it on a group of 6,200 cancer patients, using photographs taken at the start of radiotherapy treatment. The algorithm determined that when it came to their health, cancer patients were, on average, about five years older than their chronological age. Moreover, the tool found that the older their faces looked, the worse their survival outlook. Scientists then conducted an experiment in which they asked eight doctors to determine whether or not terminal cancer patients would be alive in six months based first on the patient's photograph alone, then on the photograph and clinical information, and finally based on FaceAge and clinical information. "We found that doctors on average can predict life expectancy with an accuracy that's only a little better than a coin flip," when using a photo alone for their analysis, said Raymond Mak, a radiation oncologist at Mass General Brigham and one of the lead investigators for the study. When using a photo alone, the doctors were right about 61 percent of the time. Given photos and clinical information on the patients, they were right about 74 percent of the time. Provided with FaceAge and medical chart information, the doctors' accuracy reached 80 percent. In a news conference last week, Mak said he tested FaceAge using the photograph of a patient whom he first met four years ago, an 86-year-old man with terminal lung cancer. "Some doctors would hesitate to offer cancer treatment to someone in their late 80s or 90s with the rationale that the patient may die from other causes before the cancer progresses and becomes life-threatening," Mak said. "But he looked younger than 86 to me, and based on the eyeball test and a host of other factors, I decided to treat him with aggressive radiation therapy." Several years later, when Mak ran the man's photograph through FaceAge, "we found he's more than 10 years younger than his chronological age." The patient is now 90, "and still doing great," Mak said. Researchers stressed that FaceAge is not intended to replace a doctor's assessment, but rather to provide an objective measure to help fill out a clinical picture. 
"I think it's really important to know that different people age at different rates, and, as they're showing here, that clearly seems to have a major effect on their actual prognosis," said Gary Schwartz, a scientist at University Health Network's Princess Margaret Cancer Centre in Toronto, who was not involved in the research. Having a face with a lot of mileage on it, however, does not necessarily doom someone to an early grave. In a second demonstration of FaceAge, the developers had it analyze photographs of actors Paul Rudd and Wilford Brimley when each man was 50. The algorithm determined that Rudd's biological age was about 43; Brimley's was almost 69 (though he would live to 85). Irbaz Riaz, an assistant professor of medicine and senior associate consultant in the Department of AI and Informatics at Mayo Clinic, called FaceAge "a promising early-stage tool." While the tool does not replace a doctor's experience, said Riaz, who did not work on the study, "it could standardize the subtle visual assessments we make every day. That said, clinicians will need to understand how the model was trained, when it might be biased, and where it could add value without overstepping its role." Nasim Eftekhari, vice president of applied AI and advanced analytics at City of Hope cancer treatment and research center in Duarte, California, who did not participate in the study, called the tool "an incremental improvement," saying, "if this goes through validation and bias testing and approvals, and all of that, this could be an additional biomarker, at best," to go with cancer stage, characteristics of the tumor and other factors. Developers of FaceAge acknowledged that should the technology win approval from the Food and Drug Administration, ethical guidelines will need to be established to govern its use and access to its information. "This technology can do a lot of good, but it could also potentially do some harm," said Hugo Aerts, director of the Artificial Intelligence in Medicine (AIM) program at Mass General Brigham and another lead investigator on the FaceAge study. Aerts said hospitals have "very strong governance committees and regulatory guidelines that they have to adhere to [to] make sure these AI technologies are being used in the right way, really only for the benefit of the patients," and not for others, such as insurers. Privacy has been a key concern for earlier technologies that attempt to show how we might use age based on a photo, such as FaceApp. Although initial testing of FaceAge focused on cancer patients, the scientists plan also to measure its performance for other conditions. Aerts said that the tool still needs to be trained to deal with numerous variables that can affect a photograph of a face: lighting, makeup, skin tone and, of course, our attempts to look younger through plastic surgery. "So this is something that we are actively investigating and researching," Aerts said. "We're now testing in various datasets [to see] how we can make the algorithm robust against this." While the tool still has much to learn, we may have something to learn from it, too. "It is important to know that the algorithm looks at age differently than humans do," Aerts said. "So, for example, being bald or not, or being gray is less important in the algorithm than we actually initially thought." Mak said the scientists are still trying to figure out what features FaceAge focuses on in the photographs when it estimates biological age.
[10]
AI tool can analyze selfies to predict cancer risk - Earth.com
A new study shows that an algorithm can look at an ordinary photograph and estimate how fast a body is aging - an insight that could change the nature of cancer care. The research was led by a team of scientists from Mass General Brigham, who constructed FaceAge, an AI tool. The experts trained the tool on almost 60,000 images of healthy individuals and then tested it on more than 6,000 patients who were starting radiotherapy. The team found that the typical cancer patient appeared about five years older than their birth certificate suggested, and that each additional year shortened their life expectancy. FaceAge is a deep-learning network. It studies fine details - skin texture, muscle tone, and eye shape - then translates those patterns into a single number: your biological age. The process is automatic. If FaceAge receives a head-and-shoulders photo, it can predict an age that reflects the body's wear and tear better than the chronological age. How old a person seems has long guided doctors informally. Frail features can steer therapy toward gentler options; youthful vigor can justify aggressive treatment. Yet such judgments are subjective. FaceAge puts a more objective figure on that impression. In the study, patients whose biological age topped 85 fared worst, even after the authors adjusted for sex, tumor site, and chronological age. "We can use artificial intelligence (AI) to estimate a person's biological age from face pictures, and our study shows that information can be clinically meaningful," said Hugo Aerts, the director of the Artificial Intelligence in Medicine (AIM) program at Mass General Brigham. "This work demonstrates that a photo, like a simple selfie, contains important information that could help to inform clinical decision-making." "How old someone looks compared to their chronological age really matters - individuals with FaceAges that are younger than their chronological ages do significantly better after cancer therapy." Predicting how long a terminal patient has left is difficult. The researchers asked ten clinicians and scientists to view one hundred portraits of people who were receiving palliative radiotherapy, and to guess whether each person would be alive within months. Even when the panel knew chronological age and cancer type, their accuracy barely beat chance. Adding FaceAge shifted the odds. With the AI number in hand, the group's predictions improved markedly, suggesting FaceAge captures hidden signals that physicians miss. The investigators began with public image banks that held 58,851 faces, each tagged with an age. Those photos came from everyday contexts, so the network first learned to recognize normal aging. Next the team applied FaceAge to clinical snapshots taken during routine treatment setup. Linking those pictures to medical records let the algorithm discover how appearance, disease, and outcome connect. FaceAge still needs validation in larger and more diverse populations. The study cohort came from only two centers, and lighting or camera angles could skew results. Cosmetic surgery, heavy makeup, or cultural differences in skin care might also confuse the model. The team plans to follow patients over time to see whether the FaceAge number changes as therapy progresses. "This opens the door to a whole new realm of biomarker discovery from photographs, and its potential goes far beyond cancer care or predicting age," said co-senior author Ray Mak, a faculty member in the AIM program at Mass General Brigham. 
"As we increasingly think of different chronic diseases as diseases of aging, it becomes even more important to be able to accurately predict an individual's aging trajectory." "I hope we can ultimately use this technology as an early detection system in a variety of applications, within a strong regulatory and ethical framework, to help save lives." FaceAge illustrates how artificial intelligence can turn everyday data into medical guidance. A single selfie may soon complement blood tests and scans, giving oncologists a faster, less biased picture of a patient's resilience. If further trials confirm the findings, clinics could upload photos and receive instantaneous biological age estimates that fine-tune treatment plans. Aging underlies heart disease, diabetes, dementia, and more. An image-based biomarker could therefore aid many specialties by identifying people who need lifestyle changes or preventive therapy years before symptoms appear. The key will be strict oversight. Algorithms trained on limited datasets risk embedding bias, and patient consent will be vital when personal images feed predictive models. For now, FaceAge remains a research AI tool. Yet its promise is clear: the face you present to the camera may hold clues to how your body is coping with illness and time. Harnessed responsibly, that knowledge could guide decisions that extend life and improve its quality - one snapshot at a time. Like what you read? Subscribe to our newsletter for engaging articles, exclusive content, and the latest updates.
[11]
FaceAge: the AI tool that can tell your biological age through one photo
What if a simple selfie was enough to show scientifically how well or badly we're ageing? That moment's getting closer ... So, it will tell me when I'll die? No thanks. Wait, I haven't even explained it yet. Doesn't matter, it's still the most terrifying thing I've ever heard. No, give it a chance. FaceAge is only doing what doctors already do. Which is what? Visually assessing you to obtain a picture of your health. Oh, that doesn't sound so bad. But FaceAge can do it much more accurately, to the point that it can predict whether or not you'll survive treatment. No, I'm out again. I'll explain more. FaceAge is an AI tool created by scientists at Mass General Brigham in Boston. By looking at a photo of your face, it can determine your biological age as opposed to your chronological age. What does that mean? It means that everyone ages at different speeds. At the age of 50, for example, Paul Rudd had a biological age of 43, according to researchers. But at the same age, fellow actor Wilford Brimley had a biological age of 69. And why does this matter? People with older biological ages are less likely to tolerate an aggressive treatment such as radiotherapy. Repeat all that as if I'm an idiot. OK. The older your face looks, the worse things are for you. Great news for the prematurely grey, then. Actually, no. Things like grey hair and baldness are often red herrings. FaceAge can give a better picture of someone's health by assessing the skin folds on your mouth or the hollowing of your temples. Right, I'll just be off to obsessively scrutinise the state of my temples. No, this is a good thing. A diagnostic tool like this, used properly, could improve the quality of life of millions of people. Although the initial research was confined to cancer patients, scientists plan to test FaceAge with other conditions. I've recently had plastic surgery. Will FaceAge still work on me? Unsure, actually. The creators still need to check that. And what about people of colour? Ah, yes, about that. The model was primarily trained on white faces, so there's no real telling how well it can adapt to other skin tones. This is starting to sound dodgy. Just teething problems. Look how fast AI can improve. Last year, ChatGPT was a useless novelty. Now it's going to destroy almost every labour market on Earth. You'd have to assume that FaceAge will rapidly improve as well. That's reassuring. Yes. Before we know it, it'll be scanning your face and instantly making a chillingly objective judgment call on whether you deserve to live or die.
[12]
Selfies can be used to predict cancer patients' survival rate
Selfies could predict a person's chance of surviving cancer, a study has suggested. Doctors believe a new artificial intelligence tool that measures the "biological age" of a patient based on a photo of their face could inform the type of cancer treatment they receive. Knowing someone's biological age, rather than their actual age, is a better predictor of their overall health and life expectancy, a team from Mass General Brigham, a non-profit research group in the United States, claims. The FaceAge AI tool scans an image of a person's face to estimate their biological age, which is based on factors including lifestyle and genetics. It is akin to what doctors call an "eyeball test", in which doctors make judgements about overall health based on appearance, which in turn informs decisions about whether a person is strong or fit enough to undergo intensive cancer treatment. Researchers said they wanted to see whether they could "go beyond" the "subjective and manual" eyeball test by creating a "deep learning" AI tool that could assess "simple selfies". The new AI algorithm was trained using 59,000 photos. Dr Hugo Aerts, one of the authors, said it was the first study to show "we can really use AI to turn a selfie into a real biomarker source of ageing". He said the tool is low cost, can be used repeatedly over time and could be used to track an individual's biological age over "months, years and decades". "The impact can be very large, because we now have a way to actually very easily monitor a patient's health status continuously and this could help us to better predict the risk of death or complications after, say, for example, a major surgery or other treatments," he added. Explaining the tool, academics showed how it assessed the biological age of actors Paul Rudd and Wilford Brimley based on photographs of the men when they were both 50 years old. Mr Rudd's biological age was calculated to be 42.6, while Mr Brimley, who died in 2020, was assessed to have a biological age of 69.
[13]
Scientists Claim AI Can Tell Cancer Patients Their Odds of Living by Looking At Their Selfies
Some of us look old for our age, while others look younger. These differences, though, may not just be superficial. Our appearances, youthful or seasoned, could actually be an accurate reflection of what scientists call our "biological age," a form of measuring someone's age by the health of their body's cells, as opposed to counting their years since birth. Exploring this, a team of scientists at Mass General Brigham (MGB) have now developed an AI model that they claim can estimate the biological age of cancer patients simply by analyzing photos of their faces, the New York Times reports, in what could be a game-changing tool in cancer treatment. Huge questions remain, however, about the technology's reliability and its fraught ethical implications. All the same, the AI, dubbed FaceAge, has led to some intriguing findings. As detailed in a new study published in the journal Lancet Digital Health, the researchers found that participants whose faces were judged to be younger by the AI model tended to do better after cancer treatment than those who were judged to be older. Overall, the participants who were suffering from cancer appeared to be five years older than their chronological age, while non-sufferers exhibited a biological age closer to their chronological age. With cancer patients, the AI accurately predicted that those with an older biological age were more likely to die. Doctors could use the tool, the researchers hypothesize, to help determine the best-suited treatment. A spry 75-year-old with a biological age of 65, in an example given by Agence France-Presse, could benefit from aggressive radiation therapy. But that course might be too risky for another man with as many years but a higher biological age. The AI model was trained on nearly 59,000 portraits of adults over 60 taken from public data sets, including sources like IMDB and Wikipedia. Then, to cut its teeth, the researchers had the model estimate the age of the study's roughly 6,200 cancer patients. One of the most surprising finds was that the AI didn't rely as much on what we typically consider signs of aging, like baldness or wrinkles. It placed more stock in subtler clues like facial muscle tone, per the AFP. There's still a long way to go before FaceAge is ready to be used by doctors, with plenty of thorny questions that need addressing. The AI was trained on mostly white faces, which could lead to racial biases in its analyses. It also remains to be seen how makeup, plastic surgery, or even just a change in lighting could foil the AI's predictions. "I'd be very worried about whether this tool works equally well for all populations, for example women, older adults, racial and ethnic minorities, those with various disabilities, pregnant women and the like," Jennifer Miller, co-director of the program for biomedical ethics at Yale University, told the NYT. And in our age of invasive surveillance and dwindling privacy, having a tool that purportedly unearths some secret facet of your biology by scanning your face can feel like another intrusion on our autonomy. What if insurers use a model like FaceAge to justify denying health coverage? "It is for sure something that needs attention, to assure that these technologies are used only in the benefit for the patient," study co-lead author Hugo Aerts, director of MGB's AI in Medicine program, told the AFP.
[14]
AI tool uses selfies to predict biological age and cancer survival
Washington (AFP) - Doctors often start exams with the so-called "eyeball test" -- a snap judgment about whether the patient appears older or younger than their age, which can influence key medical decisions. That intuitive assessment may soon get an AI upgrade. FaceAge, a deep learning algorithm described Thursday in The Lancet Digital Health, converts a simple headshot into a number that more accurately reflects a person's biological age rather than the birthday on their chart. Trained on tens of thousands of photographs, it pegged cancer patients on average as biologically five years older than healthy peers. The study's authors say it could help doctors decide who can safely tolerate punishing treatments, and who might fare better with a gentler approach. "We hypothesize that FaceAge could be used as a biomarker in cancer care to quantify a patient's biological age and help a doctor make these tough decisions," said co-senior author Raymond Mak, an oncologist at Mass General Brigham, a Harvard-affiliated health system in Boston. Consider two hypothetical patients: a spry 75-year-old whose biological age clocks in at 65, and a frail 60-year-old whose biology reads 70. Aggressive radiation might be appropriate for the former but risky for the latter. The same logic could help guide decisions about heart surgery, hip replacements or end-of-life care. Sharper lens on frailty: Growing evidence shows humans age at different rates, shaped by genes, stress, exercise, and habits like smoking or drinking. While pricey genetic tests can reveal how DNA wears over time, FaceAge promises insight using only a selfie. The model was trained on 58,851 portraits of presumed-healthy adults over 60, culled from public datasets. It was then tested on 6,196 cancer patients treated in the United States and the Netherlands, using photos snapped just before radiotherapy. Patients with malignancies looked on average 4.79 years older biologically than their chronological age. Among cancer patients, a higher FaceAge score strongly predicted worse survival -- even after accounting for actual age, sex, and tumor type -- and the hazard rose steeply for anyone whose biological reading tipped past 85. Intriguingly, FaceAge appears to weigh the signs of aging differently than humans do. For example, being gray-haired or balding matters less than subtle changes in facial muscle tone. FaceAge boosted doctors' accuracy, too. Eight physicians were asked to examine headshots of terminal cancer patients and guess who would die within six months. Their success rate barely beat chance; with FaceAge data in hand, predictions improved sharply. The model even affirmed a favorite internet meme, estimating actor Paul Rudd's biological age as 43 in a photo taken when he was 50. Bias and ethics guardrails: AI tools have faced scrutiny for under-serving non-white people. Mak said preliminary checks revealed no significant racial bias in FaceAge's predictions, but the group is training a second-generation model on 20,000 patients. They're also probing how factors like makeup, cosmetic surgery or room lighting variations could fool the system. Ethics debates loom large. An AI that can read biological age from a selfie could prove a boon for clinicians, but also tempting for life insurers or employers seeking to gauge risk. "It is for sure something that needs attention, to assure that these technologies are used only in the benefit for the patient," said Hugo Aerts, the study's co-lead who directs MGB's AI in medicine program.
Another dilemma: What happens when the mirror talks back? Learning that your body is biologically older than you thought may spur healthy changes -- or sow anxiety. The researchers are planning to open a public-facing FaceAge portal where people can upload their own pictures to enroll in a research study to further validate the algorithm. Commercial versions aimed at clinicians may follow, but only after more validation.
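The adjusted survival analysis described in this piece, in which a higher FaceAge predicted death even after accounting for chronological age, sex and tumor type, is the kind of question a covariate-adjusted Cox proportional hazards model answers. The sketch below is illustrative only and is not the study's code: it fits such a model on an entirely synthetic cohort using the open-source lifelines library, and every column name and number in it is invented.

```python
# Illustrative only: a covariate-adjusted Cox proportional-hazards model of the
# kind described above, fitted on a SYNTHETIC cohort. Nothing here is study data;
# the column names (face_age, chrono_age, ...) are hypothetical.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 200

chrono_age = rng.integers(55, 90, size=n).astype(float)
face_age = chrono_age + rng.normal(3.0, 6.0, size=n)   # FaceAge drifts around calendar age
sex_male = rng.integers(0, 2, size=n)
tumor_thoracic = rng.integers(0, 2, size=n)            # crude stand-in for tumor type

# Simulate survival so that a higher FaceAge shortens time to death,
# then apply random censoring (patients still alive at last follow-up).
latent_months = rng.exponential(scale=36.0, size=n) * np.exp(-0.03 * (face_age - face_age.mean()))
censor_months = rng.exponential(scale=40.0, size=n)

df = pd.DataFrame({
    "face_age": face_age,
    "chrono_age": chrono_age,
    "sex_male": sex_male,
    "tumor_thoracic": tumor_thoracic,
    "months": np.minimum(latent_months, censor_months),
    "died": (latent_months <= censor_months).astype(int),
})

cph = CoxPHFitter()
cph.fit(df, duration_col="months", event_col="died")

# A positive coefficient on face_age means a higher FaceAge carries a higher
# hazard of death even with chronological age, sex and tumor type held fixed.
cph.print_summary()
```

The published hazard ratios were of course estimated on the real cohorts; the point of the sketch is only the shape of the adjustment, not its numbers.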
[15]
Scientists develop tool to predict biological age using just a selfie
The tool has some limitations, but could eventually be used to help predict health outcomes, the researchers said. It's no secret that people age at different rates, with stress, smoking, genetics, and other factors all making themselves plain on our faces. Now, a new tool powered by artificial intelligence (AI) may be able to tell how quickly you're ageing, using only a selfie - not to insult or flatter you, but to assess your health. For a new study published in The Lancet Digital Health journal, researchers used photos of nearly 59,000 faces to train an AI model to estimate people's biological ages, or their age based on their cellular health rather than their birth date. Then they tested the model, called "FaceAge", on about 6,200 cancer patients. On average, cancer patients looked about five years older than their actual ages, and they tended to have higher FaceAge readings than people without cancer, the study found. Notably, the model also helped doctors make better predictions about the short-term life expectancies of cancer patients receiving palliative care. Only the best-performing physicians' predictions matched FaceAge alone for accuracy. "How old someone looks compared to their chronological age really matters," Hugo Aerts, one of the study's authors and director of the AI in medicine programme at Mass General Brigham in the US, said in a statement. The researchers said that eventually, the tool could help doctors and cancer patients make decisions about end-of-life care - but that it could also be used to address a host of other health issues. Dr Ray Mak, one of the study authors and a cancer physician at Mass General Brigham, said FaceAge could someday be used as an "early detection system" for poor health. "As we increasingly think of different chronic diseases as diseases of ageing, it becomes even more important to be able to accurately predict an individual's ageing trajectory," Mak said in a statement. The tool has some limitations. It was primarily trained on white people, and it's not clear how factors that affect people's appearances - like lighting or make-up - could shape the results. The researchers are now expanding their work to include more hospitals and cancer patients at different stages of the disease, as well as testing FaceAge's accuracy against datasets with plastic surgery and make-up. Actually seeing a tool like FaceAge used in the doctor's office is a long way away. But Mak said it "opens the door to a whole new realm of biomarker discovery from photographs".
[16]
Can a photograph reveal your biological age?
It's no secret that some people appear to age faster than others, especially after enduring stressful periods. But some scientists think a person's physical appearance could reveal more about them than meets the eye -- down to the health of their tissues and cells, a concept known as "biological age." In a new study, published Thursday in The Lancet Digital Health, researchers trained artificial intelligence to estimate the biological ages of adults with cancer by analyzing photos of their faces. Study participants with younger estimates tended to fare better after treatment than those deemed older by AI, researchers at Mass General Brigham found. The findings suggest that people's biological age estimates are closely linked to their physical health, which could reflect their ability to survive certain treatments, the authors of the study said. And in the future, facial age analysis may become more useful than age alone in helping doctors make tough calls about their patients' treatment, they added. Face-based aging tools have "extraordinary potential" to help doctors quickly and inexpensively estimate how healthy their patients are, compared with existing tests, which use blood or saliva to measure chemical and molecular changes associated with aging, said William Mair, a professor of molecular metabolism at the Harvard T.H. Chan School of Public Health who was not involved in the study. While doctors usually visually estimate how healthy their patients are for their age, a tool like this could draw in much more data to make a better estimate, he added. FaceAge, the machine learning tool created by researchers at Mass General Brigham, found that study subjects with cancer appeared five years older than their chronological age. The biological age of people without cancer was typically close to their actual age. And those who were categorized as older were more likely to die, either from cancer or other causes. The researchers are not the first to find a link between facial and biological aging: A study in Denmark found that subjects who looked older than their chronological age tended to die earlier than their twins, and other studies have come to similar conclusions. FaceAge was trained on a database of more than 56,000 images of people age 60 and older, mostly sourced from Wikipedia and the movie database IMDB. The researchers then asked it to assess the age of study participants, most of whom had cancer, using photographs alone. Doctors could one day use FaceAge to decide whether to provide different treatment depending on a patient's estimated biological age, said Dr. Raymond H. Mak, a radiation oncologist at Mass General Brigham who worked on the study. Toni Feather, 69, a hairdresser and a cancer patient under Mak's care, was one of the study participants who looked younger than her chronological age. Feather, who lives in Upton, Massachusetts, said Mak explained that her appearance -- which was roughly 10 years younger than her age -- could reflect biological resilience, which may have helped her withstand grueling treatments. (Feather has undergone several rounds of surgery, chemotherapy and radiation for lung cancer, but she continues to work once a week and regularly cares for her young grandson.) 
Preliminary data suggests that FaceAge goes beyond the visual markers of age we might look to, like wrinkles, gray hair or baldness, and instead flags less obvious factors like hollowing of the temples (which reflects a loss of muscle mass) and the prominence of the skin folds on either side of the mouth, Mak said. The authors of the study hope to eventually commercialize the technology and create a product that could be used in doctor's offices. They plan to file for a patent once the technology is more developed. The current version of the tool has limitations. It was primarily trained on white faces, Mak said, so it could work differently for people with different skin tones. And it isn't clear to what extent modifications like plastic surgery, makeup, lighting or the angle of the face could affect the results. And while biological aging can be accelerated by a number of factors, such as stress, pregnancy, smoking, drinking alcohol and even extreme heat, some of these changes can be reversible -- and it's not clear if the tool would pick up those changes over time. Experts in medical ethics also have concerns. "I'd be very worried about whether this tool works equally well for all populations, for example women, older adults, racial and ethnic minorities, those with various disabilities, pregnant women and the like," said Jennifer E. Miller, the co-director of the program for biomedical ethics at Yale University. She and others in the field also wondered whether the tool might be used to justify denying insurance coverage or medical treatment. Mak and other researchers who worked on the study have had reservations, too. "We're really concerned about potential misuse of technology in general," he said. However, he added, the researchers felt the tool would be more helpful than harmful -- and it could be used to support, but not replace, clinicians' judgment. It's unclear whether FaceAge's results will be more accurate, more scalable or cheaper than the results from existing tools for estimating biological age, said Daniel Belsky, a Columbia University epidemiologist and associate professor who co-led the development of DunedinPACE, a widely used epigenetic clock. "There's a long way between where we are today and actually using these tools in a clinical setting," Belsky said.
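The headline comparison in this and the other reports, that cancer patients' FaceAge ran roughly five years above their chronological age while people without cancer sat close to theirs, boils down to comparing a "FaceAge gap" (estimate minus calendar age) between two groups. A minimal sketch of that comparison on invented numbers, not the study's data:

```python
# Minimal sketch of the group comparison implied above, on synthetic numbers.
# "Gap" = FaceAge estimate minus chronological age; a positive gap means the
# model judged the face older than the calendar age. All values are invented.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
gap_cancer = rng.normal(loc=5.0, scale=7.0, size=500)    # hypothetical cancer cohort
gap_control = rng.normal(loc=0.0, scale=7.0, size=500)   # hypothetical non-cancer cohort

t_stat, p_value = stats.ttest_ind(gap_cancer, gap_control, equal_var=False)

print(f"mean gap, cancer cohort:  {gap_cancer.mean():+.1f} years")
print(f"mean gap, control cohort: {gap_control.mean():+.1f} years")
print(f"Welch's t = {t_stat:.2f}, p = {p_value:.1e}")
```

Whether a gap like this carries clinical meaning is exactly what the survival analyses in the study were designed to test; the snippet only shows the arithmetic behind the "five years older" figure.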
[17]
New AI tool predicts your biological age from a selfie
Our faces suggest our true age and even how much time we may have left on Earth. While doctors learn to form a picture of a patient's health from their face, using what they call "the eyeball test," new research in the Lancet Digital Health indicates that this may be a job that artificial intelligence can enhance in the future. Scientists at Mass General Brigham in Boston have developed and carried out some initial testing of an AI tool called FaceAge, an algorithm designed to tell patients' biological age from a photograph as simple as a selfie -- not how old they are in years, but how old they are in health. Biological age is considered crucial in helping doctors determine the most appropriate therapy, such as whether a cancer patient is healthy enough to tolerate an aggressive treatment. FaceAge requires more testing before doctors can begin using it routinely, but scientists said that in the next week or two, they expect to begin enrolling about 50 patients in a pilot study. Researchers said they trained FaceAge using about 59,000 photographs of people ages 60 and older who were presumed to be healthy. Most of the photos were publicly available on Wikipedia and the internet movie database IMDb, while some came from UTKFace, a large-scale dataset with pictures of people from less than a year old to 116 years old. The developers of FaceAge tested it on a group of 6,200 cancer patients, using photographs taken at the start of radiotherapy treatment. The algorithm determined that when it came to their health, cancer patients were, on average, about five years older than their chronological age. Moreover, the tool found that the older their faces looked, the worse their survival outlook. Scientists then conducted an experiment in which they asked eight doctors to determine whether terminal cancer patients would be alive in six months based first on the patient's photograph alone, then on the photograph and clinical information, and finally based on FaceAge and clinical information. "We found that doctors on average can predict life expectancy with an accuracy that's only a little better than a coin flip," when using a photo alone for their analysis, said Raymond Mak, a radiation oncologist at Mass General Brigham and one of the lead investigators for the study. When using a photo alone, the doctors were right about 61 percent of the time. Given photos and clinical information on the patients, they were right about 74 percent of the time. Provided with FaceAge and medical chart information, the doctors' accuracy reached 80 percent. In a news conference last week, Mak said he tested FaceAge using the photograph of a patient whom he first met four years ago, an 86-year-old man with terminal lung cancer. "Some doctors would hesitate to offer cancer treatment to someone in their late 80s or 90s with the rationale that the patient may die of other causes before the cancer progresses and becomes life-threatening," Mak said. "But he looked younger than 86 to me, and based on the eyeball test and a host of other factors, I decided to treat him with aggressive radiation therapy." Several years later, when Mak ran the man's photograph through FaceAge, "we found he's more than 10 years younger than his chronological age." The patient is now 90, "and still doing great," Mak said. Researchers stressed that FaceAge is not intended to replace a doctor's assessment, but rather to provide an objective measure to help fill out a clinical picture. 
"I think it's really important to know that different people age at different rates, and, as they're showing here, that clearly seems to have a major effect on their actual prognosis," said Gary Schwartz, a scientist at University Health Network's Princess Margaret Cancer Center in Toronto, who was not involved in the research. Having a face with a lot of mileage on it, however, does not necessarily doom someone to an early grave. In a second demonstration of FaceAge, the developers had it analyze photographs of actors Paul Rudd and Wilford Brimley when each man was 50. The algorithm determined that Rudd's biological age was about 43; Brimley's was almost 69 (though he would live to 85). Irbaz Riaz, an assistant professor of medicine and senior associate consultant in the department of AI and informatics at Mayo Clinic, called FaceAge "a promising early-stage tool." While the tool does not replace a doctor's experience, said Riaz, who did not work on the study, "it could standardize the subtle visual assessments we make every day. That said, clinicians will need to understand how the model was trained, when it might be biased and where it could add value without overstepping its role." Nasim Eftekhari, vice president of applied AI and advanced analytics at City of Hope cancer treatment and research center in Duarte, California, who did not participate in the study, called the tool "an incremental improvement," saying, "if this goes through validation and bias testing and approvals, and all of that, this could be an additional biomarker, at best," to go with cancer stage, characteristics of the tumor and other factors. Developers of FaceAge acknowledged that should the technology be approved by the Food and Drug Administration, ethical guidelines will need to be established to govern its use and access to its information. "This technology can do a lot of good, but it could also potentially do some harm," said Hugo Aerts, director of the Artificial Intelligence in Medicine program at Mass General Brigham and another lead investigator on the FaceAge study. Aerts said hospitals have "very strong governance committees and regulatory guidelines that they have to adhere to [to] make sure these AI technologies are being used in the right way, really only for the benefit of the patients" and not for others, such as insurers. Privacy has been a key concern for earlier technologies that attempt to show how we might use age based on a photo, such as FaceApp. Although initial testing of FaceAge focused on cancer patients, the scientists plan also to measure its performance for other conditions. Aerts said that the tool still needs to be trained to deal with numerous variables that can affect a photograph of a face: lighting, makeup, skin tone and, of course, our attempts to look younger through plastic surgery. "So this is something that we are actively investigating and researching," Aerts said. "We're now testing in various datasets [to see] how we can make the algorithm robust against this." While the tool still has much to learn, we may have something to learn from it, too. "It is important to know that the algorithm looks at age differently than humans do," Aerts said. "So, for example, being bald or not, or being gray is less important in the algorithm than we actually initially thought." Mak said the scientists are still trying to figure out what features FaceAge focuses on in the photographs when it estimates biological age.
[18]
AI System Can Predict Cancer Survival Prognosis Better Than Doctors, Researchers Say
FaceAge improved doctors' accuracy in predicting six-month survival for terminally ill patients. It is often a heart-stopping moment for patients when they hear their doctor's diagnosis that they have cancer. For late-stage cancers, especially, the question that often arises is, "How long do I have to live?" Doctors typically rely on experience and medical tests to make their best educated guess. Depending on the prediction, a series of treatments is recommended. Hospital researchers affiliated with Harvard Medical School are now working to take some of the guesswork out of this crucial prognosis. They developed FaceAge, an artificial intelligence system that analyzes a photograph to more accurately estimate a patient's biological age instead of their chronological age. For example, a healthy 75-year-old person may have the physiological traits of someone 60 years old. "We found that, on average, patients with cancer look approximately five years older than their chronological age and have a statistically higher FaceAge compared with clinical cohorts of patients without cancer who are treated for conditions that are benign or precancerous," the researchers wrote in their paper, which was published in The Lancet Digital Health. When the body's true age is assessed correctly, survival predictions become more accurate, which in turn informs what treatments to give patients, among other measures of medical care. "We showed that survival prediction performance of clinicians improved when FaceAge risk model predictions were made available," the paper said. FaceAge, which was developed using deep learning techniques, showed that patients who look older than their actual age are more likely to have worse outcomes, even after controlling for traditional clinical risk factors. "Looking older was correlated with worse overall survival...," according to the authors, who included Canadian and European researchers. For patients, FaceAge represents a future in which one photograph could provide personalized insights into health, risk and treatment decisions that can accompany lab tests and medical scans. For healthcare providers, FaceAge can complement their clinical judgment, which is especially crucial when treating seriously ill patients. This is particularly relevant for cancer, where the narrow window of survival often forces doctors to make difficult decisions about aggressive treatments based on their own prognostic judgment. While AI is increasingly being used in medical settings, it cannot replace the crucial role that physicians and other caregivers play, healthcare experts told PYMNTS. However, AI tools can be an important complement to ensure the patient gets a seamless digital experience, according to the PYMNTS Intelligence report "The Digital Healthcare Gap: Streamlining The Patient Journey." The FaceAge AI model was trained on nearly 59,000 images of healthy individuals aged 60 or older and tested on 6,200 cancer patients from the United States and the Netherlands. Using a two-stage neural network system, the algorithm detects a face in a photo, extracts key features and generates an estimated biological age, per the paper. The tool was better at predicting the length of survival than looking at the patients' chronological age among three groups: those receiving curative radiotherapy, those with thoracic cancers, and those receiving palliative care for metastatic disease. Patients with cancer had a FaceAge that was five years older on average than their actual age, a statistically significant gap.
FaceAge also improved survival predictions for terminally ill patients. When used alongside the TEACHH clinical model, a tool used to estimate life expectancy in patients undergoing palliative radiotherapy, FaceAge boosted the model's predictive accuracy, the paper said. Physicians also performed better at predicting six-month survival when aided by FaceAge, according to the paper. This could have a major impact on treatment decisions, helping clinicians weigh the pros and cons of therapy in patients nearing the end of life. Despite its promise, the system raises important ethical considerations. The researchers acknowledged risks, including potential misuse by insurers or advertisers and racial or socioeconomic bias in the model. Although FaceAge showed minimal bias across ethnic groups in preliminary testing, the researchers called for further validation using more diverse datasets and careful regulatory oversight, saying in the paper that "further assessments of bias in performance across different populations will be essential." While FaceAge is not yet ready for routine clinical use, its success marks a step toward integrating AI-based biomarkers into healthcare. It suggests that something as simple as a patient's face may soon hold the key to more precise, humane and personalized care. Going forward, the researchers said in the paper that testing with larger groups and further research are needed to "establish whether the findings extend to patients with other diseases."
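The "two-stage neural network system" described in this piece, detect and crop the face, then regress an estimated age from the crop, maps onto a fairly conventional deep-learning pipeline. The sketch below is an assumption-laden illustration rather than the published FaceAge model: the face-detection stage is stubbed out with a centre crop, and the regression head sits on a stock, untrained ResNet-50 backbone.

```python
# Illustrative two-stage pipeline of the kind described above, NOT the published
# FaceAge model: stage 1 (face detection/cropping) is stubbed out, and stage 2 is
# a generic CNN backbone with a single continuous output for the estimated age.
import torch
import torch.nn as nn
from torchvision import models, transforms
from PIL import Image

def crop_face(img: Image.Image) -> Image.Image:
    """Stage 1 placeholder. A real system would run a face detector here
    (MTCNN, RetinaFace, etc.); this stub simply centre-crops the image."""
    side = min(img.size)
    return transforms.CenterCrop(side)(img)

class AgeRegressor(nn.Module):
    """Stage 2: CNN backbone ending in one continuous output (estimated age)."""
    def __init__(self):
        super().__init__()
        self.backbone = models.resnet50()  # random weights; training is out of scope here
        self.backbone.fc = nn.Linear(self.backbone.fc.in_features, 1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.backbone(x).squeeze(-1)

preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
])

def estimate_age(img: Image.Image, model: AgeRegressor) -> float:
    face = crop_face(img)
    batch = preprocess(face).unsqueeze(0)  # shape (1, 3, 224, 224)
    with torch.no_grad():
        return model(batch).item()

if __name__ == "__main__":
    model = AgeRegressor().eval()
    dummy = Image.new("RGB", (300, 400))   # stand-in for a selfie
    print(f"estimated age (untrained, meaningless): {estimate_age(dummy, model):.1f}")
```

In practice the backbone would first be trained with a regression loss on a large labelled photo set (the coverage here cites roughly 59,000 images) before being applied to patient photographs.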
Researchers at Mass General Brigham have developed an innovative AI tool called FaceAge that can estimate a person's biological age from facial photographs. This deep learning algorithm has shown promising results in predicting survival outcomes for cancer patients, potentially revolutionizing clinical decision-making [1][2].
FaceAge was trained on 58,851 photos of presumed healthy individuals from public datasets. The algorithm leverages deep learning and facial recognition technologies to analyze facial features and estimate biological age [1][3].
The tool could potentially help physicians make more informed decisions about treatment plans for cancer patients. By providing an objective measure of biological age, FaceAge may assist in tailoring the intensity of treatments like radiation and chemotherapy to individual patients [4][5].
While promising, FaceAge is not yet ready for clinical use. The researchers acknowledge several limitations: the model was trained primarily on white faces, so its performance across other skin tones and populations is unverified; it is not yet clear how make-up, plastic surgery, lighting or camera angle affect its estimates; and further validation in larger, more diverse cohorts is needed before the findings can be generalized.
The research team is conducting follow-up studies to expand the work across different hospitals, examine patients at various cancer stages, and track FaceAge estimates over time [1][3].
The researchers emphasize the need for ethical guidelines surrounding the use of FaceAge information. Concerns include potential misuse by health or life insurance providers in making coverage decisions [4][5].
Beyond cancer care, the technology shows potential for predicting diseases, general health status, and lifespan. Researchers hope to eventually use this technology as an early detection system for various health conditions, within a strong regulatory and ethical framework [1][3][5].
As this technology continues to develop, it could open new doors in precision medicine, offering a non-invasive method to assess biological age and health status. However, further research and careful consideration of ethical implications will be crucial before implementing such tools in clinical settings.