The Outpost is a comprehensive collection of curated artificial intelligence software tools that cater to the needs of small business owners, bloggers, artists, musicians, entrepreneurs, marketers, writers, and researchers.
© 2025 TheOutpost.AI All rights reserved
Curated by THEOUTPOST
On Fri, 4 Oct, 12:02 AM UTC
3 Sources
[1]
AI Enhances Early Detection of Health Issues - Neuroscience News
Summary: Penn AInSights, an AI-based imaging system, enhances radiology by creating precise 3D views of internal organs, enabling early detection of health issues like fatty liver disease and diabetes. By analyzing 2,000 scans per month, it helps clinicians screen for conditions beyond their primary focus. The system's seamless integration into existing workflows and its cost-effectiveness make it a game changer in healthcare, potentially extending lives through early intervention.

Our imagination for artificial intelligence is expansive and ambitious. While there are plenty of dystopian tropes, pop culture is full of hopeful examples of what we believe artificial intelligence could bring to us, ranging from operating systems that cure loneliness to assistants that push the limits of humans' physiological capabilities. Maybe the most famous fictional AI is Roy Batty -- an impossibly strong soldier android (or "replicant" in the "Blade Runner" films). What has made the character enduring is less about his superhuman capabilities, and more about what hewed him closer to our most core desire: to live, and to live longer. At present, many consumer AI tools fall short of the potential imagined in science fiction. In the medical realm, though, an AI program now widely used at Penn Medicine can give us some of the life-sustaining help that Batty wanted most. Recently, Penn AInSights, an AI-guided imaging system that helps create a more precise, three-dimensional view of internal organs, was named a CIO 100 winner for its work in the field of radiology. The program, at its root, is a clinical support tool for physicians, allowing them to look at images of people's livers, spleens, kidneys, and more to determine with some exactitude if the organs are showing any abnormal traits that could shorten lives.
With this precise knowledge of whether a patient has developed something like fatty liver disease, or is showing warning signs of diabetes, or that their kidneys may fail in time, Penn Medicine's clinical staff can take steps to help patients sooner and more effectively than ever before, potentially adding years to their lives. "When you look at the liver you say, 'Okay, is this normal?'" said Charles Kahn, MD, MS, a professor of Radiology in the Perelman School of Medicine at the University of Pennsylvania. "You eyeball it and use some measurements to say whether it's big or small. It's kind of like when you look at someone and think whether they could play basketball or be a jockey at the Preakness. But, sometimes, it isn't as easy as that." From the very first X-rays, radiologic imaging has necessarily involved reconciling two-dimensional images with the space a body part takes up in physical reality. Measurements of length and width may not be perfect indicators of "big" or "small" on three-dimensional objects. "If your spleen is longer than 13 centimeters, that's considered big," said Kahn. "So if you've got a spleen the shape of a hot dog that's 15 centimeters long, that would be called enlarged. But the actual volume would make it small." That's where AInSights shows its value. Trained on thousands of images, the program can quickly analyze a huge amount of imaging and effectively build a digital 3D model of organs. From there, it can flag potential issues, right within the technological tools that clinicians already use every day. Built and thoroughly tested at Penn Medicine before it was brought to clinicians for fine-tuning to serve patient populations, AInSights was first developed with images gathered through researchers, and built in partnership with Penn Medicine IS teams well-versed in how doctors use technology to diagnose and treat patients. 
"The value of this, really, is about having an end-to-end pipeline of development, one of the successful times that Information Services, research, and clinical teams came together," said Ameena Elahi, MPA, RT(R), CIIP, an application manager in Penn Medicine Information Services (IS). AInSights has been particularly effective because of the holes in other products. "You look around and there are countless vendors selling AI solutions, with the vast majority in Radiology," said Walter Witschey, PhD, an associate professor of Radiology, who helped build the AInSights program and has done research with it. "But despite that number and the huge interest in AI in medicine, they really haven't been adopted by hospitals in a huge way because there were simple problems of integration that were being overlooked by the vendors." The regular clinical pathway for interpreting radiological images has been for the images to be taken, manually reviewed by radiologists, then put into a report. AInSights was built as a supportive tool to sit, invisibly, atop that infrastructure and seamlessly integrate itself into the process -- while improving it. "The model looks at the images, generates AI annotations and quantifies the traits of what it's looking at -- that's given to the radiologist, all automatically," Witschey said. The process to build this has taken years, but the program has gotten exponentially better and is now used at Penn Medicine to analyze roughly 2,000 scans of the abdomen or chest a month. "We started off testing it and it was literally an hour to get the final product," Elahi remembered. "Then, quickly, we got it down to about 10 minutes. And, since, we've gotten it down a lot further so it is really clinically convenient." A Journal of Imaging Informatics in Medicine paper that came out in July (co-authored by Elahi, Kahn, Witschey, and others) showed that the "turnaround time" for CT scans of the abdomen was just 2.8 minutes.
With this type of technology, clinicians are able to do some "opportunistic screening," said Kahn. Someone doing a CT scan to monitor a kidney condition can then also have their liver, spleen, pancreas, and the bottoms of their lungs screened for any extra issues. Human radiologists would obviously focus mainly on the kidney, but the AI could flag anything of note elsewhere. "There's a lot of information in the 400 to 500 images you end up looking at," Kahn said. "Some of these things are not detectable to the unaided eye, so having these tools really plays into that." This is allowing clinicians to get on top of conditions that could progress nearly invisibly until they become a serious problem. For instance, Witschey said that the program, by scanning pixel-level image data, can find patterns among imaging features such as the fattiness of the liver and create a predictive measure for whether someone has diabetes without needing the help of a typical diabetic panel of testing. This makes it easier to recommend follow-up diagnostic testing for those patients. Additionally, a program taking AInSights out of the abdomen and to the brain is working to assist radiologists in searching for dementia, such as in Alzheimer's disease, which is challenging because of how subtle the changes in imaging can be. "We can measure the size of the various parts of the brain and compare them to a large database of people who have normal brain imaging to see what parts of the brain have changed and how severe brain volume loss might be," said Ilya Nasrallah, MD, PhD, an associate professor of Radiology, who leads implementation of AI tools in the department. "We anticipate [this] will add confidence to our assessment process in dementia screening and inform management of the condition." Of special importance to AInSights is the Penn Medicine BioBank. More than 40,000 people have had whole genome sequencing data stored in the biobank, and tens of thousands have imaging attached to it, Kahn said.
"That really helps us tease out what we call 'imaging phenotypes,' which we can work to connect with information about a person's genetics," Kahn said. All of this is toward creating a system that can quickly and easily help clinicians decide "what's normal, and what's not" and then decide the most effective course of action. "We want to develop simple things like a 'nomogram' for spleen volume, which would allow us to look at our patient population and say, 'This spleen is normal for a 33-year-old woman, but for a 70-year-old male patient, that's not right.'" Eventually, the hope is that AInSights can be used for the whole suite of imaging done at Penn Medicine, including for cancer, neuromuscular degeneration, and cardiovascular conditions. As it stands right now, AInSights is extremely cost-effective to run: less than a dollar per patient, and just about $700 per month at this large health system that sees a high volume of patients. That makes it appealing even beyond Penn Medicine. "We have had conversations about getting this into developing countries where AI support would be incredibly valuable," Witschey said. Having such a powerful system also allows for greater public health applications. Kahn said they're planning on looking at the distribution of "unrecognized kidney disease by zip codes" to better map out underdiagnoses by the social determinants of health. It's a step further in rooting out what was once unseen. At one point in "Blade Runner," Roy Batty encounters the man who made his artificial eyes. Batty's character is deeply motivated by the scope and wonder of what he has seen, far exceeding what most, if not all, natural, human eyes could see. He says to the manufacturer, "If only you could see what I've seen with your eyes."
[2]
AI-guided imaging system improves radiology precision
by Perelman School of Medicine at the University of Pennsylvania
[3]
From replicant's dream to reality: Imaging AI exte | Newswise
Penn Medicine's AI-powered imaging system, AInSights, enhances radiological precision by creating 3D organ models, enabling early detection of health issues and potentially extending patients' lives.
Penn Medicine has introduced a groundbreaking AI-guided imaging system called Penn AInSights, which is revolutionizing the field of radiology and early disease detection. This innovative technology, recently named a CIO 100 winner, creates precise three-dimensional views of internal organs, enabling clinicians to identify potential health issues with unprecedented accuracy [1][2][3].
AInSights utilizes artificial intelligence to analyze thousands of medical images, effectively building digital 3D models of organs. This advanced system can quickly process a vast amount of imaging data, flagging potential issues within the existing technological framework used by clinicians [1][2][3].
Dr. Charles Kahn, a professor of Radiology at the University of Pennsylvania, explains the system's advantage: "When you look at the liver, you say, 'Okay, is this normal?' You eyeball it and use some measurements to say whether it's big or small. But sometimes, it isn't as easy as that" [1][2][3].
The AI-powered system addresses limitations in traditional radiological assessments, which often rely on two-dimensional measurements to evaluate three-dimensional organs. AInSights provides a more comprehensive analysis, potentially identifying issues that might be missed by conventional methods [1][2][3].
According to a recent study published in the Journal of Imaging Informatics in Medicine, AInSights has significantly reduced the turnaround time for CT scans of the abdomen to just 2.8 minutes [1][2][3].
Unlike many other AI solutions in radiology, AInSights has been successfully integrated into existing clinical workflows. Walter Witschey, PhD, an associate professor of Radiology involved in developing the program, notes: "The model looks at the images, generates AI annotations and quantifies the traits of what it's looking at -- that's given to the radiologist, all automatically" [1][2][3].
AInSights enables "opportunistic screening," allowing clinicians to examine multiple organs during a single scan. For instance, a CT scan primarily focused on monitoring a kidney condition can also screen the liver, spleen, pancreas, and lungs for potential issues [1][2][3].
By providing early detection of health issues such as fatty liver disease, diabetes warning signs, and potential kidney failure, AInSights empowers Penn Medicine's clinical staff to intervene sooner and more effectively. This proactive approach has the potential to add years to patients' lives [1][2][3].
The system currently analyzes approximately 2,000 scans of the abdomen or chest per month at Penn Medicine, demonstrating its scalability and real-world impact [1][2][3].
As AI continues to advance in the medical field, tools like AInSights are bringing us closer to the life-extending capabilities once relegated to science fiction, potentially realizing the dream of longer, healthier lives through early intervention and precise diagnosis.