Curated by THEOUTPOST
On Fri, 6 Dec, 12:02 AM UTC
2 Sources
[1]
Taming Big Data and Particle Beams: How SLAC Researchers Are Pushing AI to the Edge
This is the first of a two-part series exploring a sampling of ways artificial intelligence helps researchers from around the world perform cutting-edge science with the lab's state-of-the-art facilities and instruments. Read part two here.

Every day, researchers at the Department of Energy's SLAC National Accelerator Laboratory tackle some of the biggest questions in science and technology - from laying the foundations for new drugs to developing new battery materials and solving the big data challenges of particle physics and cosmology. To get a hand with that work, they are increasingly turning to artificial intelligence.

"AI will help accelerate our science and technology further," said Ryan Coffee, a SLAC senior scientist. "I am really excited about that."

Fine-tuning particle beams for studying speedy atoms and molecules

Understanding the structure and behavior of atoms and molecules for materials or biological applications requires sophisticated X-ray and ultrafast instruments and machines, such as SLAC's Linac Coherent Light Source (LCLS), Stanford Synchrotron Radiation Lightsource (SSRL) and the Megaelectronvolt Ultrafast Electron Diffraction (MeV-UED) instrument, that can reveal nature at the smallest and fastest scales through, for example, molecular movies. These scientific endeavors, however, require finely tuned machines and create massive volumes of complex data at ultrafast rates. SLAC researchers are turning these challenges into an opportunity to drive and lead a new era of machine learning tools that optimize these facilities, experiments and data management.

Particle accelerators are the backbone of SLAC's X-ray and ultrafast facilities, creating unprecedented opportunities for the large global research community. One challenge is quickly tuning the electron beam that generates the X-rays for the unique requirements of each experiment. Experienced operators must consider and adjust hundreds of parameters with limited information, so it can be hard to see exactly how the adjustments are shaping the beam and to determine what to try next. Machine learning tools make this process easier, so researchers can spend less time tuning.

"You want a more cohesive picture of the beam when tuning - the ability to flexibly, quickly adjust settings to produce beams that each researcher wants, dynamically control those beams in real time and have some indication of how that is feeding back into the end science goals. For LCLS we want to be able to rapidly switch between different configurations for different researchers," said Auralee Edelen, a SLAC accelerator scientist.

One method Edelen's team has been working on is phase space reconstruction, in which a tool combines a physics simulation with a machine learning model to visualize the beam quickly from just a few data points. In contrast, some methods that use machine learning models to predict the beam distribution can take thousands of data points to train, while other reconstruction methods can require many hours of data gathering and computation. The new tool can reduce the time needed to visualize the beam from many hours to just a few minutes, using a simple setup found at almost every accelerator facility. Ryan Roussel, a member of Edelen's team who is leading development of the technique, is now working on bringing it into regular operation at the LCLS facility's upgraded X-ray laser.
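To give a flavor of what a physics-plus-fitting reconstruction involves, the sketch below fits a parameterized description of the beam to a handful of simulated screen measurements using a toy model of one quadrupole and a drift, in the spirit of a classic quad scan. The optics, parameter values and overall structure are illustrative assumptions, not SLAC's phase space reconstruction software.

```python
# Toy sketch: combine a simple physics model of the beamline with parameters
# fitted to a few screen measurements, then report a reconstructed beam quantity.
# All optics, settings and numbers below are hypothetical, for illustration only.
import numpy as np
from scipy.optimize import least_squares

DRIFT_LENGTH = 2.0  # assumed distance (m) from the quadrupole to the screen

def transfer_matrix(k):
    """Thin-lens quadrupole of strength k followed by a drift to the screen."""
    quad = np.array([[1.0, 0.0], [-k, 1.0]])
    drift = np.array([[1.0, DRIFT_LENGTH], [0.0, 1.0]])
    return drift @ quad

def predicted_variance(params, k):
    """Beam size squared at the screen for covariance params (sig_xx, sig_xxp, sig_pp)."""
    sig_xx, sig_xxp, sig_pp = params
    sigma0 = np.array([[sig_xx, sig_xxp], [sig_xxp, sig_pp]])
    m = transfer_matrix(k)
    return (m @ sigma0 @ m.T)[0, 0]

# A handful of "measurements": beam sizes recorded at different quadrupole settings.
quad_strengths = np.array([-0.6, -0.3, 0.0, 0.3, 0.6])
true_params = (2.0e-7, -1.0e-7, 3.0e-7)  # hypothetical beam covariance
rng = np.random.default_rng(0)
measured_var = np.array(
    [predicted_variance(true_params, k) for k in quad_strengths]
) * (1 + 0.02 * rng.standard_normal(len(quad_strengths)))  # 2% measurement noise

def residuals(params):
    return [predicted_variance(params, k) - v for k, v in zip(quad_strengths, measured_var)]

# Fit the beam covariance so the physics model reproduces the few measurements.
fit = least_squares(residuals, x0=(1e-7, 0.0, 1e-7))
sig_xx, sig_xxp, sig_pp = fit.x
emittance = np.sqrt(max(sig_xx * sig_pp - sig_xxp**2, 0.0))
print(f"reconstructed rms emittance ~ {emittance:.2e} m*rad")
```

In a real reconstruction the beam is described by a much richer model and the simulation covers the actual beamline, but the basic pattern - a physics model plus parameters fitted to a small number of measurements - is the same.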
"We're trying to make it easier to understand what's going on in the accelerator at any given point in time," Edelen said. SLAC's machine learning tools have been deployed at other accelerators around the world and can be adapted for other types of instruments, such as the MeV-UED. "We have tried to make sure that the software tools we make have very specific tasks and have standard interfaces," said Edelen. "We modularize those tools to make it as easy as possible to swap out individual pieces as needed, which helps when trying to apply them to different systems with different software ecosystems and sets of needs." Working on the edge Among AI's strong suits is handling massive amounts of data arriving over short time periods. Such abilities come in handy at the LCLS facility's upgraded X-ray laser, where experiments are expected to churn out data at an astonishing one terabyte per second. This is analogous to streaming about 1,000 full-length movies per second, says Abhilasha Dave, a SLAC digital design engineer. Conventional methods of storing data on a computer for processing and analyzing afterwards will not be feasible due to the high power consumption and storage space required and costs involved. SLAC's solution is edge machine learning, which enables processing and analyzing the data on special hardware, called a field programmable gate array (FPGA), on the instrument detector close to the data source, the so-called "edge." "We're working on how to accelerate this data flow so that you can analyze data in flight," said Coffee. Edge machine learning reduces the amount of data to a minimum useful set that can be easily stored and reduces the need for expensive and power-hungry computing. To do this, however, traditional machine learning models used on a computer must now shrink in size to fit in the limited space on the FPGA. Dave has a few options for downsizing the model. "We first explore techniques to reduce the model size through several strategies," she explained. "One approach involves training the machine learning model on data that has already been compressed. Another method focuses on developing faster algorithms, and a third involves grouping the data together to reduce its complexity. Which strategy we choose depends on the specific application." After training the model and then ensuring it performs with the required accuracy, Dave needs to move the model to the FPGA, a completely different kind of computer from the one used for training - and a computer without standardized support for machine learning models that Dave can use. To bridge this gap, SLAC software engineer J.J. Russell created the SLAC Neural Network Library (SNL). The SNL serves as an intermediary between the model and the FPGA by translating the model's actions into instructions the FPGA understands. Now, the FPGA can be put into the instrument to keep up with huge volumes of data arriving at blazing rates from machines such as LCLS - edge machine learning in action. With ever-increasing need for edge machine learning on the horizon, the team designed SNL to be easy to use for even a novice to machine learning, helping ensure that its impact will cut across many fields of science. As SLAC teams across the lab continue to invent, refine and expand the capabilities of machine learning tools, they are looking to design them to be flexibly applied to other instruments, facilities and applications within and outside of the lab. 
"We purposefully try to design all of our architectures, software and firmware to be very widely applicable, so doing work on one project may benefit another project," said Ryan Herbst, director of the Technology Innovation Directorate's Instrumentation Division. "We're starting to think about areas in distributed sensing environments, such as the smart grid and low latency control systems for fusion power plants." "Ultimately, everything is going to benefit from machine learning, just like everything benefitted from the computer," said Coffee, adding, "This year's Nobel Prizes in physics and chemistry emphasize the role AI is now playing in science." These advances reflect the growing use of AI throughout the lab, says SLAC scientist Daniel Ratner, a leading AI and machine learning scientist at SLAC. "Looking forward, we expect to see more real-time AI in the loop, guidance during data acquisition, facility control and scientific exploration across the lab's mission." This research was supported by the DOE Office of Science. LCLS and SSRL are DOE Office of Science user facilities.
[2]
Dark Matter, Neutrinos and Drug Discovery: How AI Is Powering SLAC Science and Technology
This is one of the goals of a DOE-funded BRaVE consortium led by SSRL scientists Derek Mendez and Aina Cohen, who co-directs SSRL's Structural Molecular Biology Resource. The team is advancing U.S. biopreparedness, in part by developing new AI tools that simplify time-consuming and complex steps in the structure-based drug design process.

For example, AI tools on SSRL's Structural Molecular Biology beamlines are busy analyzing diffraction images, which help researchers understand the structure of biological molecules, how they function and how they interact with new drug-like compounds. These tools provide real-time information on data quality, such as the integrity of the protein crystals being studied. Ideally, single crystals are used for these experiments, but the crystals are not always well ordered. Sometimes they break or stick together, potentially compromising data quality, and these problems are often not apparent until researchers have analyzed the collected data.

These experiments produce hundreds to many thousands of diffraction patterns at high rates. Manually inspecting all that data to weed out defective crystal patterns would be nearly impossible, so researchers are turning to AI tools to automate the process.

"We developed an AI model to assess the quality of diffraction pattern images 100 times faster than the process we used before," said Mendez. "I like using AI for simplifying time-consuming tasks. That can really help free up time for researchers to explore other, more interesting aspects of their research. Overall, I am excited about finding ways that artificial and natural intelligence can work together to improve research quality."

Researchers at LCLS are now using these tools, said Cohen - and, Mendez said, there's a growing interplay between the two facilities. "LCLS is working on tools for diffraction analysis that we are interested in applying at SSRL," added Mendez. "We are building that synergy."

Frédéric Poitevin, a staff scientist at LCLS, agrees. "Working together is key to facing the unique challenges at both facilities."

Among the many AI tools Poitevin's team is developing is one that speeds up the analysis of complex diffraction images that help researchers visualize the structure and behavior of biological molecules in action. Extracting this information involves considering subtle variations in the intensity of millions of pixels across several hundred thousand images. Kevin Dalton, a staff scientist on Poitevin's team, trained an AI model that can analyze this large volume of data and search for important faint signals much more quickly and accurately than traditional methods.

"With the AI models that Kevin is developing, we have the potential at LCLS and SSRL to open a completely new window into the molecular structure and behavior we have been after but have never been able to see with traditional approaches," said Poitevin.
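As a concrete illustration of the kind of automated screening Mendez describes, the sketch below shows a small convolutional classifier that labels each detector frame as usable or defective so that only promising patterns move on to full analysis. The architecture, image size and decision rule are illustrative assumptions, not the model deployed at SSRL or LCLS.

```python
# Minimal sketch of automated diffraction-image triage: a tiny convolutional
# classifier flags each pattern as "usable" or "defective" so humans don't have
# to inspect every frame. Architecture and sizes are illustrative assumptions.
import torch
import torch.nn as nn

class DiffractionTriage(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 8, kernel_size=3, stride=2, padding=1),   # 128 -> 64
            nn.ReLU(),
            nn.Conv2d(8, 16, kernel_size=3, stride=2, padding=1),  # 64 -> 32
            nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),                               # global average
        )
        self.classifier = nn.Linear(16, 2)  # logits: [defective, usable]

    def forward(self, x):
        x = self.features(x).flatten(1)
        return self.classifier(x)

model = DiffractionTriage().eval()  # in practice, trained on labeled patterns

# Stand-in for a batch of detector frames (single-channel 128x128 images).
frames = torch.rand(32, 1, 128, 128)

with torch.no_grad():
    logits = model(frames)
    usable = logits.argmax(dim=1) == 1

print(f"{int(usable.sum())} of {len(frames)} frames kept for full analysis")
```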
Beyond specific AI projects, Wechsler said, the growing importance and utility of AI in many science areas is opening more doors for collaboration among SLAC scientists, engineers and students. "I am excited about the SLAC AI community developing," she said. "We have more to do and learn from each other across disciplines. We have many commonalities in what we want to accomplish in astronomy, particle physics and other areas of science at the lab, so there is a lot of potential."

The increasing opportunities for collaboration are driving teams across SLAC to identify where AI tools are needed and to develop workflows that can be applied across the lab. This effort is an important part of the lab's overall strategy of harnessing AI and computing power to advance science today and into the future. Adds Ratner, "By also leveraging our partnership with AI experts at Stanford University, we work together to deepen our AI knowledge and build AI tools that enable discovery and innovative technology for exploring science at the biggest, smallest and fastest scales."

The research was supported by the DOE Office of Science. LCLS and SSRL are DOE Office of Science user facilities.
Researchers at SLAC are leveraging artificial intelligence to optimize particle accelerators, process big data, and accelerate drug discovery, pushing the boundaries of scientific exploration.
Researchers at the Department of Energy's SLAC National Accelerator Laboratory are increasingly turning to artificial intelligence (AI) to tackle complex scientific challenges. The integration of AI is transforming various aspects of their work, from optimizing particle accelerators to managing big data in particle physics and cosmology [1].
One significant application of AI is in fine-tuning particle beams for studying atoms and molecules. SLAC's facilities, including the Linac Coherent Light Source (LCLS) and Stanford Synchrotron Radiation Lightsource (SSRL), require precise beam adjustments for each experiment. Machine learning tools are making this process more efficient, allowing researchers to spend less time on tuning and more on actual scientific work [1].
SLAC is pioneering the use of edge machine learning to handle the massive data output from its upgraded X-ray laser facility. With experiments generating up to one terabyte of data per second, conventional data storage and processing methods are no longer feasible. Edge machine learning enables data processing and analysis directly on the instrument detector, significantly reducing data volume and power consumption [1].
The BRaVE consortium, led by SSRL scientists, is developing AI tools to streamline the structure-based drug design process. These tools analyze diffraction images in real time, providing crucial information on data quality and crystal integrity. An AI model developed by the team can assess the quality of diffraction pattern images 100 times faster than previous methods, greatly accelerating the drug discovery process [2].
The growing importance of AI is fostering collaboration among SLAC scientists, engineers, and students across various disciplines. Teams are working together to identify where AI tools are needed and develop workflows that can be applied across the lab. This collaborative effort is a key part of SLAC's strategy to harness AI and computing power for advancing science [2].
SLAC is leveraging its partnership with AI experts at Stanford University to deepen its AI knowledge and build tools that enable discovery and innovative technology. This collaboration aims to explore science at the biggest, smallest, and fastest scales, pushing the boundaries of what's possible in scientific research [2].
As AI continues to evolve and integrate into scientific processes, SLAC remains at the forefront of this technological revolution, driving advancements in particle physics, structural biology, and beyond. The synergy between artificial and natural intelligence is opening new windows into molecular structures and behaviors, promising exciting discoveries in the near future.