2 Sources
[1]
Cambridge launches program to tackle engineered pandemic risks
University of Cambridge | Feb 27, 2025

Covid-19 showed us how vulnerable the world is to pandemics - but what if the next pandemic were somehow engineered? How would the world respond - and could we stop it happening in the first place? These are some of the questions being addressed by a new initiative launched today at the University of Cambridge, which seeks to address the urgent challenge of managing the risks of future engineered pandemics. The Engineered Pandemics Risk Management Programme aims to understand the social and biological factors that might drive an engineered pandemic and to make a major contribution to building the UK's capability for managing these risks. It will build a network of experts from academia, government, and industry to tackle the problem. Increased security threats from state and non-state actors, combined with increased urbanisation and global mobility, mean that the threat of deliberate pathogen release must be taken seriously, as must other intertwined aspects of pandemic risk, such as mis- and disinformation, the erosion of trust in a number of institutions and an increasingly volatile geopolitical context. Further potential risks are posed by recent developments in gene-editing tools and artificial intelligence, which have rapidly advanced technological capabilities that may make it easier to engineer potential pandemic pathogens.

"There is a great opportunity to take a joined-up approach to managing the risks posed by engineered pandemics. We need experts and agencies across the spectrum to work together to develop a better understanding of who or what might drive such events and what their likely impact would be. And we need evidence-informed policies and networks in place that would help us respond to - or better still, prevent - such an eventuality."
Professor Clare Bryant, Department of Medicine, University of Cambridge

The aims of the Engineered Pandemics Risk Management Programme are:

- To develop the conceptual underpinnings of the risk management of engineered pandemics, based on interdisciplinary research
- To support the UK's capability in engineered pandemic risk policy and practice, including building and maintaining networks that connect government, academia and industry
- To strengthen the international networks that will support this work globally

There are four main strands of work:

Social determinants of engineered pandemic threat

This strand will look at the actors who have the potential to engineer harmful pathogens, either deliberately or accidentally. It will ask questions such as: What could motivate bioterrorism in the coming decades? Who might the relevant actors be? What kinds of engineered pandemic might someone want to create? Dr Rob Doubleday, Executive Director of the Centre for Science and Policy at the University of Cambridge, said: "The common narrative is that there's a wide range of potential actors out there who want to create bioweapons but don't yet have the technical means. But in fact, there's been very little work to really understand who these people might be, and their relationship to emerging technology. To explore these questions, we need a broad network including social scientists, biosecurity researchers, criminologists, and experts in geopolitics and counterterrorism." The strand will also look at the governance of scientific research in areas that may facilitate an engineered pandemic, whether unwittingly or maliciously, aiming to deliver a policy framework that enables freedom of intellectual research while managing real and apparent risk in infectious disease research.
Professor Bryant said: "As scientists, we're largely responsible for policing our own work and ensuring integrity, trustworthiness and transparency, and for considering the consequences of new knowledge and how it might be used. But with the rapid progress of genomic technologies and AI, self-regulation becomes more difficult to manage. We need to find governance frameworks that balance essential scientific progress with its potential misapplication."

Biological determinants of engineered pandemic threat

Recognising that the most likely cause of an engineered pandemic would be the deliberate release of a naturally occurring pathogen - viral or bacterial, for example - rather than a man-made pathogen, this strand aims to understand what might make a particular pathogen infectious and how our immune systems respond to infection. This knowledge will allow researchers to screen currently available drugs to prevent or treat infection, and to design vaccines quickly should a pandemic occur.

Modelling threats and risk management of engineered pandemics

The Covid-19 pandemic highlighted the practical problems of dealing with pandemic infections, from the provision of personal protective equipment (PPE) to ensuring a sufficient supply of vaccine doses and the availability of key medications. Modelling the potential requirements of a pandemic - how supplies could be delivered, how ventilation systems could be modified, and what biosafety measures could be taken, for example - is a key challenge for managing any form of pandemic. This strand will address how existing modelling approaches would need to be adapted for a range of plausible engineered pandemics.

Policy innovation challenges

Working with the policy community, the Cambridge team will co-create research that directly addresses policy needs and involves policy makers. It will support policy makers in experimenting with more joined-up approaches through testing, learning and adapting solutions developed in partnership.
The Engineered Pandemics Risk Management Programme is supported by a £5.25 million donation to the Centre for Research in the Arts, Humanities and Social Sciences (CRASSH) at the University of Cambridge. The team intends it to form a central component of a future Pandemic Risk Management Centre, for which it is now fundraising. Professor Joanna Page, Director of CRASSH, said: "Cambridge has strengths across a broad range of disciplines - from genetics and immunology to mathematical modelling to existential risk and policy engagement - that can make a much-needed initiative such as this a success."
[2]
Cambridge initiative to address risks of future engineered pandemics
COVID-19 showed us how vulnerable the world is to pandemics -- but what if the next pandemic were somehow engineered? How would the world respond -- and could we stop it happening in the first place? These are some of the questions being addressed by a new initiative launched today at the University of Cambridge, which seeks to address the urgent challenge of managing the risks of future engineered pandemics. The Engineered Pandemics Risk Management Program aims to understand the social and biological factors that might drive an engineered pandemic and to make a major contribution to building the UK's capability for managing these risks. It will build a network of experts from academia, government, and industry to tackle the problem. Increased security threats from state and non-state actors, combined with increased urbanization and global mobility, mean that the threat of deliberate pathogen release must be taken seriously, as must other intertwined aspects of pandemic risk, such as mis- and disinformation, the erosion of trust in a number of institutions and an increasingly volatile geopolitical context. Further potential risks are posed by recent developments in gene-editing tools and artificial intelligence, which have rapidly advanced technological capabilities that may make it easier to engineer potential pandemic pathogens. Professor Clare Bryant from the Department of Medicine at the University of Cambridge said, "There is a great opportunity to take a joined-up approach to managing the risks posed by engineered pandemics. "We need experts and agencies across the spectrum to work together to develop a better understanding of who or what might drive such events and what their likely impact would be. And we need evidence-informed policies and networks in place that would help us respond to -- or better still, prevent -- such an eventuality." 
The aims of the Engineered Pandemics Risk Management Program are: to develop the conceptual underpinnings of the risk management of engineered pandemics, based on interdisciplinary research; to support the UK's capability in engineered pandemic risk policy and practice, including building and maintaining networks that connect government, academia and industry; and to strengthen the international networks that will support this work globally. There are four main strands of work: Social determinants of engineered pandemic threat This strand will look at the actors who have the potential to engineer harmful pathogens, either deliberately or accidentally. It will ask questions such as: What could motivate bioterrorism in the coming decades? Who might the relevant actors be? What are the kinds of engineered pandemic that someone might want to create? Dr. Rob Doubleday, Executive Director of the Center for Science and Policy at the University of Cambridge, said, "The common narrative is that there's a wide range of potential actors out there who want to create bioweapons but don't yet have the technical means. But in fact, there's been very little work to really understand who these people might be, and their relationship to emerging technology. "To explore these questions, we need a broad network including social scientists, biosecurity researchers, criminologists, experts in geopolitics and counter-terrorism." The strand will also look at the governance of scientific research in areas that may facilitate an engineered pandemic, whether unwittingly or maliciously, aiming to deliver a policy framework that enables freedom of intellectual research while managing real and apparent risk in infectious disease research. Professor Bryant said, "As scientists, we're largely responsible for policing our own work and ensuring integrity, trustworthiness and transparency, and for considering the consequences of new knowledge and how it might be used. But with the rapid progress of genomic technologies and AI, self-regulation becomes more difficult to manage. "We need to find governance frameworks that balance essential scientific progress with its potential misapplication." 
Biological determinants of engineered pandemic threat Recognizing that the most likely cause of an engineered pandemic would be the deliberate release of a naturally occurring pathogen -- viral or bacterial, for example -- rather than a man-made pathogen, this strand aims to understand what might make a particular pathogen infectious and how our immune systems respond to infection. This knowledge will allow researchers to screen currently available drugs to prevent or treat infection and to design vaccines quickly should a pandemic occur. Modeling threats and risk management of engineered pandemics The COVID-19 pandemic highlighted the practical problems of dealing with pandemic infections, from the provision of personal protective equipment (PPE) to ensuring a sufficient supply of vaccine doses and the availability of key medications. Modeling the potential requirements of a pandemic -- how supplies could be delivered, how ventilation systems could be modified, and what biosafety measures could be taken, for example -- is a key challenge for managing any form of pandemic. This strand will address how existing modeling approaches would need to be adapted for a range of plausible engineered pandemics. Policy innovation challenges Working with the policy community, the Cambridge team will co-create research that directly addresses policy needs and involves policy makers. It will support policy makers in experimenting with more joined-up approaches through testing, learning and adapting solutions developed in partnership. Professor Joanna Page, Director of CRASSH, said, "Cambridge has strengths across a broad range of disciplines -- from genetics and immunology to mathematical modeling to existential risk and policy engagement -- that can make a much-needed initiative such as this a success."
The University of Cambridge has initiated a new program aimed at understanding and managing the risks associated with engineered pandemics, combining expertise from various fields to develop strategies for prevention and response.
The University of Cambridge has unveiled a groundbreaking initiative to address the potential threats posed by engineered pandemics. The Engineered Pandemics Risk Management Programme, launched in February 2025, aims to develop a comprehensive understanding of the social and biological factors that could lead to an engineered pandemic and enhance the UK's capacity to manage such risks [1][2].
The programme adopts a multifaceted approach, bringing together experts from academia, government, and industry to tackle this complex challenge. Professor Clare Bryant from the Department of Medicine at Cambridge emphasizes the need for a collaborative effort: "We need experts and agencies across the spectrum to work together to develop a better understanding of who or what might drive such events and what their likely impact would be" [1].
The initiative is structured around four main strands of work:
Social Determinants of Engineered Pandemic Threat: This strand explores potential actors and motivations behind bioterrorism. Dr. Rob Doubleday, Executive Director of the Centre for Science and Policy, highlights the importance of understanding the relationship between potential threat actors and emerging technologies [1][2].
Biological Determinants of Engineered Pandemic Threat: Recognizing that naturally occurring pathogens are more likely to be weaponized than synthetic ones, this strand focuses on understanding pathogen infectiousness and immune system responses [1][2].
Modelling Threats and Risk Management: Building on lessons from COVID-19, this area aims to develop models for managing various aspects of pandemic response, from PPE provision to vaccine distribution [1][2].
Policy Innovation Challenges: This strand involves working directly with policymakers to create research that addresses policy needs and supports more integrated approaches to pandemic management [1][2].
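To make the modelling strand concrete, the kind of epidemic model that would need adapting can be illustrated with a minimal SIR (susceptible-infectious-recovered) sketch. This is a generic textbook model, not the programme's own methodology, and every parameter value below is an arbitrary placeholder; an "engineered" scenario might be explored by varying transmissibility (beta) or recovery rate (gamma).

```python
# Minimal SIR epidemic sketch (illustrative only; not the programme's models).
# beta = transmission rate, gamma = recovery rate; values are placeholders.

def sir_step(s, i, r, beta, gamma, dt=1.0):
    """Advance the susceptible/infectious/recovered fractions by one Euler step."""
    new_infections = beta * s * i * dt
    new_recoveries = gamma * i * dt
    return (s - new_infections,
            i + new_infections - new_recoveries,
            r + new_recoveries)

def simulate(beta=0.3, gamma=0.1, i0=1e-4, days=160):
    """Run the model and return the peak infectious fraction and final attack rate."""
    s, i, r = 1.0 - i0, i0, 0.0
    peak = i
    for _ in range(days):
        s, i, r = sir_step(s, i, r, beta, gamma)
        peak = max(peak, i)
    return peak, r

peak, attack = simulate()
print(f"peak infectious fraction: {peak:.3f}, final attack rate: {attack:.3f}")
```

Raising beta (e.g. to mimic an enhanced-transmissibility pathogen) increases both the peak infectious fraction, which drives demand for PPE and hospital capacity, and the final attack rate, which drives vaccine-dose requirements.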
The programme also addresses the critical issue of governing scientific research that could potentially contribute to pandemic risks. Professor Bryant notes the challenges of self-regulation in the face of rapidly advancing genomic technologies and AI, emphasizing the need for governance frameworks that balance scientific progress with potential misuse [1][2].
The Engineered Pandemics Risk Management Programme is supported by a £5.25 million donation to the Centre for Research in the Arts, Humanities and Social Sciences (CRASSH) at the University of Cambridge [1].
While the programme is based at Cambridge and focuses on building UK capabilities, its findings and methodologies could have far-reaching implications for global pandemic preparedness. The initiative recognizes the interconnected nature of pandemic threats in an increasingly urbanized and mobile world, as well as the potential risks posed by emerging technologies such as gene-editing tools and artificial intelligence [1][2].
As the world continues to grapple with the aftermath of COVID-19, this forward-thinking programme represents a significant step towards proactively addressing the complex challenges of engineered pandemic risks. By combining expertise from diverse fields and fostering collaboration between academia, government, and industry, Cambridge University is positioning itself at the forefront of efforts to safeguard global health security in the face of evolving biological threats.