The global response to the COVID-19 pandemic has demonstrated the significant societal benefits of recent scientific and technological advances, such as the use of synthetic biology in vaccine development and of gene editing tools such as CRISPR in the development of rapid diagnostic tests. But with these advances comes the challenge of ensuring they are used responsibly and for peaceful purposes.
The dual-use nature of many new biological techniques poses significant challenges for governance, particularly as new and complex synergies emerge between biology and other disciplines, such as chemistry, artificial intelligence (AI) and cyber technologies.
At the same time, emerging technologies are lowering the barriers to entry into the life sciences, making potentially harmful biological agents more accessible to a diverse range of actors. These developments have diversified biothreats beyond what was originally conceived by the Biological Weapons Convention (BWC), introducing new and more interconnected pathways for the hostile use of biological agents.
States parties to the BWC are convening in Geneva this week, an opportunity to take stock of the convention’s implementation and prepare for next year’s review conference. There is an urgent need to update the treaty regime by reframing discussions to properly address the complexity of modern-day biothreats.
Evolving regulatory challenges
The COVID-19 pandemic has sparked new discussions surrounding the security of labs that handle the most dangerous pathogens, so-called biosafety level 4 labs. There is often a lack of transparency surrounding the kinds of activities being undertaken in these labs, and little oversight or control of research that could pose an undue risk. The Global Health Security Index found that less than five per cent of countries provide oversight of dual-use research.
Unlike its chemical weapons counterpart, the BWC does not have a verification mechanism. The intrinsically dual-use nature of biological research means that it cannot be verified in the same way as nuclear or chemical activities. The treaty regime therefore relies on confidence-building measures (CBMs) as its primary means of assessing compliance.
This reliance is increasingly challenged by rapid advances in science and technology, which outpace states’ ability to properly assess the potential risks of certain research. The same techniques that can be used for unquestionably peaceful purposes, such as the promotion of public health, could also be misused in the hands of a rogue state, terrorist group, or malicious actor.
Artificial intelligence and automated technologies are making the life sciences more accessible by simplifying complex processes and reducing the level of tacit knowledge previously required. Automated technologies enable research to be conducted remotely, transforming arduous manual processes into lines of code.
The emergence of fully automated ‘cloud labs’ could provide a cost-effective means of accessing experimental biology. However, as the industry grows, company-governed customer screening will become more difficult, and more formal regulation may be needed to ensure activities are only undertaken for peaceful purposes.
There is also the question of how to regulate ‘intangibles’ – the informational ingredients for the use of technologies, often embedded in tacit knowledge. Because they bypass material export controls, intangible technology transfers cannot be regulated as easily by traditional list-based approaches.
Knowledge can be transferred instantly online, and publicly available research data can be accessed within a few clicks. A careful balance must be struck between the potential risk of misuse and the right to participate in the fullest possible exchange of technological information.
Such digitalization of the life sciences has diversified biological threats beyond their ‘traditional’ understanding. The convergence of biology with cyber and AI technologies introduces new pathways for the hostile use of biological agents: a sophisticated cyber-attack has the potential to steal, exploit or manipulate data, or to tamper with security systems connected to the Internet of Things (IoT) in high-risk labs containing dangerous pathogens. Increased collaboration across these converging disciplines is needed to understand new vulnerabilities emerging from the use of cyber and other technologies.
Efforts to improve regulation are further hampered by an increasingly polarized security environment. Political divisions and mistrust among states parties have stalled BWC discussions in recent years, meaning legislative change continues to lag behind scientific and technological advances.
The future of regulation
COVID-19 has demonstrated the need for the BWC to adapt to a new landscape of complex biothreats and address new pathways to infectious disease outbreaks. Potential ways forward could include:
- Greater transparency surrounding research in high-risk labs, to ensure activities are being undertaken safely and for unquestionably peaceful purposes. States parties have long considered the development of a peer review mechanism to enhance confidence in national implementation. Such a mechanism could also provide a collaborative means of assessing compliance.
- Enhanced participation in CBMs. In 2021, fewer than half of BWC states parties submitted CBM reports. The heightened awareness of the need for international cooperation generated by COVID-19 should help encourage more states to submit reports.
- Establishment of an independent technical body, mandated by BWC states parties, to review and report on developments in science and technology. Such a body could also bridge gaps between converging areas of concern by working with UN entities and experts in related disarmament forums, such as those on cyber issues.
To ensure the BWC is fit for purpose in this new biothreat landscape, science and technology must be woven into discussions under all its articles. This would help find new and flexible ways to keep pace with rapid technological advancements, without hampering positive scientific innovation.
Emma Saunders, Research Assistant, International Security Programme.