Can an integrated science of global risk safeguard humanity’s long-term future?
Global risks are those whose impacts cannot be contained within individual regions or industries. Examples include pandemics, biodiversity loss, and the malicious use of artificial intelligence. While some research and policy communities focus on specific global risks, such as climate change or nuclear war, these risks will not emerge in isolation; there remains a need to study the class of global risks as a whole.
A Science of Global Risk (SGR) aims to build an interdisciplinary research effort to analyze emerging global risks and develop new solutions for managing them. The project will be led by a research team based at the University of Cambridge’s Centre for the Study of Existential Risk. Working with a global network of collaborators, the team will build on the success of the pilot project Managing Extreme Technological Risk.
Achieving meaningful action requires sustained collaboration and coordination among scientists, policy-makers, developers, and other stakeholders worldwide. Directed by Lord Martin Rees, the project will develop the basis for a science of global risk that is methodologically rigorous and creative, focused on risk management and policy formation, and accessible to a diverse community of stakeholders across academia, industry, government and civil society. The project consists of three interwoven strands:
The project will result in a series of high-profile academic papers, policy reports, two books, and a range of media outputs for broader public engagement. These outputs will support engagement with interdisciplinary academic experts, relevant policy communities, research leaders, and industry.