Climate change is one of the defining challenges of our time. The net costs of climate change in the UK could be tens of billions of pounds per year in the 2050s, and tidal flooding alone could affect over half a million UK properties by 2100. Dr Jonathan Rougier worked with the UK Met Office (UKMO) to produce the climate scenarios for the UK Climate Impacts Programme (UKCIP) 2009 report (UKCP09). His research and advice (funded as a UKMO External Expert) were critical to a key innovation in UKCP09: a comprehensive uncertainty assessment. A Director of the UKCIP writes "The UKMO team with Dr Rougier [have] put the UK at the leading edge of the science and service aspects of providing climate information for users" [b].
The UKCP09 formed the basis of the UK Climate Change Risk Assessment and the recommendations of the UK National Adaptation Programme, which was submitted to Parliament as part of the Government's obligations under the Climate Change Act. The UKCP09 has been used for the assessment of the impact of climate change by hundreds of organisations, including agencies and non-governmental organisations (NGOs), utilities companies, consultancies, and County Councils and Local Authorities.
As the realities of climate change have become more widely accepted over the last decade, decision makers have requested projections of future changes and impacts. Founded in 2002, the Centre for Analysis of Time Series (CATS) has conducted research revealing how the limited fidelity of climate models reduces the relevance of cost-benefit style management in this context: actions based on ill-founded projections (including probabilistic projections) can lead to maladaptation and poor policy choices. CATS' conclusions were noted in the Fourth Assessment Report of the Intergovernmental Panel on Climate Change (IPCC) and led in turn to the toning down of the UK Climate Projections 2009 and the 2012 UK Climate Change Risk Assessment. Members of the insurance sector, energy sector, national security agencies, scientific bodies and governments have modified their approaches to climate risk management as a direct result of understanding CATS' research. The research has also prompted efforts to reinterpret climate model output and to design computer experiments for more effective decision support.
Methods of emulation, model calibration and uncertainty analysis developed by Professor Tony O'Hagan and his team at The University of Nottingham (UoN) have formed the basis of Pratt & Whitney's Design for Variation (DFV) initiative, which was established in 2008. The global aerospace manufacturer describes the initiative as a 'paradigm shift' that aims to account for all sources of uncertainty and variation across its entire design process.
Pratt & Whitney considers its implementation of the methods to provide a competitive advantage, and published savings from adopting the DFV approach for a fleet of military aircraft are estimated to be approximately US$1 billion.
Pratt & Whitney (one of the world's largest makers of aircraft engines) has developed a process, "Design for Variation" (DFV), that uses Bayesian methods developed at Sheffield for analysing uncertainty in computer model predictions within the design, manufacture and service of aircraft engines. The DFV process significantly improves cost efficiency by increasing the time an engine stays operational on the wing of an aircraft, so reducing the time that the aircraft is unavailable due to engine maintenance. DFV also saves costs by identifying design and process features that have little impact on engine performance but are expensive to maintain. Pratt & Whitney estimates that the DFV process generates savings, for a large fleet of military aircraft, of [text removed for publication].
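The underlying idea, common to the UoN and Sheffield strands of this work, can be sketched as follows: a cheap statistical emulator (typically a Gaussian process) is fitted to a modest number of runs of an expensive simulator, and uncertainty about the inputs is then propagated through the emulator rather than the simulator. The example below is a minimal illustration of that idea using scikit-learn; the simulator function, design size and input distributions are invented for the example and are not Pratt & Whitney's models or code.

```python
# Illustrative sketch of emulation-based uncertainty analysis: fit a Gaussian-
# process emulator to a handful of "expensive" simulator runs, then propagate
# input uncertainty through the cheap emulator. All quantities are invented.

import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

rng = np.random.default_rng(1)

def simulator(x):
    # stand-in for an expensive engineering code (hypothetical response surface)
    return np.sin(3 * x[:, 0]) + 0.5 * x[:, 1] ** 2

# a small design of simulator runs
X_design = rng.uniform(0.0, 1.0, size=(20, 2))
y_design = simulator(X_design)

emulator = GaussianProcessRegressor(
    kernel=ConstantKernel(1.0) * RBF(length_scale=[0.3, 0.3]),
    normalize_y=True,
).fit(X_design, y_design)

# propagate uncertainty about the inputs (independent normals clipped to [0, 1])
X_uncertain = np.clip(rng.normal([0.5, 0.5], [0.1, 0.1], size=(5000, 2)), 0.0, 1.0)
mean, sd = emulator.predict(X_uncertain, return_std=True)

print("output mean ~", mean.mean())
print("output sd (input uncertainty + emulator uncertainty) ~",
      np.sqrt(mean.var() + (sd ** 2).mean()))
```

In practice the same emulator would also be reused for calibration and sensitivity analysis, which is what makes the approach attractive when each simulator run is costly.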
The UK Food and Environment Research Agency (Fera) has used these methods in its risk analyses, for example in assessing the risks of exposure to pesticides.
Reversible Jump Markov chain Monte Carlo, introduced by Peter Green [1] in 1995, was the first generic technique for conducting the computations necessary for joint Bayesian inference about models and their parameters, and it remains by far the most widely used, 18 years after its introduction. By September 2013 the paper had been cited over 3,800 times in the academic literature, according to Google Scholar, with the vast majority of citing articles coming from outside statistics and mathematics. This case study, however, focusses on substantive applications outside academic research altogether: in the geophysical sciences, ecology and the environment, agriculture, medicine, social science, commerce and engineering.
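To make the technique concrete, the sketch below applies reversible jump MCMC to a deliberately small toy problem: choosing between a fixed-mean and an unknown-mean normal model for the same data. The data, priors and proposal distribution are assumptions made purely for illustration, and the code is not drawn from [1] or from any of the applications described in this case study; it only shows the characteristic ingredients of the method, namely trans-dimensional 'birth' and 'death' moves with a dimension-matching Jacobian alongside ordinary within-model updates.

```python
# Toy reversible-jump MCMC sketch (illustrative only). Two competing models for
# data y: M0 fixes the mean at 0; M1 has an unknown mean mu with a N(0, tau^2)
# prior. The "birth" move proposes mu from N(ybar, s^2); the dimension-matching
# map is the identity, so the Jacobian term in the acceptance ratio is 1.

import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
y = rng.normal(0.4, 1.0, size=50)          # synthetic data with a small true mean
ybar, n = y.mean(), len(y)
tau, s = 10.0, 1.0 / np.sqrt(n)            # prior sd for mu; sd of the birth proposal

def log_post(model, mu):
    # un-normalised log posterior; equal prior weight on M0 and M1
    if model == 0:
        return stats.norm.logpdf(y, 0.0, 1.0).sum()
    return stats.norm.logpdf(y, mu, 1.0).sum() + stats.norm.logpdf(mu, 0.0, tau)

model, mu, visits = 0, None, [0, 0]
for _ in range(20000):
    if rng.uniform() < 0.5:                # attempt a trans-dimensional move
        if model == 0:                     # birth: M0 -> M1
            mu_new = rng.normal(ybar, s)
            log_a = (log_post(1, mu_new) - log_post(0, None)
                     - stats.norm.logpdf(mu_new, ybar, s))       # Jacobian = 1
            if np.log(rng.uniform()) < log_a:
                model, mu = 1, mu_new
        else:                              # death: M1 -> M0 (reverse move)
            log_a = (log_post(0, None) - log_post(1, mu)
                     + stats.norm.logpdf(mu, ybar, s))
            if np.log(rng.uniform()) < log_a:
                model, mu = 0, None
    elif model == 1:                       # within-model random-walk update of mu
        mu_new = mu + rng.normal(0.0, 0.3)
        if np.log(rng.uniform()) < log_post(1, mu_new) - log_post(1, mu):
            mu = mu_new
    visits[model] += 1

print("estimated posterior probability of M1:", visits[1] / sum(visits))
```

The proportion of iterations spent in each model estimates the posterior model probabilities, which is exactly the quantity the applied studies cited here rely on.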
The Computational Optimization Group (COG) in the Department of Computing produced new models, algorithms, and approximations for supporting confident decision-making under uncertainty — when computational alternatives are scarce or unavailable. The impact of this research is exemplified by the following:
Techniques developed at The University of Nottingham (UoN) have enabled organisations to deal with uncertainty in complex industrial and policy problems that rely on the elicitation of expert opinion and knowledge. The statistical toolkit produced for use in complex decision-making processes has been deployed in a wide range of applications. It has been particularly useful in asset management planning in organisations such as the London Underground, government approaches to evidence-based policy, and the Met Office UK Climate Projection tool (UKCP09), which is used by hundreds of organisations across the UK such as environment agencies, city and county councils, water companies and tourist boards.
Air pollution poses significant threats to both the environment and human health; the World Health Organization estimates that 800,000 deaths per year could be related to ambient air pollution. Formulating air quality legislation and understanding its effect on human health requires accurate information on ambient concentrations of air pollution and on how these translate into the exposures actually experienced by individuals (personal exposures).
Our research provides a framework for estimating personal exposures for specific susceptible sub-populations, such as the elderly and those suffering from respiratory diseases. This framework also provides novel means of assessing uncertainty associated with the estimates of exposures. Furthermore, it allows changes in exposures to be assessed under hypothetical scenarios reflecting potential regulatory changes.
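As a rough indication of how such a framework operates, the sketch below runs a microenvironment-based Monte Carlo exposure simulation for a hypothetical susceptible sub-population under a current and a tightened ambient scenario. The microenvironments, time budgets, infiltration factors and concentrations are all invented for illustration and are much simpler than the published framework.

```python
# Highly simplified Monte Carlo exposure sketch (all numbers, microenvironments
# and infiltration factors are illustrative assumptions, not the published model).

import numpy as np

rng = np.random.default_rng(2)

def simulate_daily_exposure(ambient_mean, n_people=10000):
    """Time-weighted personal exposure (ppb) for a susceptible sub-population."""
    # ambient (outdoor) concentration varies from person to person / day to day
    ambient = rng.lognormal(mean=np.log(ambient_mean), sigma=0.3, size=n_people)
    # hours spent outdoors, indoors at home, and in other indoor locations
    t_out = rng.uniform(0.5, 2.0, n_people)
    t_home = rng.uniform(14.0, 20.0, n_people)
    t_other = 24.0 - t_out - t_home
    # indoor concentrations modelled as ambient scaled by infiltration factors
    c_home = 0.4 * ambient
    c_other = 0.5 * ambient
    return (t_out * ambient + t_home * c_home + t_other * c_other) / 24.0

# compare the current ambient scenario with a hypothetical tighter standard
current = simulate_daily_exposure(ambient_mean=60.0)
regulated = simulate_daily_exposure(ambient_mean=50.0)

for label, exposure in [("current", current), ("hypothetical standard", regulated)]:
    print(f"{label}: mean {exposure.mean():.1f} ppb, "
          f"95th percentile {np.percentile(exposure, 95):.1f} ppb")
```

Repeating such simulations over many replications, and over distributions for the uncertain inputs, is what yields the uncertainty statements about exposure that regulatory reviews require.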
These models were used in the US Environmental Protection Agency's (EPA) recent review of ozone standards that resulted in a reduction in the statutory limits of ozone in the United States. The EPA stated that "These changes will improve both public health protection and the protection of sensitive trees and plants" [C].
This study demonstrates how Bayes linear methodologies developed at Durham University have had an impact on industrial practice. Two examples are given. The approach has been applied by London Underground Ltd. to the management of bridges, stations and other civil engineering assets, enabling a whole-life strategic approach to maintenance and renewal to reduce costs and increase safety. The approach has won a major award for innovation in engineering and technology. The methodology has also been applied by Unilever and Fera to improve methods of assessing product safety, in particular the risk of chemical ingredients in products causing allergic skin reactions.
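For readers unfamiliar with the approach, the sketch below evaluates the standard Bayes linear adjustment formulas, in which prior beliefs about a quantity of interest are updated using only means, variances and covariances rather than full probability distributions. The numerical prior specification is invented for illustration and bears no relation to the London Underground or Unilever applications.

```python
# Minimal numerical sketch of a Bayes linear adjustment (illustrative priors only).

import numpy as np

# prior expectations and (co)variances, chosen purely for illustration
E_B, Var_B = np.array([10.0]), np.array([[4.0]])
E_D, Var_D = np.array([8.0, 12.0]), np.array([[2.0, 0.5], [0.5, 3.0]])
Cov_BD = np.array([[1.5, 1.0]])          # Cov[B, D]

d = np.array([9.0, 10.5])                # observed data

# Bayes linear adjusted expectation and variance:
#   E_D[B]   = E[B] + Cov[B,D] Var[D]^{-1} (d - E[D])
#   Var_D[B] = Var[B] - Cov[B,D] Var[D]^{-1} Cov[D,B]
gain = Cov_BD @ np.linalg.inv(Var_D)
E_adj = E_B + gain @ (d - E_D)
Var_adj = Var_B - gain @ Cov_BD.T

print("adjusted expectation:", E_adj)     # beliefs about B after seeing d
print("adjusted variance:", Var_adj)
```

Because only second-order belief specifications are needed, the same calculation scales to the large collections of assets and data sources encountered in the industrial applications described above.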
ENABLE is a history matching and uncertainty assessment software system for the oil industry whose inference engine was produced by the Durham Statistics group, based on their research on uncertainty quantification for complex physical systems modelled by computer simulators. The system optimises asset management plans through careful uncertainty quantification and reduces development costs by accelerating the history matching process for oil reservoirs, resulting in more informed technical and economic decision-making. ENABLE was acquired by Roxar ASA in 2006, and current users include the multinational oil company Statoil. From January 2008 to September 2012 (the most recent set of figures), the turnover attributed to ENABLE was [text removed for publication].
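The kind of calculation ENABLE accelerates can be illustrated schematically: an emulator is fitted to a few runs of a stand-in for a reservoir simulator, and candidate inputs are ruled out when an implausibility measure, which compares emulator predictions with a historical observation while allowing for emulator, observation and model-discrepancy variances, exceeds a conventional cut-off of 3. None of the numbers or functions below come from ENABLE itself; they are invented for this sketch.

```python
# Schematic implausibility-based history matching with an emulator (illustrative).

import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def simulator(x):
    # stand-in for an expensive reservoir simulator (hypothetical, one input)
    return 3.0 * np.sin(2.0 * x) + x ** 2

# a handful of simulator runs used to train the emulator
x_design = np.linspace(0.0, 3.0, 8).reshape(-1, 1)
emulator = GaussianProcessRegressor(kernel=RBF(length_scale=0.5), normalize_y=True)
emulator.fit(x_design, simulator(x_design).ravel())

z = 4.0            # historical observation (e.g. cumulative production), invented
var_obs = 0.1      # observation-error variance, invented
var_disc = 0.2     # model-discrepancy variance, invented

# evaluate the implausibility over a dense grid of candidate inputs
x_grid = np.linspace(0.0, 3.0, 500).reshape(-1, 1)
mean, sd = emulator.predict(x_grid, return_std=True)
implausibility = np.abs(z - mean) / np.sqrt(sd ** 2 + var_obs + var_disc)

not_ruled_out = x_grid[implausibility < 3.0]
print(f"{len(not_ruled_out)} of {len(x_grid)} candidate inputs remain not ruled out")
```

Iterating this ruling-out step over waves of simulator runs, in many input dimensions and against many historical observations simultaneously, is what makes emulator-based history matching so much faster than running the reservoir simulator directly across the whole input space.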