The petrochemical industry is eager to develop advanced fuels that improve fuel efficiency, for both economic and environmental reasons. Statistics plays a crucial role in this costly process. Innovative Bayesian methodology developed by Gilmour was applied at Shell Global Solutions to data from fuel experiments, solving a recurring statistical problem. The usefulness of this approach to the wider petrochemical industry has been recognized by the industry-based Coordinating European Council (CEC) for the Development of Performance Tests for Fuels, Lubricants and other Fluids, which has included Gilmour's method in its statistics manual as an alternative to the procedures in the ISO 5725 standard.
The Computational Optimization Group (COG) in the Department of Computing produced new models, algorithms, and approximations that support confident decision-making under uncertainty, when computational alternatives are scarce or unavailable. The impact of this research is exemplified by the following:
A generalized additive model (GAM) describes the extent to which a single output variable of a complex system in a noisy environment can be represented as a sum of smooth functions of several input variables.
Bath research has substantially improved the estimation and formulation of GAMs, and hence the statistical infrastructure available to practitioners.
This improved statistical infrastructure has resulted in improved data analysis by practitioners in fields such as natural resource management, energy load prediction, environmental impact assessment, climate policy, epidemiology, finance and economics. In REF impact terms, such changes in practice lead ultimately to direct economic and societal benefits, health benefits and policy changes. Below, these impacts are illustrated via two specific examples: (1) use of the methods by the energy company EDF for electricity load forecasting and (2) their use in environmental management. The statistical methods are implemented in R via the software package mgcv, largely written at Bath. As a `recommended' R package, mgcv has also contributed to the global growth of R, which currently has an estimated 1.2M business users worldwide [A].
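To make the additive structure concrete, the following is a minimal sketch, not the penalized-spline machinery of mgcv itself: it fits a two-term additive model by ordinary least squares on simple polynomial bases. The simulated data, variable names and basis choice are all invented for the illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated noisy system: the output is an additive function of two inputs.
n = 500
x1 = rng.uniform(0.0, 1.0, n)
x2 = rng.uniform(0.0, 1.0, n)
y = np.sin(2 * np.pi * x1) + (x2 - 0.5) ** 2 + rng.normal(0.0, 0.1, n)

def smooth_basis(x, degree=6):
    # Polynomial basis for one smooth term; mgcv instead uses penalized
    # regression splines with automatic smoothness selection.
    return np.vander(x, degree + 1, increasing=True)[:, 1:]

# Additive structure: intercept plus one smooth basis per input variable.
X = np.column_stack([np.ones(n), smooth_basis(x1), smooth_basis(x2)])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
fitted = X @ beta

# Proportion of output variance explained by the additive fit.
r2 = 1.0 - np.sum((y - fitted) ** 2) / np.sum((y - np.mean(y)) ** 2)
```

Because the true output really is a sum of smooth functions of the two inputs, the additive fit recovers most of the output variance despite the noise.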
Techniques developed at The University of Nottingham (UoN) have enabled organisations to deal with uncertainty in complex industrial and policy problems that rely on the elicitation of expert opinion and knowledge. The statistical toolkit produced for use in complex decision-making processes has been deployed in a wide range of applications. It has been particularly useful in asset management planning in organisations such as the London Underground, government approaches to evidence-based policy, and the Met Office UK Climate Projection tool (UKCP09), which is used by hundreds of organisations across the UK such as environment agencies, city and county councils, water companies and tourist boards.
Our research has been applied directly by Aviva plc. to develop improved products in the general insurance market (e.g. household and car) and in the more specialised area of enhanced pension annuities. As a result, Aviva has become more competitive in these markets and customers are enjoying better value for money. In the case of enhanced annuities, the benefits are in the form of higher pension income for those accurately identified as facing shortened life expectancies. Aviva is the largest insurance company in the UK and the sixth largest in the world.
The WinBUGS software (and now OpenBUGS software), developed initially at Cambridge from 1989 to 1996 and then further at Imperial from 1996 to 2007, has made practical MCMC Bayesian methods readily available to applied statisticians and data analysts. The software has been instrumental in facilitating routine Bayesian analysis of a vast range of complex statistical problems covering a wide spectrum of application areas, and over 20 years after its inception, it remains the leading software tool for applied Bayesian analysis among both academic and non-academic communities internationally. WinBUGS had over 30,000 registered users as of 2009 (the software is now open-source and users are no longer required to register), and a Google search on the term `WinBUGS' returns over 205,000 hits (over 42,000 of which are since 2008). Applications are as diverse as astrostatistics, solar radiation modelling, fish stock assessments, credit risk assessment, production of disease maps and atlases, drug development and healthcare provider profiling.
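The kind of computation that BUGS automates for arbitrary user-specified models can be illustrated, for a single unknown parameter, by a hand-written random-walk Metropolis sampler. This is a toy sketch with invented data, not BUGS itself, which compiles a full model graph and chooses samplers automatically.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy data: unknown mean mu, known standard deviation 1.
data = rng.normal(2.0, 1.0, size=50)

def log_posterior(mu):
    log_prior = -0.5 * (mu / 10.0) ** 2        # prior: mu ~ N(0, 10^2)
    log_lik = -0.5 * np.sum((data - mu) ** 2)  # likelihood: data ~ N(mu, 1)
    return log_prior + log_lik

# Random-walk Metropolis: propose a local move, accept with the
# Metropolis probability, and record the current state either way.
mu, samples = 0.0, []
for _ in range(5000):
    proposal = mu + rng.normal(0.0, 0.5)
    if np.log(rng.uniform()) < log_posterior(proposal) - log_posterior(mu):
        mu = proposal
    samples.append(mu)

posterior_mean = float(np.mean(samples[1000:]))  # discard burn-in
```

With a weak prior, the posterior mean recovered by the chain sits very close to the sample mean of the data; BUGS extends exactly this idea to models with hundreds of interdependent unknowns.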
Methods of emulation, model calibration and uncertainty analysis developed by Professor Tony O'Hagan and his team at The University of Nottingham (UoN) have formed the basis of Pratt & Whitney's Design for Variation (DFV) initiative, which was established in 2008. The global aerospace manufacturer describes the initiative as a `paradigm shift' that aims to account for all sources of uncertainty and variation across its entire design process.
Pratt & Whitney considers its implementation of the methods to provide a competitive advantage, and published savings from adopting the DFV approach for a fleet of military aircraft are estimated at approximately US$1 billion.
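The core idea of emulation is to replace an expensive simulator with a cheap statistical surrogate trained on a handful of simulator runs. The following is a minimal one-dimensional sketch of a Gaussian-process emulator posterior mean, with an invented stand-in "simulator" and invented settings; it is not the DFV implementation.

```python
import numpy as np

# Stand-in for an expensive simulator (e.g. an engineering design code).
def simulator(x):
    return np.sin(3.0 * x) + 0.5 * x

# Run the simulator at a small number of design points only.
X_design = np.linspace(0.0, 2.0, 8)
y_design = simulator(X_design)

def sq_exp_kernel(a, b, length_scale=0.5):
    # Squared-exponential covariance, a standard choice for emulation.
    return np.exp(-0.5 * ((a[:, None] - b[None, :]) / length_scale) ** 2)

# Fit: solve once against the design-point covariance (jitter for stability).
K = sq_exp_kernel(X_design, X_design) + 1e-8 * np.eye(len(X_design))
alpha = np.linalg.solve(K, y_design)

def emulate(x_new):
    # Gaussian-process posterior mean: a fast surrogate for the simulator.
    return sq_exp_kernel(x_new, X_design) @ alpha

x_test = np.array([0.7, 1.3])
prediction = emulate(x_test)
```

Eight simulator runs suffice here for the surrogate to track the true response closely between the design points; in practice the emulator's predictive variance is also propagated, which is what lets DFV account for uncertainty rather than merely approximate the simulator.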
Reversible Jump Markov chain Monte Carlo, introduced by Peter Green [1] in 1995, was the first generic technique for conducting the computations necessary for joint Bayesian inference about models and their parameters, and it remains by far the most widely used, 18 years after its introduction. The paper has been (by September 2013) cited over 3800 times in the academic literature, according to Google Scholar, the vast majority of the citing articles being outside statistics and mathematics. This case study, however, focusses on substantive applications outside academic research altogether, in the geophysical sciences, ecology and the environment, agriculture, medicine, social science, commerce and engineering.
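The distinctive feature of reversible jump MCMC is that the chain moves between models of different dimension as well as within them. The following toy sketch, with invented data and the simplest possible trans-dimensional move, chooses between a fixed-mean and a free-mean normal model; it is only an illustration of the idea in Green's framework, not his general construction.

```python
import numpy as np

rng = np.random.default_rng(2)

# Data generated with a nonzero mean, so the chain should favour model 2.
y = rng.normal(1.0, 1.0, size=50)

def log_lik(theta):
    return -0.5 * np.sum((y - theta) ** 2)

def log_post_2(theta):
    # Posterior (up to a constant) for theta within model 2.
    return log_lik(theta) - 0.5 * theta ** 2

# Model 1: y ~ N(0, 1).  Model 2: y ~ N(theta, 1), prior theta ~ N(0, 1).
# The trans-dimensional proposal draws theta from its prior, so prior and
# proposal densities cancel and the Jacobian of the move is 1 -- the
# simplest reversible jump.  Birth and death are attempted with equal
# probability, so the move probabilities cancel too.
model, theta = 1, 0.0
visits = {1: 0, 2: 0}
for _ in range(20000):
    if model == 1:
        if rng.uniform() < 0.5:  # attempt a "birth" move into model 2
            proposal = rng.normal(0.0, 1.0)
            if np.log(rng.uniform()) < log_lik(proposal) - log_lik(0.0):
                model, theta = 2, proposal
    else:
        if rng.uniform() < 0.5:  # attempt a "death" move back to model 1
            if np.log(rng.uniform()) < log_lik(0.0) - log_lik(theta):
                model, theta = 1, 0.0
        else:  # ordinary within-model update of theta
            proposal = theta + rng.normal(0.0, 0.3)
            if np.log(rng.uniform()) < log_post_2(proposal) - log_post_2(theta):
                theta = proposal
    visits[model] += 1

# Fraction of time spent in model 2 estimates its posterior probability.
prob_model_2 = visits[2] / sum(visits.values())
```

Because the data strongly contradict a zero mean, the chain spends almost all of its time in the two-parameter model, and the visit frequencies estimate the posterior model probabilities directly, which is exactly what joint inference about models and parameters means.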
The Thermal Oxide Reprocessing Plant (THORP) at Sellafield in Cumbria has 20% of the world's current annual nuclear reprocessing capacity. Statistical methods developed in Newcastle during commissioning of THORP are integral to the nuclear material accountancy systems that are used in all stages of reprocessing. Since 2008 a Newcastle volumetric calibration system has been the only means of determining input into the plant. Regulatory approval of the system has ensured that THORP has been able to operate throughout the REF period, that customer costing has been accurate, and that the plant has complied with international standards for the close control of nuclear materials.
Designs for complex structures like cars, aeroplanes and modern buildings suffer from unpredictable vibrations that lead to anything from irritating noises to dangerous structural failures. Predicting the distribution of vibrational energy in large coupled systems is an important and challenging task of major interest to industry. Until recently there was no reliable method to predict vibrations at the important mid-to-high frequency ranges.
There is a need to obtain accurate predictions of vibrations at the design stage. However, previous techniques, developed in the context of quantum chaos, are too cumbersome to be used in a fast-moving commercial design setting. Bandtlow has used his expertise to develop a novel method that computes a very close approximation to these predictions in a reasonable time. Bandtlow's method of constructing efficient mathematical models of vibration spectra has informed inuTech's latest product and led to enhanced performance of automobiles and aircraft.