The WinBUGS software (and now OpenBUGS), developed initially at Cambridge from 1989 to 1996 and then further at Imperial from 1996 to 2007, has made practical MCMC Bayesian methods readily available to applied statisticians and data analysts. The software has been instrumental in facilitating routine Bayesian analysis of a vast range of complex statistical problems across a wide spectrum of application areas, and more than 20 years after its inception it remains the leading software tool for applied Bayesian analysis among both academic and non-academic communities internationally. WinBUGS had over 30,000 registered users as of 2009 (the software is now open-source and users are no longer required to register), and a Google search on the term 'WinBUGS' returns over 205,000 hits (over 42,000 of them since 2008), with applications as diverse as astrostatistics, solar radiation modelling, fish stock assessments, credit risk assessment, the production of disease maps and atlases, drug development and healthcare provider profiling.
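The scale of that adoption reflects how much hand-coding the software removes. As a purely illustrative sketch (generic Python, not WinBUGS code and not its sampling scheme), the block below shows the kind of MCMC computation, here a random-walk Metropolis sampler for a normal mean, that BUGS-family software constructs automatically from a declarative model description; all data and settings are invented.

```python
# Illustrative only (not WinBUGS itself): a random-walk Metropolis sampler
# for the posterior of a normal mean with known noise variance and a normal
# prior. BUGS-family software automates this kind of sampling from a model
# specification; the data and tuning values here are invented.
import numpy as np

rng = np.random.default_rng(0)
y = rng.normal(loc=1.5, scale=1.0, size=20)   # simulated data
prior_mean, prior_sd, sigma = 0.0, 10.0, 1.0  # assumed prior and known noise sd

def log_post(mu):
    log_prior = -0.5 * ((mu - prior_mean) / prior_sd) ** 2
    log_lik = -0.5 * np.sum((y - mu) ** 2) / sigma ** 2
    return log_prior + log_lik

mu, draws = 0.0, []
for _ in range(5000):
    prop = mu + rng.normal(scale=0.5)          # symmetric random-walk proposal
    if np.log(rng.uniform()) < log_post(prop) - log_post(mu):
        mu = prop                              # accept the proposal
    draws.append(mu)

print("posterior mean ~", np.mean(draws[1000:]))  # discard burn-in
```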
In response to the deficiencies in bank risk management revealed by the 2008 financial crisis, one of the mandated requirements under the Basel III regulatory framework is for banks to backtest the internal models they use to price their assets and to calculate how much capital they require should a counterparty default. Qiwei Yao worked with the Quantitative Analyst - Exposure team at Barclays Bank, which is responsible for constructing the Barclays Counterparty Credit Risk (CCR) backtesting methodology. The team used several statistical methods from Yao's research to construct the newly developed backtesting methodology, which is now in operation at Barclays Bank. This brings CCR assessment and management at Barclays into line with the Basel III regulatory capital framework.
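The Barclays methodology itself is not detailed in this summary. As a generic, hypothetical illustration of what backtesting a risk model involves, the sketch below counts how often realised values exceed a model's predicted 95% quantile and checks the exceedance rate against its nominal 5% level with a binomial coverage test; the data, horizon and quantile level are all invented and do not describe Barclays' procedure.

```python
# Generic illustration of backtesting a risk model's predicted quantiles
# (toy data; not the Barclays CCR methodology).
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
realised = rng.normal(size=250)                      # realised daily values (toy data)
predicted_q95 = np.full(250, stats.norm.ppf(0.95))   # model's 95% quantile forecasts

exceedances = int(np.sum(realised > predicted_q95))
# Under a correctly calibrated model, exceedances ~ Binomial(n=250, p=0.05).
p_value = stats.binomtest(exceedances, n=250, p=0.05).pvalue
print(f"{exceedances} exceedances in 250 days, coverage test p-value = {p_value:.3f}")
```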
The School of Mathematics at Cardiff University has developed important statistical and mathematical models for forecasting consumer buying behaviour. Enhancements to classical models, inspired by extensive study of their statistical properties, have allowed us to exploit their vast potential to benefit the sales and marketing strategies of manufacturing and retail organisations. The research has been endorsed and applied by Nielsen, the leading global market research organisation, which provides services to clients in 100 countries. Nielsen has used the models to increase profits and retain its globally leading corporate position. This has led to a US$30 million investment and has benefited major consumer goods manufacturers such as Pepsi, Kraft, Unilever, Nestlé and Procter & Gamble. The impact claimed is therefore financial. Impact is also measurable in terms of public engagement, since the work has been disseminated at a wide range of national and international corporate events and conferences. Beneficiaries include Tesco, Sainsbury's, GlaxoSmithKline and Mindshare WW.
Weather affects all of our lives and we all take a close interest in it, with every news report finishing with a weather forecast watched by millions. Accurate weather forecasting is essential for the transport, agricultural and energy industries and for the emergency and defence services. The Met Office plays a vital role by making 5-day forecasts, using advanced computer algorithms that combine numerical weather prediction (NWP) with carefully measured data (a process known as data assimilation). However, a major limitation on the accuracy of these forecasts is the sub-optimal use of these data. Adaptive methods, developed in a partnership between Bath and the Met Office, have been employed to make better use of the data, thus improving the Met Office operational data assimilation system. This has led to a significant improvement in forecast accuracy as measured by the 'UK Index' [A], with great societal and economic impact. These forecasts, in particular of surface temperatures, are pivotal for the OpenRoad forecasting system used by local authorities to plan road clearing and gritting when snow or ice is predicted [B].
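As a schematic illustration of the data-assimilation step described above (and not of the Met Office's operational system or of the Bath adaptive methods), the sketch below applies the standard analysis update, in which a background forecast is corrected towards an observation with weights determined by the background- and observation-error covariances; all numbers are invented.

```python
# Schematic data-assimilation (analysis) update: blend a background forecast
# x_b with an observation y according to error covariances B and R.
# Purely illustrative numbers; not the Met Office operational system.
import numpy as np

x_b = np.array([280.0, 281.5])          # background forecast (e.g. temperatures, K)
B = np.array([[1.0, 0.5], [0.5, 1.0]])  # background-error covariance
H = np.array([[1.0, 0.0]])              # observation operator (first component observed)
y = np.array([279.0])                   # observation
R = np.array([[0.25]])                  # observation-error covariance

# Kalman-type gain and analysis state: x_a = x_b + K (y - H x_b)
K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)
x_a = x_b + (K @ (y - H @ x_b)).ravel()
print("analysis state:", x_a)
```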
Since 1995, Professor Munjiza's research at QMUL has led to the development of a series of algorithms that predict the movement of, and interactions between, objects. These algorithms have been commercialised by a range of international engineering and software companies, including Orica, the world's leading blasting systems provider (via their MBM software package), and the software modelling company Dassault Systèmes (via their Abaqus software). Through these commercialisation routes, Munjiza's work has generated significant economic impact that is global in nature. For example, his predictive algorithms have enabled safer, more productive blast mining for Orica's clients: in one mine alone, software based on Munjiza's modelling approach has delivered a 10% increase in productivity, a 7% reduction in costs and an annual saving of $2.8 million. His work has also been used in Dassault Systèmes' Abaqus modelling software, the world's leading generic simulation software, which is used to solve a wide variety of industrial problems across the defence, automotive, construction, aerospace and chemicals sectors, with associated economic impact.
This study demonstrates how Bayes linear methodologies developed at Durham University have impacted on industrial practice. Two examples are given. The approach has been applied by London Underground Ltd. to the management of bridges, stations and other civil engineering assets, enabling a whole-life strategic approach to maintenance and renewal to reduce costs and increase safety. The approach has won a major award for innovation in engineering and technology. The methodology has also been applied by Unilever and Fera to improve methods of assessing product safety and in particular the risk of chemical ingredients in products causing allergic skin reactions.
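For readers unfamiliar with the term, a Bayes linear analysis updates prior expectations and variances directly from observed quantities using only means, variances and covariances. The sketch below shows the standard adjusted expectation and adjusted variance with purely illustrative numbers; it is not drawn from the London Underground or Unilever applications.

```python
# Minimal sketch of a Bayes linear adjustment (illustrative numbers only):
#   adjusted expectation E_D(B)  = E(B) + Cov(B,D) Var(D)^(-1) (D - E(D))
#   adjusted variance    Var_D(B) = Var(B) - Cov(B,D) Var(D)^(-1) Cov(D,B)
E_B, Var_B = 10.0, 4.0        # prior mean and variance for the quantity of interest B
E_D, Var_D = 8.0, 2.0         # prior mean and variance for the observable D
Cov_BD = 1.5                  # prior covariance between B and D
d = 9.2                       # observed value of D

adj_mean = E_B + Cov_BD / Var_D * (d - E_D)
adj_var = Var_B - Cov_BD ** 2 / Var_D
print(f"adjusted expectation = {adj_mean:.2f}, adjusted variance = {adj_var:.2f}")
```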
Since its launch in 2009, the mobile phone package price comparison tool Billmonitor has identified £35 million worth of savings available to the 110,000 users whose bills have been analysed. It was the first price comparison tool to be accredited by Ofcom and has been widely praised in the media. Building on techniques they had developed for applications in finance and genetics, University of Oxford researchers Chris Holmes and Nicolai Meinshausen devised the statistical algorithms underpinning the tool, which uses simulation-based inference and careful statistical modelling to analyse users' phone bill data. It searches over 2.4 million available packages to identify the best mobile phone deal for each user's particular pattern of usage. Widely quoted in the press, reports in 2011 and 2012 from the Billmonitor team estimated that approximately three quarters of mobile phone customers are on the wrong tariff, with an overspend of around 40%.
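Billmonitor's algorithms are proprietary and are not reproduced here; the sketch below only illustrates the basic idea of simulating plausible future usage from a user's history, costing each candidate tariff including overage charges, and recommending the cheapest on average. The tariffs, prices and usage figures are entirely hypothetical.

```python
# Illustration of the basic idea only (hypothetical tariffs and usage, not
# Billmonitor's actual models): simulate plausible monthly usage from a
# user's history, cost each tariff including overage, pick the cheapest.
import numpy as np

rng = np.random.default_rng(2)
history_minutes = [420, 510, 380, 600, 450, 530]   # user's past monthly call minutes

# Hypothetical tariffs: (monthly fee in GBP, included minutes, price per extra minute)
tariffs = {"A": (15.0, 300, 0.30), "B": (22.0, 600, 0.25), "C": (35.0, 2000, 0.20)}

mu, sd = np.mean(history_minutes), np.std(history_minutes, ddof=1)
simulated = rng.normal(mu, sd, size=10_000).clip(min=0)   # simulated future months

def expected_cost(fee, included, per_extra):
    overage = np.clip(simulated - included, 0, None) * per_extra
    return fee + overage.mean()

costs = {name: expected_cost(*t) for name, t in tariffs.items()}
best = min(costs, key=costs.get)
print({k: round(v, 2) for k, v in costs.items()}, "-> best:", best)
```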
The Computational Optimization Group (COG) in the Department of Computing has produced new models, algorithms and approximations to support confident decision-making under uncertainty, in settings where computational alternatives are scarce or unavailable. The impact of this research is exemplified by the following:
This impact case study is based on a Knowledge Transfer Partnership (KTP) between the School of Mathematics, Statistics and Actuarial Science, University of Kent and KROHNE Ltd, a world-leading manufacturer of industrial measuring instruments. These precision instruments (typically flow meters and density meters) need to be calibrated accurately before use, and this is an expensive and time-consuming process.
The purpose of the KTP was to use Bayesian methodology developed by Kent statisticians to establish a novel calibration procedure that improves on the existing procedure by incorporating historical records from the calibration of previous instruments of the same type. This substantially reduces the number of test runs needed to calibrate a new instrument and will increase capacity by up to 50%.
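To see why historical records reduce the number of test runs required, consider a simple conjugate normal model in which past calibrations of instruments of the same type supply an informative prior for a new instrument's calibration factor. The sketch below is purely schematic, with invented numbers, and is not the KROHNE procedure.

```python
# Schematic illustration (not the KROHNE procedure): historical calibrations of
# similar instruments supply an informative prior for a new instrument's
# calibration factor, so fewer test runs reach a given posterior precision.
import numpy as np

historical = [1.012, 0.998, 1.005, 1.009, 1.001]   # hypothetical past calibration factors
prior_mean = np.mean(historical)
prior_var = np.var(historical, ddof=1)

run_var = 0.01 ** 2                                 # assumed variance of a single test run
new_runs = [1.006, 1.004]                           # only a few runs on the new instrument

# Normal-normal conjugate update: precisions add, means are precision-weighted.
post_prec = 1.0 / prior_var + len(new_runs) / run_var
post_mean = (prior_mean / prior_var + np.sum(new_runs) / run_var) / post_prec
print(f"posterior mean = {post_mean:.4f}, posterior sd = {post_prec ** -0.5:.4f}")

# The informative prior is worth roughly run_var / prior_var extra test runs,
# which is how the number of runs on the new instrument can be reduced.
```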
The impact of the KTP, which was graded as 'Outstanding', has been to change the knowledge and capability of the Company so that it can improve the performance of its manufacturing process by implementing this novel calibration method. This has been achieved by adapting the underpinning Kent research to the specific context of the calibration problem, by running many calibrations to demonstrate the effectiveness of the method in practice, and by supporting the implementation of the new calibration method within the Company's core software.
Moreover, the project has changed the Company's thinking on fundamental science, particularly industrial mathematics. The value of historical data and the usefulness of Bayesian methods are now widely appreciated, and training for staff in Bayesian statistics is being introduced. Thus the project has not only changed the Company's protocols, it has also changed its practice.
Since 2008, statistical research at the University of Bristol has significantly influenced policies, practices and tools aimed at evaluating and promoting the quality of institutional and student learning in the education sector in the UK and internationally. These developments have also spread beyond the education sector and influence the inferential methods employed across government and other sectors. The underpinning research develops methodologies, and a much-used suite of associated software packages, that allow effective inference from complicated data structures which are not well modelled by traditional statistical techniques that assume homogeneity across observational units. The ability to analyse such complicated data (for example, pupil performance measures considered alongside school, classroom, context and community factors) has resulted in a significant transformation of government and institutional policies and practices in the UK, and in recommendations in Organisation for Economic Co-operation and Development (OECD) policy documents. These techniques for transforming complex data into useful evidence are widely used across the UK civil service, with consequent policy shifts in areas such as higher education admissions and the REF2014 equality and diversity criteria.
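As a minimal illustration of the kind of multilevel (random-effects) model referred to above, the sketch below fits a two-level random-intercept model (pupils nested within schools) to simulated data using generic tools rather than the Bristol software suite; the variable names and effect sizes are invented.

```python
# Minimal illustration of a two-level (pupils within schools) random-intercept
# model, fitted with generic tools and simulated data rather than the Bristol
# software suite itself.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n_schools, pupils_per_school = 30, 25
school = np.repeat(np.arange(n_schools), pupils_per_school)
school_effect = rng.normal(0, 2.0, n_schools)[school]   # between-school variation
ses = rng.normal(0, 1, school.size)                     # pupil-level covariate
score = 50 + 3 * ses + school_effect + rng.normal(0, 5, school.size)

data = pd.DataFrame({"score": score, "ses": ses, "school": school})

# The random intercept for school captures the heterogeneity that a single-level
# regression (assuming homogeneous, independent pupils) would ignore.
model = smf.mixedlm("score ~ ses", data, groups=data["school"]).fit()
print(model.summary())
```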