Through a close collaboration with Ford Motor Company, simulation modelling software developed at the University of Southampton has streamlined the design of the car giant's engine production lines, increasing efficiency and delivering significant economic benefits in three key areas. Greater productivity across Ford Europe's assembly operations has generated substantial direct cost savings [exact figure removed] since 2010. Automatic analysis of machine data has produced a 20-fold reduction in development time, saving a large sum [exact figure removed] per year, and has reduced the opportunities for human error that could disrupt production lines, each of which costs a large sum [exact amount removed] to program.
Small area estimation (SAE) uses Bayesian modelling of survey and administrative data to estimate survey responses at a much finer level than is possible from the survey alone. Until recently, academic publications have mostly developed SAE methodology using small-scale examples, yet only predictions based on realistically sized samples have the potential to influence governance. Our contribution fills that niche by delivering such SAEs on a national scale through the use of a scaling method. This impact case study concerns the use of these small area predictions to develop disease-level predictions for some 8,000 GPs in England, and so to produce a funding formula for use in primary care that has informed the allocation of billions of pounds of NHS money. The value of the model has been recognised in NHS guidelines, and the methodology has begun to have impact in other areas, including the BIS `Skills for Life' survey.
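To illustrate the core idea of SAE — borrowing strength across areas so that noisy direct survey estimates are shrunk towards a model-based prediction — here is a minimal empirical-Bayes sketch in the style of an area-level (Fay-Herriot-type) model. This is purely illustrative: the case study's methodology is a fully Bayesian model fitted at national scale, and all numbers below are invented.

```python
def shrinkage_estimates(direct, var_direct, synthetic, sigma2_v):
    """Combine each area's direct survey estimate with a model-based
    synthetic estimate; areas with noisier surveys borrow more strength
    from the model. sigma2_v is the between-area variance (assumed known
    here for simplicity)."""
    out = []
    for y, d, x in zip(direct, var_direct, synthetic):
        gamma = sigma2_v / (sigma2_v + d)      # shrinkage weight in [0, 1]
        out.append(gamma * y + (1 - gamma) * x)
    return out

# Toy data: direct prevalence estimates for four small areas, their
# sampling variances, and synthetic estimates built from administrative
# covariates (e.g. age profile, deprivation).
direct     = [0.12, 0.30, 0.08, 0.20]
var_direct = [0.002, 0.020, 0.001, 0.010]      # small samples => large variance
synthetic  = [0.10, 0.15, 0.09, 0.18]
sigma2_v   = 0.004

sae = shrinkage_estimates(direct, var_direct, synthetic, sigma2_v)
```

Each estimate lands between the direct and synthetic values, and the noisiest area (the second) is pulled furthest towards the model.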
Wavelets and multiscale methods were introduced in the mid-1980s and rapidly became popular in scientific academic communities, particularly the mathematical sciences. Wavelets are important because they permit more realistic modelling of many real-world phenomena than previous techniques, as well as being fast and efficient. Bristol's research into wavelets started in 1993, has flourished and continues today. Multiscale methods are increasingly employed outside academia. Examples are given here of post-2008 impact in central banking, marketing, finance, R&D in manufacturing industry and commercial software, all originating from research at Bristol. Much of the impact has been generated from the original research via software. This software includes freeware, distributed via international online repositories, and major commercial software, such as Matlab (a preeminent numerical computing environment and programming language with over one million users worldwide).
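To give a flavour of why wavelets model real-world signals so efficiently, here is one level of the Haar transform, the simplest member of the wavelet family. This is a generic textbook construction, not Bristol's research or software: a piecewise-constant signal produces mostly zero detail coefficients, which is the sparsity that makes wavelet methods fast and effective for signals with jumps.

```python
from math import sqrt

def haar_step(x):
    """One level of the Haar wavelet transform for an even-length signal:
    pairwise averages give the smooth (coarse-scale) coefficients and
    pairwise differences give the detail coefficients. The 1/sqrt(2)
    scaling makes the transform orthonormal, so energy is preserved."""
    smooth = [(a + b) / sqrt(2) for a, b in zip(x[::2], x[1::2])]
    detail = [(a - b) / sqrt(2) for a, b in zip(x[::2], x[1::2])]
    return smooth, detail

# A piecewise-constant signal: flat segments with two jumps.
signal = [4.0, 4.0, 4.0, 4.0, 8.0, 8.0, 2.0, 2.0]
smooth, detail = haar_step(signal)
# Every pair lies within a flat segment, so all detail coefficients are zero:
# the transform represents the signal sparsely.
```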
With global demand for energy ever increasing, environmental impact has become a major priority for the oil industry. A collaboration between researchers at the University of Glasgow and Shell Global Solutions has developed GWSDAT (GroundWater Spatiotemporal Data Analysis Tool). This easy-to-use interactive software tool allows users to process and analyse groundwater pollution monitoring data efficiently, enabling Shell to detect and evaluate the effect of a leak or spill and to respond quickly. Shell estimates that the savings gained by use of the monitoring tool exceed $10m over the last three years. GWSDAT is currently being used by around 200 consultants across many countries (including the UK, US, Australia and South Africa) with potentially significant impacts on the environment worldwide.
Research conducted in UCL's Department of Statistical Science has led to the development of a state-of-the-art software package for generating synthetic weather sequences, which has been widely adopted, both in the UK and abroad. The synthetic sequences are used by engineers and policymakers when assessing the effectiveness of potential mitigation and management strategies for weather-related hazards such as floods. In the UK, the software package is used for engineering design; for example, to inform the design of flood defences. In Australia it is being used to inform climate change adaptation strategies. Another significant impact is that UCL's analysis of rainfall trends in southwest Western Australia directly supported the decision of the state's Department of Water to approve the expansion of a seawater desalination plant at a cost of around AUS$450 million. The capacity of the plant was doubled to 100 billion litres per year in January 2013 and it now produces nearly one third of Perth's water supply.
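As a rough illustration of what a synthetic weather generator does, here is a minimal Richardson-type sketch: a two-state Markov chain drives wet/dry occurrence, and rainfall amounts on wet days are drawn from a gamma distribution. This is a generic textbook construction, not UCL's actual model, and all parameter values are invented.

```python
import random

def simulate_rainfall(n_days, p_wet_given_dry, p_wet_given_wet,
                      shape, scale, seed=0):
    """Generate a synthetic daily rainfall series (mm). Occurrence follows
    a first-order two-state Markov chain (wet days tend to cluster);
    wet-day amounts are gamma-distributed."""
    rng = random.Random(seed)
    wet, series = False, []
    for _ in range(n_days):
        p = p_wet_given_wet if wet else p_wet_given_dry
        wet = rng.random() < p
        series.append(rng.gammavariate(shape, scale) if wet else 0.0)
    return series

# One synthetic year; engineers would typically generate thousands of
# such years to estimate the frequency of extreme events.
rain = simulate_rainfall(365, p_wet_given_dry=0.2, p_wet_given_wet=0.6,
                         shape=0.8, scale=6.0)
```

Because the sequences are synthetic, arbitrarily long records can be produced for design calculations (e.g. flood-defence return periods) that observed records are too short to support.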
Researchers in Cambridge have developed a data standard for storing and exchanging data between different programs in the field of macromolecular NMR spectroscopy. The standard has been used as the foundation for the development of an open source software suite for NMR data analysis, leading to improved research tools which have been widely adopted by both industrial and academic research groups, who benefit from faster drug development times and lower development costs. The CCPN data standard is an integral part of major European collaborative efforts for NMR software integration, and is being used by the major public databases for protein structures and NMR data, namely Protein Data Bank in Europe (PDBe) and BioMagResBank.
Reversible Jump Markov chain Monte Carlo, introduced by Peter Green [1] in 1995, was the first generic technique for conducting the computations necessary for joint Bayesian inference about models and their parameters, and it remains by far the most widely used, 18 years after its introduction. The paper has been (by September 2013) cited over 3800 times in the academic literature, according to Google Scholar, the vast majority of the citing articles being outside statistics and mathematics. This case study, however, focusses on substantive applications outside academic research altogether, in the geophysical sciences, ecology and the environment, agriculture, medicine, social science, commerce and engineering.
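To illustrate what reversible jump MCMC does — sampling across models of different dimension — here is a toy sampler for choosing between M0: y ~ N(0,1) (no parameters) and M1: y ~ N(theta,1) with theta ~ N(0, tau^2) (one parameter). This minimal example is invented for illustration and is not drawn from Green (1995) itself; proposing the new parameter from its prior makes the acceptance ratio simplify to a likelihood ratio, with a unit Jacobian.

```python
import math
import random

def rjmcmc(y, tau=2.0, iters=20000, seed=1):
    """Toy reversible jump sampler; returns the estimated posterior
    probability of M1 under equal prior model probabilities."""
    rng = random.Random(seed)

    def loglik(theta):  # Gaussian log-likelihood (constants cancel in ratios)
        return sum(-0.5 * (v - theta) ** 2 for v in y)

    in_m1, theta, m1_count = False, 0.0, 0
    for _ in range(iters):
        if not in_m1:
            # Birth move: propose theta from its prior, so prior and
            # proposal densities cancel; accept on the likelihood ratio.
            prop = rng.gauss(0.0, tau)
            if math.log(rng.random()) < loglik(prop) - loglik(0.0):
                in_m1, theta = True, prop
        else:
            # Death move: the dimension-dropping reverse jump.
            if math.log(rng.random()) < loglik(0.0) - loglik(theta):
                in_m1 = False
        if in_m1:
            # Within-model random-walk Metropolis update of theta,
            # including the N(0, tau^2) prior in the acceptance ratio.
            prop = theta + rng.gauss(0.0, 0.3)
            logr = (loglik(prop) - loglik(theta)
                    + (theta ** 2 - prop ** 2) / (2 * tau ** 2))
            if math.log(rng.random()) < logr:
                theta = prop
        m1_count += in_m1
    return m1_count / iters

# Data clearly centred away from zero, so the sampler should strongly
# favour M1.
data = [1.8, 2.1, 2.4, 1.9, 2.2, 2.0]
p_m1 = rjmcmc(data)
```

Real applications involve far larger model spaces (e.g. mixtures with an unknown number of components), but the jump-and-accept mechanism is the same.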
The WinBUGS software (and now OpenBUGS software), developed initially at Cambridge from 1989-1996 and then further at Imperial from 1996-2007, has made practical MCMC Bayesian methods readily available to applied statisticians and data analysts. The software has been instrumental in facilitating routine Bayesian analysis of a vast range of complex statistical problems covering a wide spectrum of application areas, and over 20 years after its inception, it remains the leading software tool for applied Bayesian analysis among both academic and non-academic communities internationally. WinBUGS had over 30,000 registered users as of 2009 (the software is now open-source and users are no longer required to register) and a Google search on the term `WinBUGS' returns over 205,000 hits (over 42,000 of which are since 2008) with applications as diverse as astrostatistics, solar radiation modelling, fish stock assessments, credit risk assessment, production of disease maps and atlases, drug development and healthcare provider profiling.
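For readers unfamiliar with the mechanics BUGS automates, here is a hand-rolled Gibbs sampler for a normal model with unknown mean and precision under vague conjugate priors — the kind of model BUGS handles routinely. This sketch only illustrates the sampling scheme; in practice a WinBUGS user writes a short model file and the software derives and runs the sampler itself.

```python
import random

def gibbs_normal(y, iters=5000, burn=1000, seed=42):
    """Gibbs sampler for y_i ~ N(mu, 1/tau) with conjugate priors
    mu ~ N(m0, 1/t0) and tau ~ Gamma(a, b); alternately draws each
    parameter from its full conditional distribution."""
    rng = random.Random(seed)
    n, ybar = len(y), sum(y) / len(y)
    m0, t0, a, b = 0.0, 1e-3, 0.01, 0.01      # vague conjugate priors
    mu, tau, draws = ybar, 1.0, []
    for i in range(iters):
        # mu | tau, y : normal, precision-weighted combination of prior and data
        prec = t0 + n * tau
        mean = (t0 * m0 + n * tau * ybar) / prec
        mu = rng.gauss(mean, prec ** -0.5)
        # tau | mu, y : Gamma(shape, rate); gammavariate takes scale = 1/rate
        rate = b + 0.5 * sum((v - mu) ** 2 for v in y)
        tau = rng.gammavariate(a + n / 2, 1.0 / rate)
        if i >= burn:
            draws.append((mu, tau))
    return draws

# Toy measurements; the posterior mean of mu should sit near the sample mean.
y = [9.2, 10.1, 9.8, 10.4, 9.9, 10.2, 9.6, 10.0]
draws = gibbs_normal(y)
post_mean_mu = sum(m for m, _ in draws) / len(draws)
```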
Visual analytics is a powerful method for understanding large and complex datasets that makes information accessible to non-statistically trained users. The Non-linearity and Complexity Research Group (NCRG) developed several fundamental algorithms and brought them to users through interactive software tools, e.g. the Netlab pattern analysis toolbox in 2002 (more than 40,000 downloads) and the Data Visualisation and Modelling System (DVMS) in 2012.
Industrial products. These software tools are used by industrial partners (Pfizer, Dstl) in their business activities. The algorithms have been integrated into a commercial tool (p:IGI) used in geochemical analysis for oil and gas exploration with a 60% share of the worldwide market.
Improving business performance. As an enabling technology, visual analytics has played an important role in the data analysis that has led to the development of new products, such as the Body Volume Index, and the enhancement of existing products (Wheelright: automated vehicle tyre pressure measurement).
Impact on practitioners. The software is used to educate and train skilled people internationally at more than six institutions and is also used by finance professionals.
R is a free and open-source programming language and software environment for expressing and implementing statistical algorithms and graphics. It has become the lingua franca for developing and implementing new statistical methodologies — not just in statistics, but in applications across the whole spectrum of industry, from marketing and pharmaceuticals to finance. It is used by companies for research, analysis and production, and its power in analysing and visualising data helps organisations from charities to government. About one half of the core statistical modelling and graphics engine included in R builds on research carried out in Oxford.