Research carried out from 2003 by Currie (Maxwell Institute) and his PhD students Djeundje, Kirkby and Richards (also Longevitas), together with international collaborators Eilers and Durban, created new, flexible smoothing and forecasting methods. These methods are now widely used by insurance and pension providers to forecast mortality when setting pricing and reserving strategy for pensions. The methods were incorporated by the SME Longevitas into its forecasting package, the Projections Toolkit, launched in 2009. This generated £400K of turnover for Longevitas in licensing and consultancy fees, with further impact on the pricing and reserving strategies of Longevitas's customers. Since 2010 the methods have been adopted by the Office for National Statistics (ONS) to make the forecasts that underpin public policy in pensions, social care and health, and by the Continuous Mortality Investigation (CMI) to model mortality and provide forecasts to the pensions and insurance industries. As a result, the research has changed practice in these advisory agencies and in the insurance industry.
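The smoothing-and-forecasting idea can be illustrated with the one-dimensional difference-penalty (Whittaker-type) smoother associated with Eilers, the simple cousin of the two-dimensional P-spline methods developed in this research. This is a minimal numpy sketch, not the Projections Toolkit itself, and the mortality data and parameter values are invented. Giving future years zero weight lets the second-order penalty extrapolate the smoothed trend linearly, which is the basic mechanism behind penalty-based forecasting.

```python
import numpy as np

def whittaker_forecast(y, w, lam=100.0, d=2):
    """Difference-penalty (Whittaker-type) smoother: minimise
    sum(w * (y - z)**2) + lam * sum(diff(z, d)**2).
    Points with weight 0 are filled in by the penalty alone, so a
    second-order penalty extrapolates the trend linearly."""
    n = len(y)
    D = np.diff(np.eye(n), n=d, axis=0)          # d-th order difference matrix
    W = np.diag(w)
    return np.linalg.solve(W + lam * D.T @ D, W @ y)

# toy data: log mortality declining roughly linearly over 1990-2020
years = np.arange(1990, 2021)
rng = np.random.default_rng(1)
log_m = -3.0 - 0.02 * (years - 1990) + 0.01 * rng.standard_normal(len(years))

# append 10 future years with weight 0: the penalty produces the forecast
y = np.concatenate([log_m, np.zeros(10)])
w = np.concatenate([np.ones(len(years)), np.zeros(10)])
fit = whittaker_forecast(y, w, lam=1000.0)
```

The forecast continues the smoothed downward trend beyond the last observed year; in practice the published methods work with two-dimensional (age-by-year) surfaces rather than a single series.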
Advanced technologies for data visualisation and data mining, developed in the Unit in collaboration with national and international teams, are widely applied in the development of medical services. In particular, a system for canine lymphoma diagnosis and monitoring developed with [text removed for publication] has now been successfully tested using clinical data from several veterinary clinics. The risk maps produced by our technology provide early diagnosis of lymphoma several weeks before clinical symptoms develop. [text removed for publication] has estimated that the treatment test, named [text removed for publication], developed with the Unit adds [text removed for publication] to the value of their business. The Institut Curie (Paris) applies this data mapping technique, and the software developed jointly with Leicester, in clinical projects.
Small area estimation (SAE) describes the use of Bayesian modelling of survey and administrative data to provide estimates of survey responses at a much finer level than is possible from the survey alone. Until recently, academic publications mostly targeted the development of SAE methodology using small-scale examples. Only predictions based on realistically sized samples have the potential to influence governance, and our contribution fills this niche by delivering SAEs on a national scale through the use of a scaling method. The impact case study concerns the use of these small area predictions to develop disease-level predictions for some 8,000 GPs in England, and so to produce a funding formula for use in primary care that has informed the allocation of billions of pounds of NHS money. The value of the model has been recognised in NHS guidelines. The methodology has begun to have impact in other areas, including the BIS `Skills for Life' survey.
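As a small illustration of the shrinkage idea behind model-based SAE, the sketch below forms a Fay-Herriot-style composite of a noisy direct survey estimate and a regression ("synthetic") estimate for each area. All data and variance values are invented, and this is not the NHS funding formula; it only shows how model and survey information are blended per area.

```python
import numpy as np

# Fay-Herriot-style composite (invented data): shrink each area's direct
# survey estimate towards a regression ("synthetic") prediction.
rng = np.random.default_rng(0)
n_areas = 8
x = rng.uniform(0, 1, n_areas)                     # area covariate, e.g. deprivation
true_rate = 0.1 + 0.2 * x
direct = true_rate + rng.normal(0, 0.03, n_areas)  # noisy direct estimates
var_direct = np.full(n_areas, 0.03 ** 2)           # known sampling variances

# synthetic estimate: linear regression of the direct estimates on x
X = np.column_stack([np.ones(n_areas), x])
beta, *_ = np.linalg.lstsq(X, direct, rcond=None)
synthetic = X @ beta

sigma2_v = 0.01 ** 2                               # assumed between-area variance
gamma = sigma2_v / (sigma2_v + var_direct)         # shrinkage weight per area
composite = gamma * direct + (1 - gamma) * synthetic
```

Each composite estimate lies between the direct and synthetic values; areas with noisier surveys are pulled harder towards the model.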
There is growing evidence that official population statistics based on the decennial UK Census are inaccurate at the local authority level, the fundamental administrative unit of the UK. The use of locally-available administrative data sets for counting populations can result in more timely and geographically more flexible data which are more cost-effective to produce than the survey-based Census. Professor Mayhew of City University London has spent the last 13 years conducting research on administrative data and their application to counting populations at local level. This work has focused particularly on linking population estimates to specific applications in health and social care, education and crime. Professor Mayhew developed a methodology that is now used as an alternative to the decennial UK Census by a large number of local councils and health care providers. They have thereby gained access to more accurate, detailed and relevant data which have helped local government officials and communities make better policy decisions and save money. The success of this work has helped to shape thinking on statistics in England, Scotland and Northern Ireland and has contributed to the debate over whether the decennial UK Census should be discontinued.
This case study concerns the development and subsequent uptake of the Feature Selective Validation (FSV) method for data comparisons. The method has been adopted as the core of IEEE Standard 1597.1, a `first of its kind' standard on the validation of computational electromagnetics, and is seeing increasingly wide adoption in industrial practice wherever comparison of data is needed, indicating the reach and significance of this work. The technique was developed by, and under the guidance of, Dr Alistair Duffy, who has remained the world-leading researcher in the field. The first paper on the subject was published in 1997, with key papers published in 2006.
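The flavour of FSV can be conveyed by a greatly simplified sketch: split each data set into low- and high-frequency parts, form an amplitude difference measure (ADM) from the former and a feature difference measure (FDM) from the latter, and combine them into a global difference measure (GDM). The normalisation below is illustrative only and does not follow the IEEE 1597.1 definitions.

```python
import numpy as np

def fsv_simplified(a, b, cutoff=0.1):
    """Toy FSV-style comparison: low-frequency differences feed an
    amplitude measure (ADM), high-frequency differences a feature
    measure (FDM); GDM combines them point by point."""
    def split(x):
        X = np.fft.rfft(x)
        f = np.fft.rfftfreq(len(x))
        lo = np.fft.irfft(np.where(f <= cutoff, X, 0), n=len(x))
        hi = np.fft.irfft(np.where(f > cutoff, X, 0), n=len(x))
        return lo, hi
    lo_a, hi_a = split(a)
    lo_b, hi_b = split(b)
    s_lo = np.mean(np.abs(lo_a)) + np.mean(np.abs(lo_b)) + 1e-12
    s_hi = np.mean(np.abs(hi_a)) + np.mean(np.abs(hi_b)) + 1e-12
    adm = np.abs(lo_a - lo_b) / s_lo
    fdm = np.abs(hi_a - hi_b) / s_hi
    gdm = np.sqrt(adm ** 2 + fdm ** 2)
    return adm.mean(), fdm.mean(), gdm.mean()

t = np.arange(256) / 256
sig = np.sin(2 * np.pi * 3 * t)
same = fsv_simplified(sig, sig)                          # identical data
diff = fsv_simplified(sig, sig + 0.3 * np.sin(2 * np.pi * 40 * t))
```

Identical data sets score zero on all three measures, while a high-frequency discrepancy shows up in the FDM but barely in the ADM, which is the separation of "trend" and "feature" disagreement that FSV formalises.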
This case study demonstrates the benefits achieved when the mathematical and computational aspects of a computational fluid dynamics (CFD) problem are brought together in real-world aerodynamic applications. While earlier insight into the solution reconstruction problem was based purely on empirical intuition, research in the School of Mathematics at the University of Birmingham by Dr Natalia Petrovskaya has developed the systematic understanding needed, and the importance of accurate reconstruction on unstructured grids is now fully recognised by CFD researchers at the Boeing Company. Boeing has confirmed that the research has led to substantial improvements in its products as well as gains in engineering productivity. For instance, wing body fairing and winglet optimisation for the Boeing 787 was carried out by means of CFD alone. Implementing CFD in the design of its new aircraft allowed Boeing to reduce wind-tunnel testing time for the 787 by 30% compared with the testing carried out for the Boeing 777. Efficient use of CFD in the design of new aircraft has helped the Boeing Company to further strengthen its core operations, improve its execution and competitiveness, and extend its international advantage.
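The solution reconstruction problem mentioned above can be illustrated by the standard least-squares gradient reconstruction used on unstructured grids: estimating the gradient of a field at a point from scattered neighbour values. The geometry and test field below are invented for illustration; for a linear field this reconstruction is exact, and the research concerns how accuracy behaves on realistic irregular stencils.

```python
import numpy as np

# Least-squares gradient reconstruction on an unstructured stencil
# (invented geometry): recover the gradient of a field at a point
# from scattered neighbour values. Exact for a linear field.
centre = np.array([0.0, 0.0])
neighbours = np.array([[1.0, 0.2], [-0.3, 0.9], [0.1, -1.1], [-0.8, -0.4]])

f = lambda p: 2.0 * p[0] - 3.0 * p[1] + 1.0     # linear test field
du = np.array([f(p) - f(centre) for p in neighbours])
dx = neighbours - centre
grad, *_ = np.linalg.lstsq(dx, du, rcond=None)  # least-squares gradient
```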
Targeted Projection Pursuit (TPP), developed at Northumbria University, is a novel method for interactive exploration of high-dimensional data sets without loss of information. TPP outperforms existing dimension-reduction methods because it finds the projections that best approximate a target view embodying prior knowledge about the data. "Valley Care" provides a Telecare service to over 5,000 customers as part of Northumbria Healthcare NHS Foundation Trust, delivering a core service for vulnerable and elderly people (an estimated 129,000 calls per annum) that allows them to live independently and remain in their own homes for longer. The service also informs a wider UK ageing community as part of the NHS Foundation Trust.
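Stripped to its core, targeting a projection can be sketched as an unconstrained least-squares problem: find the linear projection of the data that best reproduces a desired two-dimensional target view. The real TPP method adds constraints and interactive refinement; this numpy sketch, on invented data, shows only the basic fitting step.

```python
import numpy as np

# Simplified targeted-projection step (invented data): find the linear
# projection P minimising ||X P - T||_F for a desired 2-D target view T.
rng = np.random.default_rng(42)
X = rng.normal(size=(100, 6))                 # 100 samples, 6 variables
true_P = np.zeros((6, 2))
true_P[0, 0] = true_P[1, 1] = 1.0
T = X @ true_P                                # target: first two variables

P, *_ = np.linalg.lstsq(X, T, rcond=None)     # best-fitting projection
view = X @ P                                  # recovered 2-D view
```

Because the target here lies exactly in the data's column space, the fitted projection reproduces it; with a hand-drawn or approximate target the method returns the closest achievable view.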
Applying our research enabled the managers of Valley Care to establish the volume, type and frequency of calls, to identify users at high risk, and to advise the equipment manufacturers on how to update the database software. This enabled Valley Care managers and staff to analyse the information quickly and so plan the work of call operators and social care workers efficiently. Our study also provided knowledge about usage patterns of the technology and, valuably, identified clients at high risk of falls. This is the first time that mathematical and statistical analysis of data sets of this type has been carried out in the UK and Europe.
As a result of applying the TPP method to its Call Centre multivariate data, Valley Care has been able to transform the quality and efficiency of its service, while operating within the same budget.
Through a close collaboration with Ford Motor Company, simulation modelling software developed at the University of Southampton has streamlined the design of the car giant's engine production lines, increasing efficiency and delivering significant economic benefits in three key areas. Greater productivity across Ford Europe's assembly operations has generated significant direct cost savings [exact figure removed] since 2010. Automatic analysis of machine data has resulted in both a 20-fold reduction in development time, saving a large sum [exact figure removed] per year, and fewer opportunities for human error that could disrupt the performance of production lines costing a large sum [exact amount removed] each to program.
With global demand for energy ever increasing, environmental impact has become a major priority for the oil industry. A collaboration between researchers at the University of Glasgow and Shell Global Solutions has developed GWSDAT (GroundWater Spatiotemporal Data Analysis Tool). This easy-to-use interactive software tool allows users to process and analyse groundwater pollution monitoring data efficiently, enabling Shell to detect a leak or spill, evaluate its effect and respond quickly. Shell estimates that the savings gained from use of the monitoring tool exceed $10m over the last three years. GWSDAT is currently used by around 200 consultants across many countries (including the UK, US, Australia and South Africa), with potentially significant impacts on the environment worldwide.
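The task GWSDAT addresses is estimating a pollutant concentration surface over space and time from scattered monitoring-well records. GWSDAT itself is built on penalised-spline spatiotemporal modelling; as a toy stand-in, the sketch below estimates a solute concentration at any location and time as a Gaussian-kernel-weighted average of well measurements. All well positions, times and concentrations are invented.

```python
import numpy as np

def st_smooth(query, wells, conc, h_space=50.0, h_time=90.0):
    """Estimate concentration at query = (x, y, t) as a Gaussian-kernel
    weighted average of well measurements at positions/times `wells`."""
    d2 = ((wells[:, :2] - query[:2]) ** 2).sum(axis=1) / h_space ** 2
    dt2 = (wells[:, 2] - query[2]) ** 2 / h_time ** 2
    w = np.exp(-0.5 * (d2 + dt2))               # space-time kernel weights
    return (w * conc).sum() / w.sum()

# invented monitoring data: 40 wells on a 200 m square, one year of samples
rng = np.random.default_rng(7)
wells = np.column_stack([rng.uniform(0, 200, 40),
                         rng.uniform(0, 200, 40),
                         rng.uniform(0, 365, 40)])
conc = 5.0 + 0.01 * wells[:, 2]                 # concentration rising over the year

early = st_smooth(np.array([100.0, 100.0, 30.0]), wells, conc)
late = st_smooth(np.array([100.0, 100.0, 330.0]), wells, conc)
```

Evaluating such a smoother on a grid of locations and times yields the kind of evolving plume map that lets an analyst see whether concentrations at a site are rising or falling.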
Recent food crises show the importance of having effective means of food identification and analysis. Many tests have been developed to monitor food, but analysis of the resulting data is highly problematic. Mathematical techniques developed by Dr Julie Wilson at the University of York allow complex mixtures to be analysed and interpreted. They have enabled the Food and Environment Research Agency (Fera) to maximize the information available from food testing, resulting in improved food safety and authentication worldwide, and underpin the analytical testing services delivered by Fera. The techniques have been incorporated into a bespoke Matlab-based solution which is now routinely used by Fera's Chemical and Biochemical Profiling section in the specialist testing services which Fera provides across the food storage and retail, agri-environment and veterinary sectors to over 7,500 customers in over 100 countries. In addition, the techniques are used in Fera's research, supporting around £8M worth of work to develop a wide range of global applications including the determination of disease-related biomarkers, contaminant detection, food traceability and the development of drought- and disease-resistant crop varieties.
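As a simple illustration of what analysing a complex mixture involves, the sketch below recovers the proportions of known reference "spectra" in a measured mixture by least squares. The actual York/Fera methods address much harder problems (unknown components, peak shifts, realistic noise structure), and none of the data here are real.

```python
import numpy as np

# Least-squares unmixing (invented data): estimate the proportion of each
# known reference spectrum present in a measured mixture.
rng = np.random.default_rng(3)
n_points = 200
components = rng.uniform(0, 1, size=(3, n_points))   # 3 reference "spectra"
true_w = np.array([0.6, 0.3, 0.1])
mixture = true_w @ components + rng.normal(0, 0.01, n_points)

# solve min ||components.T @ w - mixture|| for the component proportions
w_hat, *_ = np.linalg.lstsq(components.T, mixture, rcond=None)
```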