Enhancing competitive advantage at Pratt & Whitney using Design for Variation

Submitting Institution

University of Nottingham

Unit of Assessment

Mathematical Sciences

Summary Impact Type

Technological

Research Subject Area(s)

Mathematical Sciences: Statistics
Information and Computing Sciences: Artificial Intelligence and Image Processing
Economics: Econometrics

Summary of the impact

Methods of emulation, model calibration and uncertainty analysis developed by Professor Tony O'Hagan and his team at The University of Nottingham (UoN) have formed the basis of Pratt & Whitney's Design for Variation (DFV) initiative, established in 2008. The global aerospace manufacturer describes the initiative as a 'paradigm shift' that aims to account for all sources of uncertainty and variation across its entire design process.

Pratt & Whitney considers its implementation of the methods to provide competitive advantage, and published savings from adopting the DFV approach for a fleet of military aircraft are estimated at approximately US$1 billion.

Underpinning research

The underpinning research is O'Hagan's work on uncertainty quantification, carried out in Nottingham between 1993 and 1998. In particular, O'Hagan (UoN, 1990-1999, Department of Mathematics and School of Mathematical Sciences) explained how to perform analysis of complex computer simulators when computational resources are limited or there is uncertainty about any aspect of the system. The research was motivated by O'Hagan's work with various organisations, including the National Radiological Protection Board [A2], which wished to draw inferences from complex computer simulators.

Computer simulators used to make predictions usually have the following characteristics: they i) rely upon unknown parameter values; ii) are imperfect representations of the physical system they are modelling; and iii) are inherently deterministic, i.e., there is no natural variability, unlike in physical systems. O'Hagan and his team (including PhD students Haylock, Kennedy and Oakley, with Kennedy later continuing as a PDRA) developed a range of techniques for analysing computer simulators which are widely used by industry and scientific researchers.

The first of these methods is an approach to calibration, namely how to estimate fixed but unknown input parameters. This work appeared in papers [A1, A2] published after O'Hagan had left UoN in January 1999, but which were first released as UoN technical reports in 1998 as a result of an EPSRC-funded project [A5]. In particular, [A1] is an important Royal Statistical Society discussion paper that has had wide-ranging impact (over 460 citations according to Scopus, in journals covering mathematics, engineering and computer science, as well as environmental, decision, earth, agricultural and biological sciences and other fields). The key idea is that if a simulator is imperfect then this imperfection must be modelled in order to learn anything meaningful about the model parameters. O'Hagan's research showing how to do this for complex simulators using Gaussian processes is the starting point for quantifying simulator discrepancy.
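
To make the idea concrete, the model in [A1] can be sketched (in simplified, illustrative notation that omits some details of the original paper) as linking each field observation to the simulator and a discrepancy term:

    z_i = \eta(x_i, \theta) + \delta(x_i) + \varepsilon_i, \qquad \varepsilon_i \sim N(0, \sigma^2),

where the simulator \eta is run at known inputs x_i and unknown calibration parameters \theta, \delta(x) represents the systematic difference between the simulator and reality, and \varepsilon_i is observation error. Both \eta and \delta are given Gaussian process priors, so that \theta is learned jointly with the discrepancy rather than by forcing an imperfect simulator to fit the data exactly.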

The second method is for undertaking uncertainty and sensitivity analysis in complex simulators [A1, A3]: the problem of how to propagate uncertainty in initial conditions and parameters through the simulator to find the corresponding uncertainty in the predictions. Imperfect knowledge of parameter values and the lack of natural variability in the simulator mean that this is often (along with model error) the main source of uncertainty in predictions, and the computational expense of many simulators makes direct Monte Carlo approaches infeasible.
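
In schematic terms (again with illustrative notation), if the uncertain inputs X have probability distribution G and the simulator is the function \eta, an uncertainty analysis seeks the distribution induced on the output Y = \eta(X), together with summaries such as

    \mathrm{E}[\eta(X)] = \int \eta(x)\,\mathrm{d}G(x) \qquad \text{and} \qquad \mathrm{Var}[\eta(X)],

while a sensitivity analysis apportions the output variance among the individual inputs. When a single run of the simulator takes hours or days, such integrals cannot be estimated by running it at thousands of sampled inputs, which is the gap that the emulator-based methods of [A1, A3] address.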

O'Hagan showed how to build and use emulators of simulators to solve the calibration and uncertainty/sensitivity analysis problems [A1, A2, A3, A4]; emulators are statistical models of the computer simulator that serve as surrogates, enabling inference that would be too computationally expensive to perform with the full simulator. He showed how to build Gaussian process emulators within a Bayesian framework [A2, A4], and demonstrated how they can be used in highly complex problems.
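
The following short sketch illustrates the general workflow. It is purely illustrative: the toy simulator, the input distribution and the use of an off-the-shelf Gaussian process library are assumptions made here for demonstration, and it does not reproduce the fully Bayesian treatment of [A2, A4] or any Pratt & Whitney implementation.

    # Illustrative sketch: fit a Gaussian process emulator to a small number of
    # runs of an expensive simulator, then use the cheap surrogate to propagate
    # input uncertainty. All functions and numbers here are invented examples.
    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF, ConstantKernel

    rng = np.random.default_rng(0)

    def expensive_simulator(x):
        # Stand-in for a costly deterministic computer code with two inputs.
        return np.sin(3.0 * x[:, 0]) + 0.5 * x[:, 1] ** 2

    # 1. A small design of simulator runs (random here; in practice a
    #    space-filling design such as a Latin hypercube would be used).
    X_design = rng.uniform(0.0, 1.0, size=(25, 2))
    y_design = expensive_simulator(X_design)

    # 2. Fit a Gaussian process emulator to the design runs.
    kernel = ConstantKernel(1.0) * RBF(length_scale=[0.2, 0.2])
    emulator = GaussianProcessRegressor(kernel=kernel, normalize_y=True)
    emulator.fit(X_design, y_design)

    # 3. Propagate input uncertainty: sample many inputs from their (assumed)
    #    distribution and evaluate the emulator instead of the simulator.
    X_uncertain = rng.normal(loc=0.5, scale=0.1, size=(100_000, 2))
    y_mean, y_std = emulator.predict(X_uncertain, return_std=True)

    print("Predicted output mean:", y_mean.mean())
    print("Output variance due to input uncertainty:", y_mean.var())
    print("Average emulator (code) uncertainty:", y_std.mean())

The key point is that the expensive simulator is run only a small number of times to train the emulator, after which uncertainty in the inputs can be propagated through the cheap surrogate by sampling.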

The statistical design paradigm used by Pratt & Whitney clearly depends, both in spirit and in its specifics, on the uncertainty quantification approach developed and advocated by O'Hagan during his time in Nottingham. Pratt & Whitney makes extensive use of emulators, using both the approach in [A1] for simulator calibration and the sensitivity analysis approach developed from [A2] and [A3]. Reinman et al. [B4] also cite other papers that rely heavily upon (and cite) the work of O'Hagan during his time in Nottingham.

References to the research

The three publications that best indicate the quality of the research are marked with an asterisk (*).

[A1]* Kennedy, M. C. and O'Hagan, A. (2001). Bayesian calibration of computer models (with discussion). Journal of the Royal Statistical Society B 63, 425-464.
DOI: 10.1111/1467-9868.00294

Funded by [A5]. Initially released as Nottingham Statistics Group Technical Report 1998-10.

[A2]* O'Hagan, A., Kennedy, M. C. and Oakley, J. E. (1999). Uncertainty analysis and other inference tools for complex computer codes (with discussion). In Bayesian Statistics 6, J. M. Bernardo et al. (eds.). Oxford University Press, 503-524.
ISBN-10: 0198504853; ISBN-13: 978-0-19-850485-6 (available on request)

This conference paper provides most of the theory that was eventually published in [A1, A3, A4].

[A3]* Oakley, J. and O'Hagan, A. (2002). Bayesian inference for the uncertainty distribution of computer model outputs. Biometrika, 89, 769-784.
DOI: 10.1093/biomet/89.4.769

Initially released as Nottingham Statistics Group Technical Report 1998-11.

[A4] Kennedy, M. C. and O'Hagan, A. (2000). Predicting the output from a complex computer code when fast approximations are available. Biometrika 87, 1-13. DOI: 10.1093/biomet/87.1.1

Funded by [A5] and first released as Nottingham Statistics Group Technical Report 1998-09.

Grants

[A5] Bayesian uncertainty analysis and calibration of complex computer models, PI O'Hagan, EPSRC grant GR/K54557/01, 1 October 1995 — 30 September 1998, £114,666.

Employed Mark Kennedy as a research associate and led to the papers [A1, A2, A3, A4]. The foundations for the grant were the PhD project of Haylock and associated work by O'Hagan.

Details of the impact

Pratt & Whitney is one of the 'big three' global aerospace manufacturers, whose engines power more than 25% of the world's mainline passenger fleet. The company reported revenue of US$14 billion in 2012 [B1]. In 2003, the technical advisory committee of United Technologies Corporation (UTC, Pratt & Whitney's parent company) issued a challenge requiring all engineering analyses to have an associated uncertainty (covering accuracy and precision) and range of applicability. In response, Pratt & Whitney undertook the Uncertainty Quantification Initiative which, in 2008 [B2], became the Design for Variation (DFV) Strategic Initiative [B3]. This initiative moved the design process within the company from a deterministic to a probabilistic design framework, and is described in a paper written by 18 Pratt & Whitney employees [B4] as a 'paradigm shift' in their engineering approach. The paper's introduction states:

"Much of the analytical structure of DFV is derived from the Kennedy and O'Hagan (2001) [A1] paper on Bayesian model calibration. Kennedy and O'Hagan developed models and methods for Bayesian model calibration which, in addition to calibrating model parameters, quantify systematic and random discrepancies between model and data."

The main impacts of DFV at Pratt & Whitney are:

  1. Increasing the time that an engine stays "on wing". This is directly related to reducing important sources of variation, so that aircraft availability or readiness can be managed and improved using DFV technologies [B4].
  2. Improved identification of cost-reduction opportunities. DFV technologies are used to highlight design and process features that have little impact on the part or system performance but are expensive to maintain [B4].

Since DFV's inception, the company has realised a range of benefits, including increased speed of design studies and optimisation, root cause investigations using engineering model emulators, and improvement of quality systems through identification of inspectable characteristics that are more highly correlated with service performance [B3, B4].

DFV has catalysed a shift in design paradigm across Pratt & Whitney, and has involved quantifying all of the uncertainties in its design and simulation process. Grant Reinman (senior statistician and DFV leader) based the company's approach upon [A1], and in 2011/12 he gave a series of conference talks (including at NASA and the Isaac Newton Institute [B5]) describing how the methods of O'Hagan have been successfully implemented at Pratt & Whitney. Al Brockett, former vice president at Pratt & Whitney, recently described how DFV has "changed from a special initiative focused on statistical training to a high-visibility strategic priority" for the company [B3]. This substantial and successful investment of time and money is the most complete demonstration of the impact of the work developed by O'Hagan at UoN.

Pratt & Whitney estimates "that its component-level DFV initiatives have yielded a 64 percent to 88 percent return on investment by reducing design iterations, improving manufacturability, increasing reliability, improving on-time deliveries, and providing other performance benefits. As Pratt & Whitney focuses increasingly on the systems level, it estimates that it will realize a 40-times return on investment by achieving systems-level reliability goals much earlier in the development cycle." [B3, see also B5]. A Business Case Study undertaken at Pratt & Whitney quantifies savings in costs for a large fleet of military aircraft at approximately US$1 billion [B6]. Applying DFV is estimated to save 18-37% of scrap and rework costs and 80% of the engineering support costs associated with turbine aerofoils that do not meet final air flow requirements [B4].

The scale of a typical DFV project is highly proprietary to Pratt & Whitney, but the initiative has involved the creation of a new group within the company (Parametric Modeling, Design Automation, Optimization and DFV) and extensive training of a large number of staff across the entire company through UTC's degree programme [B3, B6, B7]. DFV has grown into a core competency, and is applied as a 10-step process that guides all engineering activities [B3]. Pratt & Whitney has created hundreds of internal courses; over 200 engineers have taken the advanced emulation and calibration classes, and five have completed graduate degrees in statistics [B6]. Since the start of the DFV initiative in 2008 [B2], 32 different key modelling processes have been DFV-enabled [B4], ranging across the entire engine design process, from fan, compressor and turbine design to performance analysis and engine validation.

The Uncertainty Quantification methodology is considered by Pratt & Whitney to be part of its competitive advantage, as controlling variation has become one of the keys to improving performance while also improving part yield and quality [B3]. Other key competitors and collaborators (e.g. General Electric, Airbus) are beginning to use similar methods actively in their design processes [B8].

Sources to corroborate the impact

[B1] Pratt & Whitney website www.pw.utc.com/Who_We_Are (copy also filed 5 August 2013)

[B2] Pratt & Whitney's history of corporate quality initiatives (2012). A time-line of the dates of the DFV initiative. Pratt & Whitney document. (copy on file)

[B3] Al Brockett interview, ANSYS Advantage Magazine, vol VII, issue 2, pp 16-21 (2013). http://www.ansys.com/About+ANSYS/ANSYS+Advantage+Magazine (copy also on file)

[B4] Reinman, G., Ayer, T., Davan, T., Devore, M., Finley, S., Glanovsky, J., Gray, L., Hall, B., Jones, C., Learned, A., Mesaros, E., Morris, R., Pinero, S., Russo, R., Stearns, E., Teicholz, M., Teslik-Welz, W. and Yudichak, D., (2012). Design for Variation, Quality Engineering, 24:317-345. DOI:10.1080/08982112.2012.651973 (copy also on file)

Many of the other statistics papers they cite (besides the O'Hagan papers) are direct extensions of the approaches developed by O'Hagan during his time in Nottingham (such as Higdon et al. 2008, Santner et al. 2003, and Williams et al. 2006).

[B5] Reinman G., Design for Variation, invited conference presentation, Uncertainty in Computer Models, Sheffield 2012. www.mucm.ac.uk/UCM2012/Forms/Downloads/Reinman.pptx (copy also on file)

Versions of this talk were also given at NASA and the Isaac Newton Institute (amongst other places) during 2011.

[B6] Senior Statistician, Pratt & Whitney, Connecticut, USA (email on file).

[B7] http://bits.blogs.nytimes.com/2012/01/31/a-1-billion-model-employee-education-program/ (copy also filed 5 August 2013)

[B8] http://www.stirling-dynamics.com/dipart-loads-and-aeroelastics-news-and-events/uncertainty-quantification-and-management-workshop (copy also filed 5 August 2013)