Evidence and causes: impact on practice

Submitting Institution

London School of Economics & Political Science

Unit of Assessment

Philosophy

Summary Impact Type

Societal

Research Subject Area(s)

Medical and Health Sciences: Clinical Sciences, Public Health and Health Services
Psychology and Cognitive Sciences: Psychology


Summary of the impact

Nancy Cartwright has undertaken highly influential research on causation, evidence for causes, and the generalizability of evidence from one context to another. This work influenced a number of bodies looking into practical issues of evidence and policy-making in both the UK and the US, and led to her being extensively consulted by those bodies; these consultations produced direct impacts on policy. This case study concentrates on three impacts from the UK: (a) a Department for International Development (DFID) Study Group aimed at improving methods for evaluating the effectiveness of aid interventions; (b) a grouping of mental health organisations aiming to change the way NICE evaluates evidence for psychological ("talking") therapies; and (c) Professor Eileen Munro's review of child protection in the UK, which led to her highly influential Report.

Underpinning research

Research Insights and Outputs:

Within the LSE Philosophy Department there is a long-standing tradition of research into the methodology of clinical trials, the epistemic virtues and limitations of randomized controlled trials (RCTs), and the general issue of evidence in medicine. This research goes back to Peter Urbach and was continued by John Worrall. Nancy Cartwright, as well as developing her own slant on the initial issues, has importantly broadened the discussion, in particular by extending the analysis (a) beyond medicine to the social sciences, where RCT methodology has also become particularly prominent, and thence (b) to the question of how evidence (whether from RCTs or other research studies) should bear on policy [References 1-5].

There are a number of innovations in Cartwright's approach, but the central contribution of interest here concerns the issue of the generalizability of evidence for the effectiveness of some proposed treatment from the precise conditions of a clinical trial to the invariably messier situation in which we would like to apply the treatment. Her work argues that, even supposing that randomizing gives us perfect information about the effectiveness of an intervention within a trial, it by no means follows (either in medicine or in the social realm) that that intervention will work `for us' — i.e. in some non-trial situation to which we might apply it. She highlights a number of cases in which the results of trials failed (often spectacularly) to generalize.

This problem has often been discussed under the heading of `external validity'. Cartwright's original contribution is best understood against the background of her detailed and influential analysis of causation in terms of `capacities'. Cartwright shows what conditions must be assumed to hold if some experimental result is to generalize in the appropriate way; examines how robust those assumptions are; and indicates carefully the sorts of evidence we need if we are to have any grounds for supposing that those assumptions do indeed hold. She analyses what it means for a factor to have a `stable capacity to promote some outcome' and argues that we have grounds for holding that evidence is likely to generalise only if that evidence includes evidence for some stable capacity. She shows that no RCT on its own can establish (or even provide strong evidence for) the existence of stable capacities, and nor can any other single investigation in the tradition of Mill's method of difference. She also shows in detail the sorts of epistemic problems that need to be overcome if we are to claim evidence for capacities. Her conclusion is that there is no escaping reliance on background theories. This runs contrary to the underlying view of RCT advocates, who (erroneously) see the chief virtue of RCTs as ruling out all possible confounders and hence as delivering results independent of theory.

Cartwright's work, then, has important implications for policy in both medicine and social science: having grounds to think that some intervention will work in the way its proponents intend can never be a matter of just doing sufficiently many and sufficiently careful RCTs (or Mill's methods studies more generally). We must be aware of the difficulties and complexities involved in the `will it work for us?' issue and, where necessary, learn to handle uncertainty. This has helped promote a much more nuanced view of evidence in these areas.

Key Researchers: All of the work and impact cited here took place while Cartwright was a member of staff at LSE, from 1991 to 2012. Professor Cartwright joined Durham University in summer 2012.

References to the research

[1] Cartwright, N. Hunting Causes and Using Them: Approaches in Philosophy and Economics. Cambridge University Press, 2007. LSE Research Online no: 20464

[2] Cartwright, N. and Hardie, J. Evidence-Based Policy: A Practical Guide to Doing It Better. Oxford University Press, 2012. LSE Research Online no: 51407

[3] Cartwright, N. `Evidence-Based Policy: What's to be done about Relevance?' Philosophical Studies 143 (1), 127-136, 2009. DOI: 10.1007/s11098-008-9311-4

[4] Cartwright, N. `What are Randomised Controlled Trials Good For?' Philosophical Studies 147 (1), 59-70, 2010. DOI: 10.1007/s11098-009-9450-2

[5] Cartwright, N. `Will This Policy Work for You? Predicting Effectiveness Better: How Philosophy Helps' Philosophy of Science 79 (5), 973-989, 2012. LSE Research Online no: 37070

Evidence of quality: all publications are in top journals or with high-quality university presses.

Details of the impact

[a] Overseas Development Assistance

Cartwright's work brought her to the attention of a number of bodies concerned with policy (and with the right ways for evidence to bear on policy) in the US as well as in the UK. For example, a study was commissioned in 2011 by DFID (the Department for International Development) into `Broadening the Range of Designs and Methods for Impact Evaluations' [6], the interventions at issue being aid initiatives. Cartwright was extensively consulted as an advisor to the study team headed by Elliot Stern, and the Study Group published its DFID Report in 2012. Stern [7] wrote to Cartwright on publication: `I am grateful for your inputs both active (in your very helpful comments) and indirectly by informing us through your writings.' There are 14 references to Cartwright's work (often leading into extensive presentations of her views) in the Report [6], especially to her [1] and [2]; the Appendix, characterised as a `Background Paper' to the Report and entitled `Models of Causality and Causal Inference', draws almost entirely and in detail on her approach; section 3.62, entitled `The "strength" of causal influence', draws on her distinction between `clinchers' and `vouchers' (Cartwright [1]); and section 4.5 is entitled `Impact Evaluation Question 4: Will the Intervention Work Elsewhere?', which is the central question that Cartwright addresses in her work.

Donor agencies (including the World Bank, the OECD, and country donors such as DFID) have organised conferences discussing the Report's findings and have used those findings in staff training. Stern co-produced a report for the Research Program on Aquatic Agricultural Systems (part of the Consultative Group on International Agricultural Research), which explicitly cites Cartwright and draws on her analysis of causation and evidence (see [8]). Through the DFID Report, Michael Woolcock, Lead Social Development Specialist at the World Bank's Development Research Group, has been influenced by Cartwright's views; a recent working paper [9] makes much of Cartwright and Hardie's book. And Cartwright's ideas are influencing researchers at the World Bank more generally (see, e.g., [10]).

[b] Mental Health Care Provision

The New Savoy Partnership is a grouping of organisations, including the Royal College of Psychiatrists, the United Kingdom Council for Psychotherapy (UKCP), Mind and several others, aimed at improving mental health care provision in the UK. It had noted that concerns had been raised by a Health Select Committee Inquiry in 2007 about the evaluation methodology used by NICE (the National Institute for Health and Care Excellence) for `talking therapies'. The Partnership determined in 2011 to return to this issue with a view to influencing NICE procedures directly, and a working party was set up with Cartwright as a member. This led to a roundtable discussion which produced a series of questions to be raised at a Keynote Panel discussion at the Partnership's conference in September 2011; this panel was chaired by Sir Michael Rawlins, then Head of NICE, and included Cartwright (see [11]).

The views of the working party were published in a report by UKCP [12]. In line with Cartwright's views, this report suggests `that an over-reliance on RCT evidence is likely to impair and distort guideline recommendations for psychological therapies'. It points out ([12], p.5) that `what works in a particular place [may be] less good at predicting whether something would work for other people in other places', and suggests that `studies need to focus much more on how treatments work (mechanisms of change) and the factors that support or hinder them in different ... contexts' (pp. 5-6). These points are at the core of Cartwright's work. The group subsequently agreed a `consensus statement on evidence' [13], submitted to NICE as a proposed `starting point for NICE to review its approach'. Cartwright was one of nine initial signatories of this statement. This in turn informed the submission made by UKCP to the Health Select Committee Inquiry into NICE in Autumn 2012, which called, inter alia, for a `more pluralistic approach to the evidence base around effective treatments in relation to mental health'. At that 2012 Inquiry the incoming head of NICE, Professor David Haslam, committed himself to a review of the way NICE assesses the effectiveness of psychological therapies; discussions with NICE are ongoing, and changes to its Public Health Guidelines in line with UKCP's proposals are expected ([14]).

[c] The Munro Report

Professor Eileen Munro from the LSE's Social Policy Department became interested in Cartwright's work on causes and evidence, and was one of the researchers on the project `Evidence for Use', led by Cartwright and housed at the LSE's Centre for Philosophy of Natural and Social Science (CPNSS). The two did research together, leading to a joint paper in 2010 ([15]). This paper shows the extent to which Munro's views on evidence for the effectiveness of interventions were shaped by Cartwright's. It considers the case of multisystemic therapy, `an internationally adopted intervention to try to diminish antisocial behaviour in young people'. It argues, following Cartwright's general line, that evidence for the effectiveness of this therapy has incorrectly been sought in terms of RCTs and the like, whereas the issue of whether the therapy will work in practice requires a much more subtle view of the evidence, essentially looking to build up evidence for the existence of a stable capacity ([15], pp. 260-3).

At the time this joint research was published (2010), Munro was asked by the Secretary of State for Education to review the UK child protection system in the wake of `Baby P' and other cases. She produced her final Report in 2011 ([16]). Unsurprisingly, the approach to evidence that Munro took in producing that Report reflects Cartwright's influence. According to Munro ([17]), Cartwright's work `helped form ... the whole approach [taken in the Report] to the real role of research findings in [the] area and especially to the role of expertise and expert reasoning'. She further attests (ibid.):

The Report, in particular chapter 6, and even more especially sections 6.31 to 6.39 bear the direct hallmarks of Cartwright's analysis. It would have been very different without her impact that helped me clarify my thinking on the role and limitations of research findings and to have confidence to challenge the dominant version of evidence-based practice. These sections have the title `Using Evidence' and then cover a range of types of evidence. Without Cartwright's influence on my work, I would have gone along with the dominant usage where `evidence' is being equated with "empirical research findings".

The Report made 15 recommendations, all of which were accepted by the Government ([18]), thus producing major changes in the education of social workers involved in child care, the appraisal of their work and impact, and the child care system in general ([19], [20]). The Report has also had impact beyond the UK. For example, Munro was asked to give evidence to two state government reviews of child care in Australia, and a charity in Queensland is running a campaign to persuade the state government to learn from her work.

Sources to corroborate the impact

All sources listed below can also be seen at: https://apps.lse.ac.uk/impact/case-study/view/78

[6] Broadening the Range of Designs and Methods for Impact Evaluations. Department for International Development, Working Paper 38, April 2012.
https://apps.lse.ac.uk/impact/download/file/883

[7] Email from Head of the DFID Study Group, 19th May 2012. This source is confidential.

[8] Email from Head of the DFID Study Group, 28th May 2013. This source is confidential.

[9] Michael Woolcock, `Using case studies to explore the external validity of "complex" development interventions', WIDER Working Paper No. 2013/096, October 2013.
http://www.wider.unu.edu/publications/working-papers/2013/en_GB/wp2013-096/

[10] World Bank blog on development impact http://blogs.worldbank.org//impactevaluations/why-similarity-wrong-concept-external-validity Source files:
https://apps.lse.ac.uk/impact/download/file/1594

[11] Letter from Head of UKCP, 16 July 2012. This source is confidential.

[12] Liz McDonnell, `Looking at the evidence: a discussion of how NICE assesses talking therapies', UKCP Research Faculty Committee Report, 2012. https://apps.lse.ac.uk/impact/download/file/1595

[13] The New Savoy Partnership Consensus Statement
(http://www.newsavoypartnership.org/consensus-statement.htm)

[14] UKCP evidence to the Health Select Committee inquiry into NICE, Autumn 2012, and news of reaction from NICE: http://www.psychotherapy.org.uk/nice_campaign.html

[15] Cartwright, N. and Munro, E. (2010) `The limitations of randomized controlled trials in predicting effectiveness', Journal of Evaluation in Clinical Practice, 16(2), 260-266. ISSN 1356-1294
https://apps.lse.ac.uk/impact/download/file/1596

[16] Munro, E. (2011) The Munro Review of Child Protection, Final Report: A Child-centred System. London: Department for Education. https://www.gov.uk/government/publications/munro-review-of-child-protection-final-report-a-child-centred-system Source files:
https://apps.lse.ac.uk/impact/download/file/1597

[17] Letter from Professor Eileen Munro, dated 21/05/2013. This source is confidential.

[18] Confirmation of acceptance of the report's recommendations by UK government https://www.gov.uk/government/publications/a-child-centred-system-the-governments-response-to-the-munro-review-of-child-protection Source files: https://apps.lse.ac.uk/impact/download/file/1599

[19] Munro, E. (2012) Progress Report: Moving towards a child-centred system. London: DfE. https://www.gov.uk/government/uploads/system/uploads/attachment_data/file/180861/DFE-00063-2012.pdf Source files: https://apps.lse.ac.uk/impact/download/file/1600

[20] Munro, E. (2013) Working Together to Safeguard Children. London: DfE. (Implementation of Recommendations) http://media.education.gov.uk/assets/files/pdf/w/working%20together.pdf