Scientific facts and public knowledge: ensuring quality and integrity

Submitting Institution

London School of Economics & Political Science

Unit of Assessment

History

Summary Impact Type

Societal

Research Subject Area(s)

Psychology and Cognitive Sciences: Cognitive Sciences
Language, Communication and Culture: Linguistics


Summary of the impact

In February 2010, the Dutch government and parliament were rocked by a serious factual misreporting from the UN Intergovernmental Panel on Climate Change (IPCC) about the danger of flooding in the Netherlands. The Netherlands Environmental Assessment Agency, faced with the immense task of checking that there were no more errors in the report, came to LSE researchers for advice. Academics working on LSE's How Well Do Facts Travel? project helped the Agency to establish a process to ensure the integrity of climate-science facts in an efficient and effective manner. That Agency has now extended the method to ensure the integrity of the facts reported in the next generation of IPCC reports (one completed, others forthcoming).

Thus the LSE research, which investigated the histories of how, why and when facts travel with integrity, has been used to improve the quality of scientific evidence in public policy formation about one of the major challenges facing society, that of climate change.

Underpinning research

Research Insights and Outputs:

The How Well Do Facts Travel? project developed a framework for understanding the circulation of pieces of reliable knowledge — facts — that would hold good for both the humanities and the sciences. The substantive research of the project clustered around two issues in assessing the circulation of knowledge. One was clarifying `travelling well' in terms of facts maintaining sufficient `integrity' as they travelled and their `fruitfulness', as evidenced by their being used in other times, places, and contexts. The other issue was concerned with the many kinds of `good company' that were necessary to get facts to travel well — `chaperones' such as names of producing scientists, packaging such as case studies, vehicles such as models or the internet, and so forth [1].

This was a team project, in which individual scholars studied how factual knowledge was circulated around different communities through a wide variety of historical case studies ranging from the early modern period until current times. These included examples from both everyday culture and academic communities, and across a range of disciplines in the humanities, the social and the natural sciences. They looked at, for example: the circulation of facts about how to construct buildings in the early modern period [2]; the use of facts about the behaviour of rats living in crowded conditions for the re-design of college dorms and prisons in the mid-20th century [3]; and the use of statistical models to circulate urgent epidemiological facts from scientists to policy makers in modern epidemics [4]. The considerable scope of these cases of travelling facts underwrote a certain confidence in the conceptual framework, findings and methodological recipes that the LSE academics developed.

An essential part of the research involved developing these methodological recipes for `chasing facts', both back to their production and forward to their users, and thus pinning down how the integrity of facts is maintained as they travel. These fact-chasing methodologies involved innovative forms of citation searches to assess the integrity of circulating knowledge. For example, the joint notions of `listening' and `speaking' citation trees were developed to determine how facts about the Indian Green Revolution travelled in the post-war period between social scientists from different disciplines [5]. A `listening tree' looks backwards in the citation network to see whom the authors of reports were listening to; in contrast, a `speaking tree' works forwards to see to whom an author is speaking. Another fact-chasing methodology developed a means of checking the specific usage of citations to follow the integrity of facts as they were re-used by others later (in this case, facts about the order of firms exiting an industry). A third analysed how integrity was maintained by a system of labelling used by curators working in bioinformatics, so that researchers could confidently pick up and reuse the scientific work done by others at a new site on parallel projects [6].
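
To illustrate the backward/forward distinction in concrete terms, the following minimal sketch (in Python, and not part of the original project, which worked by hand with historical sources) shows how a `listening' tree can be read off a citation graph by tracing a report's references backwards, and a `speaking' tree by tracing its later citers forwards; the papers and citation links are invented placeholders.

    # Illustrative sketch only: `listening' and `speaking' citation trees as
    # backward and forward traversals of a citation graph. The papers and
    # citation links below are invented placeholders, not project data.

    # cites[paper] = the set of works that paper cites (its reference list)
    cites = {
        "policy_report": {"field_study_A", "survey_B"},
        "field_study_A": {"early_data_1960"},
        "survey_B": {"early_data_1960"},
        "later_review": {"policy_report"},
        "textbook": {"later_review", "policy_report"},
    }

    # cited_by is the reverse relation: who later cites a given work.
    cited_by = {}
    for paper, refs in cites.items():
        for ref in refs:
            cited_by.setdefault(ref, set()).add(paper)

    def tree(root, neighbours):
        # Collect every work reachable from root via neighbours,
        # recording the depth at which it is first reached.
        seen, frontier, depth = {}, {root}, 0
        while frontier:
            nxt = set()
            for node in frontier:
                seen[node] = depth
                nxt |= neighbours.get(node, set())
            frontier, depth = nxt.difference(seen), depth + 1
        return seen

    # Listening tree: whom the report's authors were listening to (backwards).
    listening = tree("policy_report", cites)
    # Speaking tree: to whom the report went on to speak (forwards).
    speaking = tree("policy_report", cited_by)

    print("listening:", listening)  # the report, its sources, their sources, ...
    print("speaking:", speaking)    # the report and its later users, by depth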

Using these examples, the team developed means and methods of following particular facts through their circulation to see who used them, for what, and how. These processes also provided a means of checking the integrity of facts as they travel, and thus generic recipes for a `quality assurance' regime for scientific facts. Such recipes could be extremely valuable for any public or policy body relying on scientific work for its credibility and legitimacy (as proved to be the case for the Dutch government and the IPCC report).

Key Researchers:

The research project ran from 2004 to 2010 with a research team consisting of three LSE faculty members (Mary Morgan, Peter Howlett and Patrick Wallis, all in post at LSE throughout this period), five post-doctoral fellows (Simona Valeriani, Jon Adams, Edmund Ramsden, Erika Mansnerus and Sabina Leonelli) and four PhD students (Aashish Velkar, Albane Forestier, Ashley Millar and Julia Mensink). Several international researchers in the history of science were regular visitors (Alison Wylie, Martina Merz, Rachel Ankeny, Marcel Boumans and Harro Maas). The joint output of the project was an edited volume containing many of the case studies along with a framework essay.

References to the research

1) How Well Do Facts Travel?, edited by Mary S. Morgan and Peter Howlett (Cambridge University Press, 2011). LSE Research Online ID: 30128

2) "In the ancient forme. On the reception and `invention' of ancient building techniques in Early Modern Times", by Simona Valeriani (2008), Hephaistos: New Approaches in Classical Archaeology and Related Fields, special issue 26, pp. 169-188. Available from LSE on request

3) "Escaping the Laboratory: The Rodent Experiments of John B. Calhoun & Their Cultural Influence", by Edmund Ramsden and Jon Adams (2009), Journal of Social History, 42:3, pp. 761-792. LSE Research Online ID: 22514

4) "The Lives of `Facts' in Mathematical Models: A Story of Population-level Disease Transmission of Haemophilus Influenzae Type B Bacteria", by Erika Mansnerus (2009), BioSocieties, 4, pp. 207-222. DOI: 10.1017/S1745855209990111

5) "Travelling in the Social Science Community: Assessing the Impact of the Indian Green Revolution Across Disciplines", by Peter Howlett (2008), Working Papers in Economic History: How Well Do Facts Travel, no. 24. LSE Research Online ID: 22513

6) "Re-Thinking Organisms: The Epistemic Impact of Databases on Model Organism Biology", by Sabina Leonelli and Rachel Ankeny (2012), Studies in History and Philosophy of Biological and Biomedical Sciences, vol. 43, pp. 29-36. http://www.sciencedirect.com/science/article/pii/S1369848611000793

Evidence of Quality:

Short-listed as one of the final five projects for the Times Higher Education (THE) Research Project of the Year 2008 and judged "Highly Commended".

The key output book was published after extensive refereeing by Cambridge University Press and, exceptionally, was reviewed in Science (vol. 333, 12 August 2011, p. 824).

The research was funded by a Programme Grant for "The Nature of Evidence: How Well Do Facts Travel?" from the Leverhulme Trust; £751k awarded to Professor Mary S. Morgan and team at the Department of Economic History, London School of Economics.

Details of the impact

The travelling facts project provided the Dutch Environmental Assessment Agency (PBL) with a conceptual approach and methodology to check the integrity of scientific `facts' contained in a UN Intergovernmental Panel on Climate Change (IPCC) report [A].

The project has been cited as key literature by Dr Maarten Hajer (2011, reference item [B] below), head of the PBL, in deciding how to react to a "climategate" problem that arose in the Dutch Parliament in February 2010. A significant error in the IPCC report was the claim that 55% of the Netherlands lies below sea level. This created a serious political embarrassment for the Dutch Minister for the Environment. She asked the Agency to check the IPCC reports for any further errors and to report the findings to her and to the Dutch parliament.

The Agency's Head of Methodology, Dr Arthur Petersen (now Chief Scientist of the Agency), sought advice from Mary Morgan, lead researcher of the Facts project, on how to go about this assessment. On the basis of the Facts project's research findings, Morgan suggested that the task was not to establish the truth or falsity of all the climate-science facts per se, which would have been impossible without redoing all the science. Instead, the Agency should undertake a quality assurance check of how the integrity of those scientific facts (which had been established by scientists and had passed their peer review systems) was maintained as they then travelled into the final IPCC report. This involved not only checking whether the facts produced by scientists were accurately reported in the final report but also, where judgement had been required, whether such judgements were reasoned and reasonable. The strategy thus involved checking whether those facts were accurately reported, checking the transparency of the process by which those facts found their way into the IPCC report, and assessing the expert judgements used in reporting those facts. These three modes of checking worked together to justify the integrity of the facts as they travelled into, and through, successive revisions of the IPCC reports. Morgan's advice (given in person and through email comments on the Agency's draft report on the problem) was based on the project research in two respects: its conceptual understanding of how facts maintain their integrity as they circulate, and its developed methods for tracing scientific facts backwards and forwards.

On the basis of this advice, which Dr Petersen passed on to the head of the Agency, Dr Maarten Hajer, the Agency set up a process to check the integrity of the facts that had travelled into the final IPCC reports. Under the co-leadership of Dr Petersen, this checking involved 30 staff members over five months. They were able to clarify how the specific error about Dutch flooding arose (two facts had been conflated into one, which was then reported inaccurately), and to reassure the Dutch government and parliament that there were no errors that affected the IPCC summary conclusions on projected regional climate-change impacts. Using these methods, however, they did find several failures of integrity in the way facts travelled into the summaries, and developed a useful taxonomy of such shortcomings for future analyses (see their 2010 report, referenced as item [C] below).

Dr Petersen said: "the interaction with Professor Mary S. Morgan, informed by the results of her LSE Facts project, has been crucially important in my Agency's undertaking of its 2010 assessment of the Intergovernmental Panel on Climate Change (IPCC) Fourth Assessment Report of 2007...How well facts travel, in particular how well scientific advisers can follow — and to some extent trace — an experts group's judgment, has become a pivotal methodological concern in my Agency's system of scientific quality assurance and quality control." [A]

A greater and wider impact of the Facts project is that the Dutch Environmental Assessment Agency has undertaken to conduct similar "integrity checking" quality controls, based on the Facts project approach and recipes, for the next set of IPCC reports. As of Summer 2013, it has conducted integrity checks on drafts of two of the Working Group reports (one now in its final form). This should ensure that errors and shortcomings that might creep in while original scientific findings are transferred into final reports are avoided or removed before publication. Given the political and public concern over errors in the previous set of IPCC reports, this is likely to offer a significant improvement in accuracy, and hence credibility, for the next round of reports. The Agency's work on these IPCC reports was commended in a recent evaluation of the Agency by an international review committee.

The Facts project showed that the integrity of scientific facts used as evidence in public discourse can be ensured by checking the integrity of facts as they travel from the scientific literature into the public domain, rather than by attempting to assess directly the validity of all the scientific claims made. Since climate science is so important to society, it is equally important that public understanding and argument can rely on the integrity of the evidence in the public domain.

Sources to corroborate the impact

All Sources listed below can also be seen at https://apps.lse.ac.uk/impact/case_study/view/73

(A) The Chief Scientist at the Dutch Agency has said (in a letter dated September 20th, 2013) that the immense task of checking the integrity of the facts as they travelled into the IPCC reports was made feasible by making use of the research findings and research methods of the Facts project, as communicated to the Dutch Agency by Morgan and through her introductory chapter to the book. The Facts project was referenced in their report to Parliament and in a methodological Agency Working Paper (2013, referenced as (D) below), and the language of the project appeared in several places in their report. Source file: https://apps.lse.ac.uk/impact/download/file/697

(B) Maarten Hajer, Head of the PBL: "Inside the Science-Policy Interface", presented at the Lorentz Center Workshop Error in the Science, October 26th, 2011; the Facts project work is referenced both in his talk and in the slides (e.g. slides 21 and 24): http://www.lorentzcenter.nl/lc/web/2011/460/Abstracts/Hajer.pdf

(C) Reference in the official report: Assessing an IPCC Assessment: An analysis of statements on projected regional impacts in the 2007 report, Netherlands Environmental Assessment Agency, 2010. (See Chapter 1). http://www.pbl.nl/sites/default/files/cms/publicaties/500216002.pdf

(D) Strengers BJ, Meyer LA, Petersen AC, Hajer MA, van Vuuren DP, Janssen PHM (2013), Opening up scientific assessments for policy: the importance of transparency in expert judgements. PBL working paper 14. PBL Netherlands Environmental Assessment Agency, The Hague. http://www.pbl.nl/en/publications/opening-up-scientific-assessments-for-policy-the-importance-of-transparency-in-expert-judgements