Shaping National and International Research Assessment Policy and Practice

Submitting Institution

University of Oxford

Unit of Assessment

Education

Summary Impact Type

Cultural

Research Subject Area(s)

Education: Curriculum and Pedagogy, Specialist Studies In Education
Studies In Human Society: Policy and Administration



Summary of the impact

Research on the assessment and governance of research, carried out at Oxford since 2004, has contributed to changes in conceptions of quality and impact underpinning research assessment systems in the UK, Australia, New Zealand and Hong Kong, and substantially influenced strategic action in funding bodies, professional societies, and higher education institutions, nationally and internationally. Oancea and Furlong's work on quality in applied and practice-based research contributed to a more inclusive definition of applied research in the 2008 Research Assessment Exercise and to the Economic and Social Research Council's interpretation of the "excellence with impact" agenda for the social sciences. It has also been used as the basis for assessment criteria for postgraduate programmes, professional development, and practitioner courses in many institutions. Oancea's recent research was used in strategic documents and resources by, for example, the Social Care Institute for Excellence (SCIE), the Strategic Forum for Research in Education (SFRE), and the Universities Council for the Education of Teachers (UCET).

Underpinning research

Since 2004, a programme of empirical and philosophical research has explored current and emerging challenges in higher education governance, research policy, knowledge creation and mobilisation, and research assessment. A core component of this programme has been work on the assessment of research quality and research impact. This research enabled the development of comparative (cross-disciplinary and international) perspectives on strategies and methods for research assessment, their contexts, and their affordances and limitations [for references see section 3: R3, R4]. Two innovative frameworks for understanding and capturing the quality and impact of research developed through this work have been recognised as significant contributions to academic research on governance and assessment [R1, R2, R5].

First, a framework for understanding excellence in applied and practice-based research proposed a more holistic concept of quality than that previously used in assessment mechanisms (including the RAE). The framework consists of three "domains of excellence" for research: epistemic (e.g. methodological rigour and paradigmatic acceptability); technical (e.g. fitness for purpose, concern for impact, feasibility, efficiency, value for money); and "phronetic", or practical (capturing the organic relationship of research with professional practice). The framework arose from an ESRC-funded empirical and philosophical study [R1].

Second, empirical and theoretical work on research impact [R2, R5] generated a textured conceptualisation of impact and of the relationship between the different communities involved in impact processes, as well as a detailed assessment of the impact indicators currently used in the UK, e.g. in preparations for the REF. The realisation of impact is conceptualised as layered, from the connectedness of research with partners, through its visibility to a range of audiences, its use, application and exploitation, and to its wider benefits and societal and cultural diffusion. The findings have been translated into practical tools (impact capture methods and training resources).

The conceptual and comparative work laid the groundwork for more applied research, such as the mixed-method evaluation of the impacts of the 2008 RAE on education departments, teams and staff [R6], and the review of the implications of the recent, shifting policy and economic contexts for the future of educational research (in terms of infrastructure, capacity, income, and the relationship between research and teacher education) [R7].

The programme of research is currently led by Dr Alis Oancea (University Lecturer, employed by Oxford University since 2004) and supported by current Higher Education Innovation Fund (HEIF, 2013-15) and AHRC (2013-14) awards. Key contributions include those by Prof John Furlong (Professor of Education, since 2003) and Prof Ian Menter (Professor of Teacher Education, since 1 April 2012) on research quality and on the current state of educational research.

References to the research

[R1] Furlong, J. and Oancea, A. (2007) Assessing Quality in Applied and Practice-Based Research in Education. London: Routledge.
• edited book based on ESRC research (RES-618-25-6001). Includes the highly-cited Oancea and Furlong paper on the quality framework, submitted to RAE 2008 (first draft published by the ESRC in 2005 and described as "groundbreaking" and "highly influential" by Groundwater-Smith & Mockler, 2009). Initially published as a special issue of Research Papers in Education. Follow-up and review papers include Karran, 2009; Hammersley, 2008; Carr, 2008.
[R2] Ovseiko, P.V., Oancea, A. and Buchan, A.M. (2012) Assessing research impact in academic clinical medicine: A study using Research Excellence Framework pilot impact indicators. BMC Health Services Research, 12(478), 1-23.
• peer-reviewed, ISI-indexed gold open access journal article, awarded "highly accessed" status in its first two weeks, with over 6,000 downloads since its publication in December 2012. Ranked in the global 99th percentile across all fields for social media coverage, as measured by Altmetrics.
[R3] Oancea, A. (2005) Criticisms of educational research: Key topics and levels of analysis. British Educational Research Journal (BERJ), 31(2), 157-83.
• peer-reviewed, ISI-indexed article (among BERJ's most cited in 2009), submitted to RAE 2008.
[R4] Oancea, A. (2007) From Procrustes to Proteus: Trends and practices in the assessment of education research. International Journal for Research Methods in Education, 30(3), 243-69.
• peer-reviewed article, submitted to RAE 2008.
[R5] Oancea, A. (2013) Interpretations of research impact in seven disciplines. European Educational Research Journal, 12(2), 242-50.
• peer-reviewed article, draws on full 2010-11 research report.
[R6] Oancea, A. (2010) The BERA/UCET Review of the Impacts of RAE 2008 on Education Research in UK Higher Education Institutions. Research report. Macclesfield: UCET/BERA.
• report of empirical study funded and disseminated by BERA and UCET.
[R7] Whitty, G., Donoghue, M., Christie, D., Kirk, G., Menter, I., McNamara, O., Moss, G., Oancea, A., Rogers, C. & Thomson, P. (2012) Prospects for the Future of Educational Research. London: BERA/UCET. (Oancea was the commissioned researcher.)


Research funding:
Research on quality, impact and assessment [R1, R4-R7] was funded by the ESRC (2004-05), UCET (2009-12) and BERA (2009-12), and by HEIF investment (2010-11) and donor funding (2006-09), all at Oxford University. The work on indicators [R2] was funded by HEIF and by a National Institute for Health Research award held by Buchan (Medical Sciences, Oxford University).

Details of the impact

Contribution to re-framing national and international systems for research assessment:

Furlong and Oancea's work on the quality of applied and practice-based research was referenced and explicitly drawn upon in the working methods and criteria statement of the RAE 2008 Education sub-panel. The Working Criteria for UOA45 state: "the sub-panel adopts Furlong and Oancea's definition of applied and practice-based research (definition quoted)" [see in section 5: C1, para 19]. The Chair of the 2008 RAE sub-panel describes the use and influence of this work as follows: "After the panel all read the document, we discussed it and as a result decided to adopt many of the conceptual distinctions and recommendations to help us define our research field and our attitude to assessing research from different traditions, in particular our valuing of practice-based research. It was the only publication referenced in our document on Criteria and Ways of Working, which outlined our intended approach... We made use of the report in agreeing on assessment judgements in the areas of environment and esteem as well as in assessing outputs. Thus the publication made a significant difference to assessment judgements, including encouraging the agreement to a different set of values from that used in 2001 in order to give greater respect to applied research. These judgements in turn of course affected research funding of education departments from 2009 onwards, favouring those with a greater proportion of applied research rather more than previously. The report was also circulated by the RAE organisers to members of more than ten other panels... It was the only publication used at a meeting of these members called by the 2008 RAE team to ensure that judgements from different panels in this area were comparable and used the same conceptual and value framework." [C2]

Internationally, the research has contributed to re-framing debates about research assessment systems (e.g. in New Zealand, Australia, Hong Kong) and to stronger recognition of the value of applied and practice-based research in education and other social sciences and humanities. In New Zealand, the 2009 review of the national assessment system, the Performance-Based Research Fund (for the redesign of its 2012 round), referenced this work as a "basis for more extended debate of research assessment" [C3, p.6]. The consultation paper endorsed the argument that a shift in how research assessment is framed is required, rather than minor technical tweaks: "While some issues may be alleviated by comparatively minor changes to guidelines and assessment practices, the major issues demand the kind of robust national debate called for by Furlong and Oancea... Such a debate could involve professional and industrial bodies as well as academics and could well be initiated by the TEC (Tertiary Education Commission)" [C3, p.14].

In addition, the research has been referenced in official reports and led to policy and practice-oriented events and publications in Australia, New Zealand, Hong Kong, Canada, and Sweden. For example, the Social Sciences and Humanities Research Council of Canada repeatedly references Oancea and Furlong's revised quality framework [R1] as one of the few contributions to cutting-edge efforts "to developing the theoretical and philosophical dimensions of research assessment" [C4, p. 28]. In Europe, Oancea applied the impact and quality frameworks as an invited critical friend (2009-11) to the EC FP7 European Educational Research Quality Indicators project, which included six European countries and several industry partners. Furlong's research expertise led to his current appointments as Convenor of the Education panel for the Hong Kong 2014 RAE and as a member of the International Panel of Experts for the Social Sciences for the 2013 Latvian RAE.

Contribution to stakeholders' understanding of, and strategies for, quality and impact:

The research [R1] was influential in the ESRC; in a previous ESRC Chief Executive's words, "following its publication in 2004 Furlong and Oancea's work played a key role in ESRC's thinking for at least the rest of the decade...The framework which came directly from Furlong and Oancea's work had applications beyond education research into cognate disciplines where we desire that professional practice is informed by world class research; and longevity in that it will be relevant for many years to come. It is, in my view, a truly world class example of research impact." [C5]

Early dissemination included the ESRC's production and distribution of around 2,000 glossy print briefings on the project findings, together with suggestions for tailored use by evaluation agencies. Furlong and Oancea were invited to brief the ESRC Chief Executive personally, resulting in commissioned follow-up work in 2005. In evidence to the House of Commons Science and Technology Committee on the work of the ESRC, the Chief Executive described the "excellent" 2004 draft Furlong and Oancea report as the ESRC's way of addressing the problems arising from the fact that "research related to professional practice has been said not to be properly reflected in the research assessment exercise which looks at academia", and recommended that the RAE 2008 panels in education and in other disciplines draw upon it explicitly in their guidelines to ensure that applied research and non-standard outputs are judged in their own terms (Hansard, 24/10/2004, Q78). The framework was referenced in the 2006 ESRC demographic review of the social sciences and featured prominently in a 2005 ESRC seminar series on research quality. It was recommended for use in social work and social care by the 2007 ESRC/Social Care Institute for Excellence review of these fields, published in article form in 2008 [C6], and also referenced in the 2009 ESRC Strategic Adviser for Social Work and Social Care report. Reporting on the ESRC/SCIE review, Shaw and Norton (2008) state: "We think the framework developed by Furlong and Oancea (2005) will serve with some modification for other applied social sciences, including social work" [C6, p.967]. These interactions led to impact that accumulated over time, post-2008. More recently, the Chair of the ESRC Research Evaluation Committee was briefed personally by Oancea on findings from the 2010-11 study on impact [R5]. According to email communication (2011/12), the "hugely interesting" report was taken forward to the ESRC evaluation team.

Between 2008 and 2010, the report [R1] was a core document for the understanding of quality in the multiple-stakeholder Strategic Forum for Research in Education (SFRE), funded by the ESRC, BERA, DfE, and CfBT [C7]. Speaking about BERA and SFRE, the BERA President commented in her address that "BERA needs to be at the forefront of these debates, extending and developing the work of Furlong and Oancea on assessing quality in applied and practice-based research" (Munn, 2008). In 2009, Oancea was further commissioned by the Teaching and Learning Research Programme (TLRP) to conduct a review of quality criteria and impact, and became a member of SFRE's Planning Group [C7].

The work has been used in policy-initiated evaluations of research, such as the evaluation of the Applied Educational Research Scheme in Scotland (Scottish Government, 2008). The 2005 version of the Furlong and Oancea report was also included on the selected list of useful publications on evaluating policy recommended by Policy Hub, the website of the Government Social Research Unit (2008). Oancea's broader work on research policy and practice (since 2004) was central to shaping the direction of policy campaigns by the British Educational Research Association (BERA) and the Universities Council for the Education of Teachers (UCET) in 2004-05 and in 2012-13. The 2010 report by Oancea [R6] was the basis for BERA and UCET's responses to the REF 2014 consultations. In addition, BERA and UCET ran five targeted sessions on the findings from 2010 and a subsequent 2012 joint report: two for heads of department and three for directors of research from over 50 HEIs. Participants at these events remarked on the direct practical relevance of the findings to strategic planning for their departments in the period 2010-2014 [C8].

Use in postgraduate training and researcher development practice:

The quality framework has been used in postgraduate degree specification and training courses across the UK and beyond. For example, the programme specification for the EdD (professional doctorate) offered by Exeter University includes the Furlong and Oancea framework as one of two documents used in lieu of "doctoral level benchmarks in Education" [C9], while the staff CPD Academic Practice programme at Northumbria University (2009) recommends it as a set of criteria for participants' own work. The framework has been used as a quality standard in MSc and EdD theses by teacher researchers (e.g. Spiro, 2008) and applied in PhD theses in education and social work (e.g. Cockerill, 2012; Jarvis, 2011). Best-selling research methods textbooks in the social sciences (e.g. Cohen, Manion and Morrison, 2011; Shaw et al., 2009) and methods textbooks in education (Scott and Usher, 2011; McNiff and Whitehead, 2009) also use the Furlong and Oancea framework in their discussion of research quality. They draw strongly on it in their companion materials, including teaching materials made available online. A video for students and new researchers across the range of social sciences, drawing on the research programme, was produced and disseminated by SAGE: Oancea, A. (2010) Quality of research: How do I know if my research findings are any good? The video is available from SAGE Research Methods Online.

The work has been used by practitioners and practitioner researchers in education and other fields to develop critical reflection on the quality and use of research. Examples from education include sharing the work with the National Teacher Research Panel (2010) and with practitioner stakeholders through the Strategic Forum for Research in Education (2008-10). The Social Care Institute for Excellence used insights from this work to shape the definition of research given in their 2012 online resource for the research-based professional development of social care practitioners [C10]. RAND Europe is using the impact indicators study in presentations to impact assessment policy and practice communities (e.g. the DESCRIBE project, 2013) [C11]. Details of engagement and dissemination activities enabling these impacts are held on file.

Sources to corroborate the impact

[C1] RAE 2008 Main Panel K: Panel criteria and working methods.
http://www.rae.ac.uk/pubs/2006/01/docs/kall.pdf

[C2] Chair, UK RAE 2008 Education sub-panel. Letter on file.

[C3] PBRF (2009) Performance-Based Research Fund: Sector Reference Group Review: Evaluating applied and practice-based research. New Zealand: Tertiary Education Commission.
http://www.tec.govt.nz/Documents/Reports%20and%20other%20documents/pbrf-pa-research.pdf

[C4] SSHRC (2008) Review and conceptualization of impacts of research/creation in the fine arts. Final Report. Canada, Sept. 2008. http://www.sshrc-crsh.gc.ca/about-au_sujet/publications/RC_fine_artsFinalE.pdf

[C5] ESRC Chief Executive (2003-10) & Chair of RCUK Executive Group (2004-09). Letter on file.

[C6] Shaw, I. & Norton, M. (2008) Kinds and quality of social work research. British Journal of Social Work, 38, 953-70 (ESRC review findings and recommendations).

[C7] About SFRE. Website of Strategic Forum for Research in Education, http://www.sfre.ac.uk

[C8] Executive Director, Universities Council for the Education of Teachers.

[C9] EdD Programme Specification, University of Exeter, June 2011. Copy on file.

[C10] Social Care Institute for Excellence (2012) Research Mindedness. Professional development resource for social care practitioners. http://www.scie.org.uk/publications/researchmindedness/whyrm/whatisresearch/index.asp

[C11] Morgan, M. & Grant, J. (2013) Making the Grade: Methodologies for assessing and evidencing research impact. In: Dean et al. (eds) 7 Essays on Impact. DESCRIBE Report for JISC.