The impact of the National Student Survey
Submitting Institution: Open University
Unit of Assessment: Education
Summary Impact Type: Societal
Research Subject Area(s): Education — Specialist Studies in Education
Summary of the impact
Open University (OU) researchers were responsible for the development of
the National Student Survey (NSS). It is an influential and widely cited
source of information about the experience of students in higher
education. Around 287,000 students at more than 300 institutions responded
to the 2012 NSS. It has been incorporated into the league tables published
annually by The Times, Sunday Times, Guardian and
the online Complete University Guide. Performance in the survey
has led institutions to take actions and initiatives to improve the
student experience. The Ramsden report for the Higher Education Funding
Council for England (HEFCE) indicates that the survey has become an
important element in quality enhancement.
Underpinning research
In 2000, HEFCE agreed that the sector would publish key data to enable
prospective students to make more informed judgements on where to study.
Because of concerns about the adequacy of existing data, HEFCE
commissioned the project Collecting and using student feedback on
quality and standards of learning and teaching in HE. This was
carried out by a joint project team consisting of members of academic
staff from the OU, staff from SQW Limited and members of the NOP (National
Opinion Polls) Research Group.
Working collaboratively across the different areas of the project, the
team aimed to:
- identify good practice in obtaining student feedback
- make recommendations to institutions concerning the design and
implementation of feedback mechanisms
- make recommendations on the design and implementation of a national
survey of recent graduates, the results of which would be published to
assist future applicants to higher education.
A number of outputs resulted from this work, including a literature
review on ways of obtaining student feedback (Richardson, 2005). The
project's main finding was that it would be feasible to introduce a
uniform national survey to obtain feedback from all recent graduates about
their programmes of study (Brennan et al., 2003).
HEFCE then commissioned a pilot study to explore the implementation and
value of such a survey. This was carried out during 2003 by the OU
researchers. The results suggested that it was possible to design a short,
robust instrument that would measure different aspects of the quality of
the student experience. However, the timing of this survey was thought not
to be optimal because the results would only inform students seeking to
enter university two years later. HEFCE resolved to address this and other
issues by exploring the idea of a national survey of final year
undergraduate students. The same OU team was commissioned to undertake
another pilot study early in 2004 to investigate the feasibility of such a
survey. The results confirmed its feasibility, and HEFCE resolved to
proceed with a full NSS early in 2005 and annually thereafter (Richardson
et al., 2007).
Later work by Locke et al. (2008) at the OU, carried out in
collaboration with staff from Hobsons plc, demonstrated that league tables
which incorporated NSS data in their ranking methodologies had a major
impact on institutions' strategic planning. Ashby et al. (2011), also at
the OU, extended the earlier work by examining anomalous patterns of
responses that affected NSS ratings obtained from distance-learning
students.
Key researchers: John Brennan, Professor of Higher Education Research
(since 2011 Emeritus); William Locke, Principal Policy Analyst (2006-10);
John Richardson, Professor in Student Learning and Assessment; John
Slater, NSS Coordinator (2003-06); Ruth Williams, Principal Policy Analyst
(until ...).
References to the research
Ashby, A.N., Richardson, J.T.E. and Woodley, A. (2011) `National student
feedback surveys in distance education: an investigation at the UK Open
University', Open Learning, vol. 26, pp. 5-25 [Online].
Brennan, J., Brighton, R., Moon, N., Richardson, J., Rindl, J. and
Williams, R. (2003) Collecting and using student feedback on quality
and standards of learning and teaching in HE (Report RD08_03)
[Online], Bristol, Higher Education Funding Council for England. Available
Locke, W., Verbik, L., Richardson, J.T.E. and King, R. (2008) Counting
what is measured or measuring what counts? — League tables and their
impact on higher education institutions in England (Publication
2008/14) [Online], Bristol, Higher Education Funding Council for England.
Available at http://www.hefce.ac.uk/pubs/hefce/2008/08_14/.
Richardson, J.T.E. (2005) `Instruments for obtaining student feedback: a
review of the literature', Assessment and Evaluation in Higher
Education, vol. 30, no. 4, pp. 387-415 [Online].
Richardson, J.T.E., Slater, J.B. and Wilson, J. (2007) `The National
Student Survey: development, findings and implications', Studies in
Higher Education, vol. 32, pp. 557-80 [Online].
All journals named above employ an anonymised peer review process.
Research grants:
2002: £98,750 awarded by the Higher Education Funding Council for England
to Professor John Brennan, The Open University, for a project entitled
`Collecting and using student feedback on quality and standards of
learning and teaching in higher education'.
2003-04: £371,395 awarded by the Higher Education Funding Council for
England to Professor John Slater, The Open University, for a project
concerned with pilot studies towards the development of a National Student
Survey.
2007-08: £33,600 awarded by the Higher Education Funding Council for
England to Mr William Locke, The Open University, for a project entitled
`League tables and their impact on higher education institutions in
England'.
Details of the impact
The survey encompasses final year students in England, Wales and Northern
Ireland funded by HEFCE, the Higher Education Funding Council for Wales
and the Department for Employment and Learning in Northern Ireland. Most
Scottish universities have opted to join the NSS, as has the private
University of Buckingham. In total 287,000 students responded to the 2012
survey, an increase of 20,000 on the previous year. The survey's results
now provide key indicators of the quality of teaching, learning and the
student experience within the four major league tables of UK institutions,
published annually
by The Times, Sunday Times, Guardian and the
online Complete University Guide. Positive NSS ratings are often
highlighted on institutions' websites.
The central role of our research in underpinning the development of the
NSS has been confirmed by Heather Fry (Director of Education,
Participation and Students, HEFCE). She indicates that `The Open
University, most notably John Richardson, John Brennan and John Slater,
was heavily involved in developing the NSS. In particular the OU team
listed made a significant contribution to the development of the NSS
including timing and evaluation of the survey instrument ...' In her view,
it remains `a robust tool for enabling comparisons between courses in the
same subject and has remained largely unchanged to date', although it is
now entering a review period. She adds that the introduction of the NSS
`has had an impact in two main areas; student choice and institutional
...'.
A report for HEFCE by Ramsden et al. (2010) indicates: `The NSS forms
part of the national Quality Assurance Framework (QAF) for higher
education ... Although the NSS was originally conceived primarily as a way
of helping potential students make informed choices, the significance of
the data it collects means that it has become an important element in
quality assurance (QA) processes and in institutional quality enhancement
(QE) activities related to the student learning experience ... We found
striking the emphasis that institutional managers placed on the way the
NSS findings allowed them to identify potential problems in the student
experience, and to act on them quickly' (p. 3).
The authors noted that the similarity between the NSS and the Australian
Course Experience Questionnaire (CEQ) was due to the influence of the
original report by Brennan et al. Collecting and using student
feedback on quality and standards of learning and teaching in HE (p.
26). The 2010 study included interviews with a variety of stakeholders,
and the authors listed a variety of ways in which respondents had used NSS
results for quality enhancement purposes (p. 41).
The National Union of Students reports that the NSS has encouraged
institutions to take student opinion more seriously, and the Union has
campaigned for institutions to improve their ratings in the area of
assessment.
The Union provided the inquiry by Ramsden et al. (2010, pp. 84-88) with
case studies from 11 institutions to illustrate how student unions had
used NSS results to campaign for improvements in areas such as feedback on
assessment, personal tutoring, library facilities and student ...
Use of NSS results by the Higher Education Academy (HEA)
The HEA supports institutions in using NSS results to enhance the quality
of the student experience (http://www.heacademy.ac.uk/resources/detail/ipp/Issue5-NSS).
The HEA has sponsored investigations of issues arising from NSS results in
particular subject areas, such as art and design (Vaughan and Yorke, 2009)
and social work and social policy (Crawford et al., 2010). A recent report
for the HEA by Buckley (2012) described the pivotal role of NSS data in
institutional quality enhancement and its impact on staff and students.
New HEA-funded research (2013) by Pegg at the OU is examining cases of
cross-institutional curriculum change and has identified NSS results as
one of the key drivers for change.
Institutional case studies
NSS results have prompted institutions to implement initiatives aimed at
enhancing the student experience, especially with regard to assessment and
feedback. Published accounts include Sheffield Hallam University (Flint et
al., 2009), Oxford Brookes University (Handley and Williams, 2011), Leeds
Metropolitan University (Brown, 2011) and the University of Reading (Crook
et al., 2012). Most of these initiatives provided evidence of changes in
teachers' behaviour or changes in students' expectations and behaviour.
Others provided evidence of changes in institutional policies or strategy:
for example, the University of Exeter has linked its strategic plan to
future NSS results (http://www.exeter.ac.uk/about/vision/strategicplan/delivering/).
Ramsden et al. (2010, p. 57) recommended that a version of the NSS should
be introduced for postgraduate taught programmes. In fact, the HEA has
been running a Postgraduate Taught Experience Survey, modelled on the NSS,
since 2009. Data derived from the NSS have become one of the key sources
of information in the course comparison website, Unistats, which was
developed by HEFCE to provide information to prospective students.
In becoming such a key indicator of the student experience and in
achieving such significant visibility, the NSS has changed the behaviour
of institutions, their teachers and their students.
Sources to corroborate the impact
- Director of Education, Participation and Students, HEFCE.
- Brown, S. (2011) `Bringing about positive change in the higher
education student experience: a case study', Quality Assurance in
Education, vol. 19, no. 3, pp. 195-207 [Online].
- Buckley, A. (2012) Making It Count: Reflecting on the National
Student Survey in the Process of Enhancement [Online], York,
Higher Education Academy. Available at:
- Crawford, K., Hagyard, A. and Saunders, G. (2010) Creative
Analysis of NSS Data and Collaborative Research to Inform Good
Practice in Assessment Feedback [Online], Southampton, Higher
Education Academy Subject Centre for Social Policy and Social Work.
Available at http://www.swap.ac.uk/docs/projects/nss_report.pdf.
- Crook, A., Mauchline, A., Maw, S., Lawson, C., Drinkwater, R.,
Lundqvist, K., Orsmond, P., Gomez, S. and Park, J. (2012) `The use of
video technology for providing feedback to students: can it enhance the
feedback experience for staff and students?', Computers and
Education, vol. 58, no. 1, pp. 386-96 [Online].
- Flint, A., Oxley, A., Helm, P. and Bradley, S. (2009) `Preparing for
success: one institution's aspirational and student focused response to
the National Student Survey', Teaching in Higher Education, vol.
14, no. 6, pp. 607-18 [Online]. DOI:10.1080/13562510903315035.
- Pegg, A. (2013) "We think that's the future": curriculum reform
initiatives in higher education, York, Higher Education Academy.
- Pokorny, H. and Pickford, P. (2010) `Complexity, cues and
relationships: Student perceptions of feedback', Active Learning in
Higher Education, vol. 11, no. 1, pp. 21-30 [Online].
- Ramsden, P., Batchelor, D., Peacock, A., Temple, P. and Watson, D.
(2010) Enhancing and Developing the National Student Survey
[Online], Bristol, Higher Education Funding Council for England.
- Vaughan, D. and Yorke, M. (2009) `I can't believe it's not
better': The Paradox of NSS Scores for Art & Design [Online],
Brighton, Higher Education Academy Subject Centre for Art, Design and
Media. Available at: