Influencing Research Policy Through Policy Relevant Impact Indicators Derived from the Web

Submitting Institution

University of Wolverhampton

Unit of Assessment

Communication, Cultural and Media Studies, Library and Information Management 

Summary Impact Type

Societal

Research Subject Area(s)

Information and Computing Sciences: Artificial Intelligence and Image Processing, Information Systems
Economics: Applied Economics


Summary of the impact

The Statistical Cybermetrics Research Group (SCRG) has developed web-based indicators and methods for use in research policy and research evaluation by governmental bodies and non-governmental organisations. The research has impact by providing tools and new types of indicators for policy-relevant evaluations by policy makers and decision makers. The research itself includes (a) the direct production and implementation of new indicators and (b) theoretical research into indicator foundations and tool performance, such as that of the web search engines used for indicator construction. The research has had impact on policy making within the United Nations Development Programme by aiding evaluations of its initiatives, and within Oxfam and the BBC World Service Trust. It has had impact on policy making at the national and international levels by helping to direct funding effectively towards knowledge production. It has also had impact on public services by helping Nesta and Jisc to evaluate the success of some of their initiatives.

Underpinning research

The indicator strategy of the group is to: (a) continuously research and develop new web indicators in response to changes in the web and new web services (e.g., Mendeley) and (b) respond quickly to client requirements by tailoring methods to their specific needs. A web indicator, in the narrow sense, is a metric derived from web-based data. The research involves conceiving new metrics and developing methods to capture them, including methods for counting and reporting the number of hyperlinks pointing to a web site (in various ways) and methods to count the number of web sites or other entities mentioning each one of a given set of documents. In the broader sense, web indicators also include graphical representations of web-based phenomena, such as network diagrams of the connectivity of a set of web sites. Below are two illustrative examples.

A majority of the SCRG's research before 2008 was dedicated to developing the theoretical and methodological foundations of link analysis. Two papers are cited below to reference this research direction. As part of this, software was also developed for automatic link analysis, which was then upgraded in the current REF period for use with new types of web indicator.

A series of studies led by Kousha and Thelwall (2008-2011) developed web-based impact metrics for academic journal articles, conference papers and books, and these were subsequently used in projects for Nesta, Jisc, More 2, Lot 2 and the Belgian Government as described in section 3. These are similar to traditional citation metrics, counting how often documents are cited, except that the citations are calculated from the web rather than from traditional citation indexes, such as the Web of Science. The development of the metrics includes evaluations and the construction of filtering methods. For instance, one project demonstrated that it was possible to count citations to journal articles from books using Google Book Search, conducted an evaluation to demonstrate that the results were reasonable (correlating significantly with traditional impact indicators) and gave new and important results (evidence of the impact of humanities-oriented research that was not well reflected in traditional metrics). The automatic method included heuristics to generate queries to identify citations to articles from the article title, author names, and publishing journal title. The methods were incorporated into the Webometric Analyst software to allow them to be used quickly and efficiently on large-scale datasets.
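The query-generation heuristic described above can be sketched in a few lines. This is an illustrative reconstruction, not the actual Webometric Analyst implementation: the function name, its parameters, and the exact query forms are assumptions. The idea is simply that an article's title, first author surname, and journal title are combined into phrase searches that can then be submitted to a web search service such as Google Book Search to locate citing documents.

```python
# Hypothetical sketch of heuristic citation-query generation from article
# metadata. Names and query forms are illustrative assumptions, not the
# SCRG's actual implementation.

def citation_queries(title: str, authors: list[str], journal: str) -> list[str]:
    """Generate candidate phrase queries for locating web citations to an article."""
    queries = []
    if authors:
        # Exact title phrase plus first-author surname reduces false matches.
        surname = authors[0].split()[-1]
        queries.append(f'"{title}" "{surname}"')
    # Title phrase plus journal title catches citations that omit author names.
    queries.append(f'"{title}" "{journal}"')
    return queries

qs = citation_queries(
    "Quantitative comparisons of search engine results",
    ["Mike Thelwall"],
    "Journal of the American Society for Information Science and Technology",
)
```

In practice such queries would be filtered further (for example, discarding matches from the article's own publisher pages) before the hits are counted as citation evidence.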

As part of the 2008-2009 Rindicate Contract for the Directorate General of Research (DG Research) in the EU, Thelwall adapted his existing software SocSciBot to crawl and analyse the hyperlinks between the web sites within four specific fields, to draw network diagrams and calculate metrics from the networks. Our research developed an effective way to gather the data and an effective selection of metrics for analysing the data, drawing upon prior webometric research and the field of Social Network Analysis (SNA). We claim that this is a more comprehensive approach to analyse emerging trans-disciplinary fields (the target of the contract) than a traditional scientometric approach relying upon citation analysis and patent analysis because it includes important actors, such as scholarly societies and web portals, that do not publish patents or journal articles. This was used in the Rindicate project described in section 3, as well as in the UN, BBC WST, and Oxfam projects also described below.
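The network-metric side of this approach can be illustrated with a minimal sketch, assuming hyperlinks between web sites have already been crawled into a directed edge list. This is not SocSciBot itself; the function names and the two metrics chosen (in-degree, i.e. how many other sites link to a given site, and overall network density) are merely simple examples of the SNA-style measures such an analysis might report.

```python
# Illustrative sketch (not SocSciBot): simple SNA-style metrics computed from
# a directed hyperlink network between web sites.
from collections import Counter

def indegrees(edges: list[tuple[str, str]]) -> Counter:
    """Count incoming links per site, ignoring self-links."""
    return Counter(target for source, target in edges if source != target)

def density(edges: list[tuple[str, str]], n_sites: int) -> float:
    """Proportion of possible directed inter-site links that are present."""
    unique = {(s, t) for s, t in edges if s != t}
    possible = n_sites * (n_sites - 1)
    return len(unique) / possible if possible else 0.0

# Hypothetical crawl output: site a.org links to b.org and c.org, etc.
links = [("a.org", "b.org"), ("c.org", "b.org"), ("a.org", "c.org")]
deg = indegrees(links)   # b.org receives links from two other sites
d = density(links, 3)    # 3 of the 6 possible directed links are present
```

High in-degree identifies central actors (which, as noted above, may be scholarly societies or web portals rather than publishing institutions), while density gives a rough measure of how interconnected an emerging field's web presence is.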

The main researchers were Mike Thelwall, Kayvan Kousha and David Wilkinson, with Pardeep Sud conducting follow-on research in the same area, including altmetric research (which we expect to have a measurable impact in 2014), and with Gareth Harries (now retired) having previously conducted relevant research as part of the SCRG. Other co-authors of the section 3 publications provided background information on specific disciplines analysed, comments or help with manual classification and data gathering.

References to the research

Quality evidence: Five of the references are in the top journal of library and information science [see elsewhere in this REF submission for evidence of this], with the one exception being in another highly ranked journal, the Journal of Informetrics. Authors in bold belonged to other institutions. Klitkou, Verbeek and Vincent provided comments and disciplinary expertise on the topics investigated in paper 1, and Rezaie provided help with content analysis in two papers.

1. Thelwall, M., Klitkou, A., Verbeek, A., Stuart, D. & Vincent, C. (2010). Policy-relevant webometrics for individual scientific fields, Journal of the American Society for Information Science and Technology, 61(7), 1464-1475. [Based on Rindicate outputs]

2. Kousha, K., Thelwall, M. & Rezaie, S. (2010). Using the web for research evaluation: The Integrated Online Impact indicator, Journal of Informetrics, 4(1), 124-135.

3. Kousha, K., Thelwall, M. & Rezaie, S. (2011). Assessing the citation impact of books: The role of Google Books, Google Scholar and Scopus. Journal of the American Society for Information Science and Technology, 62(11), 2147-2164.

4. Thelwall, M. (2008). Quantitative comparisons of search engine results, Journal of the American Society for Information Science and Technology, 59(11), 1702-1710. [This article tests the robustness of search engine data for webometric purposes.]

5. Thelwall, M. (2006). Interpreting social science link analysis research: A theoretical framework. Journal of the American Society for Information Science and Technology, 57(1), 60-68.

6. Thelwall, M., & Harries, G. (2004). Do the Web sites of higher rated scholars have significantly more online impact? Journal of the American Society for Information Science and Technology, 55(2), 149-159. [This article summarises the methodological findings of the SCRG's early link analysis research, providing both indicators and methods that were developed into new indicators.]

Details of the impact

The policy-relevant impact indicators developed by the SCRG have been used to support decision-making in the European Commission Directorate General of Research (DG Research), the UK's National Endowment for Science Technology and the Arts (Nesta), the BBC World Service Trust (BBC WST), the UK's Joint Information Systems Committee (Jisc) and the United Nations Millennium Development Campaign (UNMC), amongst others. As part of the impact of the research, former PhD student Brian Cugelman also used his PhD knowledge to set up a cybermetrics consultancy organisation in Canada (alterspark.com).

In almost all cases the evidence below shows that consultancy was supplied to inform decision-making, rather than providing direct evidence of decisions being changed on the basis of the indicators. This is because such decisions are typically taken by committees in closed sessions on the basis of a variety of types of evidence.

Policy-making (international development): The United Nations commissioned two reports in 2009 into the online impact of their Millennium Campaign from LeitMotiv and the SCRG, with the SCRG employing its web-based indicators. The group also evaluated the United Nations Development Programme (UNDP) knowledge products in 2012. The first consultancy supported the decision-making process to assess the results of the UNMC, a UN Development Programme initiative started in 2002, in order to develop an improved successor. These were two specialist consultancy contributions to a vital decision-making process that not only directed $9 million per year but also had a wide influence (e.g., 173 million people participated in the Millennium Campaign "Stand Up Against Poverty" event in 2009) and in addition tackled a critical issue for humanity: poverty. Evidence: Two Millennium Campaign reports by LeitMotiv/SCRG on the UNDP website; one UNDP report. [refs 1-3]

Policy-making (international development): Oxfam commissioned an impact report from former Wolverhampton researcher Cugelman using skills developed as a PhD student in the SCRG. This evaluation supported attempts by Oxfam, a large and influential organisation, to have a positive impact on the climate change debate: a critical issue for humanity. Evidence: Quote from Oxfam. [refs 4, 5]

Policy-making (evaluating European policies for directing knowledge): The European Commission Directorate General of Research commissioned a study (Rindicate, 2008-9) of five new scientific areas to help identify promising areas for future funding. In addition, the Directorate General of Research awarded the More 2 project and the Framework Contract on Research Evaluation and Research Policy Analysis (Lot 2) to a consortium including the SCRG. Evidence: The funding of the project, following the previous similar projects by the same source (RESCAR 2006-2007, NetReAct 2005-2006), is evidence of the Directorate General of Research's belief in the validity of our approach, although specific policy impacts are not known. The impact of all of these consultancies is that the European Commission is better able to evaluate emerging scientific fields and help direct the billion-euro EU research funding budget for effective EU knowledge development. [refs 6, 7]

Public services (promoting innovation in the UK): The government-funded independent charity Nesta commissioned twice-yearly reports on the impact of its publications from March 2008 to September 2011. Nesta produces about 20 publications each year that are designed to promote innovation in the UK; these are distributed free online and in printed format. Nesta previously tracked the influence of these publications by counting media mentions, but asked the SCRG to use its indicators to estimate the online impact of its publications and of its web site. The advantage of web indicators is that they give wider evidence of influence than press coverage alone. Nesta used the results to monitor the influence of its reports and to help decide which types of reports are best to produce in the future. For example, one type of report was found consistently to have little impact and its production was stopped, saving Nesta's resources. The ultimate impact was an improved ability for Nesta to carry out its mission of promoting innovation in the UK, which can potentially affect economic prosperity, civil society and cultural life in the UK. Evidence: Nesta commissioned 8 reports, indicating that they had enough value for repeat commissioning. [ref 8]

Public services (supporting the UK academic infrastructure): Jisc requires its funded digitisation projects to self-evaluate using a toolkit developed by Oxford University (the Toolkit for the Impact of Digitised Scholarly Resources, TIDSR, developed in 2009) that includes web indicators and tools (the Webometric Analyst software) developed by the SCRG. As a result, Jisc has gained more control over the digitisation projects that it funds. Evidence: the TIDSR toolkit is still compulsory for funded projects, testifying to Jisc's belief in its value (see reference). [refs 9, 10]

Public services: The BBC World Service Trust commissioned web impact analyses in 2009 to demonstrate the online impact in Iran of its Persian news initiative, for a report into its impact. Evidence: a paper first-authored by a BBC WST manager describing the results (see references). [ref 11]

Policy-making (evaluating Belgian policies for directing knowledge): A Belgian government-commissioned assessment of its music research included a comparative web impact analysis, conducted by the SCRG, of Belgian music researchers against other leading academic music centres around the world. Idea Consult (Belgium) authored the report. Evidence: [ref 12]

Sources to corroborate the impact

These references primarily provide evidence that contract research was undertaken, although some provide specific evaluations of the contracted research.

  1. [Evidence that the UNMC evaluation in conjunction with LeitMotiv was conducted] Otero, E. & Cugelman, B. (2009) UN Millennium Campaign: External evaluation 2009. United Nations Millennium Campaign, Leitmotiv and the SCRG.
    http://erc.undp.org/evaluationadmin/downloaddocument.html?docid=2822 Feedback from Salil Shetty, Director of the United Nations Millennium Campaign, "This is not just another evaluation report, it's a must read. You can zoom in on the Executive Summary or the annex listing the bibliography or indeed any other part of the multiple documents; they are all equally pulsating. As you will find in the evaluation, this is relatively uncharted territory and we need you to read the documents and give us your ideas, advice and provocation." http://www.alterspark.com/clients/testimonials
  2. [Evidence that the UNMC evaluation in conjunction with LeitMotiv was conducted] Cugelman, B. & Otero, E. (2009) UN Millennium Campaign: United States evaluation. United Nations Millennium Campaign, Leitmotiv and the SCRG.
    http://erc.undp.org/evaluationadmin/downloaddocument.html?docid=3234
  3. [Evidence that the UNDP evaluation in conjunction with AlterSpark was conducted] Cugelman, B. Thelwall, M. & Buré, C. (2012) Cybermetric Analysis of United Nations Development Programme Knowledge Products and Platforms — Latin America and the Caribbean programme. AlterSpark & the SCRG.
  4. [Evidence that the Oxfam report exists] Oxfam Management Response to the independent Evaluation of Oxfam GB's Climate Change Campaign
    http://www.alterspark.com/uploads/Oxfam-evaluation-management-response.pdf
  5. [Evidence of the value of a consultancy to Oxfam with techniques developed by the SCRG] "Oxfam set out to `raise the bar' in the complex field of advocacy evaluation through commissioning a comprehensive, evidence-based, independent assessment of its climate change campaign. We were fortunate to find an evaluation team capable of providing this [and we were] confident in their findings" http://www.alterspark.com/uploads/Oxfam-evaluation-management-response.pdf
  6. [Evidence that the DG Research project Rindicate existed] "The use of webometrics for the analysis of knowledge flows within the European Research Area". In relation to (DG-RTD-2005-M-02-01): "Multiple Framework Service Contract for Expert Support with the Production and Analysis of R&D Policy Indicators" together with IDEA CONSULT (coordinator), NIFU STEP, and SPRU. http://ec.europa.eu/invest-in-research/pdf/download_en/spa6_final_report.pdf
  7. [Evidence that More 2 exists and included the University of Wolverhampton]
    http://www.more-2.eu/www/index.php
  8. [Evidence of the value of the NESTA reports] On-going commissioning of reports by NESTA, as reported in the funding received.
  9. [Evidence that TIDSR exists] Digitised Resources: A Usage and Impact Study - Jisc-funded project, consultants to the Oxford Internet Institute. Toolkit online at:
    http://microsites.oii.ox.ac.uk/tidsr/
  10. [Evidence of the use of TIDSR] "The TIDSR was used by projects in the JISC Impact and embedding of digitised resources programme, of which the Old Bailey was one, to conduct an analysis of their collections, identify where resources were working well and what could be done to improve them and better embed their content within teaching and research." Paola Marchionni, http://www.jisc.ac.uk/blog/impact/ and
    http://www.jisc.ac.uk/media/documents/programmes/digitisation/Impact_Synthesis%20report_FINAL.pdf
  11. [Evidence that the web indicators in the BBC WST report were useful] Godfrey, A., Enayat, M., & Thelwall, M. (2008). Generating new media and new participation in Iran: The case of Zigzag, International Association for Media and Communication Research (IAMCR), Stockholm, Sweden, July 20-25. [The first author is from the BBC WST]
  12. Idea Consult can be contacted to verify the SCRG's contribution to the music evaluation report.