Impact: learning lessons from the REF

This post was originally published on the HEFCE blog on 10 November 2015 (archived copy). Interestingly, since then I have added another four lessons to my talks about research impact, so maybe I should update this post at some point.

The case for public investment in research depends on the notion that society broadly benefits from the research and researchers that are funded.

This is why the idea of impact has become so central to research funding and policy in the UK. An obvious manifestation of this policy focus was the inclusion of the impact element in the most recent Research Excellence Framework.

Because impact was a new element of the exercise, HEFCE, working with other funders, commissioned a considerable amount of work to investigate impact in the REF. There has been an evaluation of the impact element itself, carried out by RAND Europe, looking at how universities prepared case studies of impact and at the process of panel assessment. In addition, research by the Policy Institute and Digital Humanities departments at King’s College London, together with Digital Science, has examined the case studies themselves. The case studies have also been published as a searchable database.

There is a huge amount of information and richness in all this research. But from this work I distil six key lessons about research impact and its assessment.

Lesson 1: Preparing for the impact element of the REF has given strategic insight and other benefits to universities. One of the reasons for a rounded assessment of research excellence that includes broader societal impact is to incentivise universities and researchers to increase their focus on delivering benefits from research. The impact evaluation provides evidence that this objective is being met. The inclusion of the impact element in the REF, alongside other changes – such as the requirement to consider broader impact in Research Council applications – is leading to a change in culture. For example, one of the interviewees for the evaluation said:

‘I noticed my perception of research changing slightly and my passion to make an impact with my research enhanced; this was due to constant in-depth thinking about what we (and I) do in the unit and why we do it. I can say that I became totally immersed in the topic of impact and became fascinated by the area.’

Lesson 2: The assessment of impact worked well, but there are areas for improvement. A national-scale assessment of broader impact in the REF was a world-first. According to the independent evaluation of the assessment process, it worked well. The majority of panel members felt that they were able to make fair, reliable and robust judgements. A central part of the process was the inclusion of panel members and assessors from outside academia – the so-called users of research. As one panellist commented:

‘It was a stroke of genius to get people [academics and users] together to get that consensus generated.’

This is not to say that there aren’t opportunities to improve the process further. Issues were identified around definitions, the relationship with underpinning research, and how evidence was provided and used. Some panels felt that they could have graded impact on a more granular scale. All food for thought for the future.

Lesson 3: Considerable and diverse forms of impact were submitted for assessment. The analysis of the impact case studies themselves demonstrates the huge range of benefits that research in higher education brings to society. The team at King’s College and Digital Science were able to identify 60 distinct impact topics, or types of impact. It is hard to find a facet of the world – or indeed a country in the world – that doesn’t feature somewhere in the impact case studies. Commercial benefits, health gains, improved public policy, a more sustainable environment, and cultural enrichment: all feature prominently.

Lesson 4: Impact derives from the integration of disciplinary knowledge. Analysis of the case studies also reveals that a diverse array of research disciplines contribute to delivering societal benefit. For the majority of case studies the underpinning research was drawn from at least two distinct disciplines, and in two-thirds of the cases at least two of the disciplines were substantially different from one another. This highlights the importance of research crossing disciplinary boundaries. This research not only enhances knowledge but can also help to solve societal problems, thus delivering impact.

Lesson 5: Researchers who deliver high-quality academic research also deliver high-quality impact. Examination of the REF scores allows a comparison of performance based on academic quality of a researcher’s work and on the societal benefit their work provides. There is some variation, so it is clear that academic quality and impact measure different aspects of research performance. But in general, researchers who produce work of the highest academic quality also deliver on impact. That doesn’t necessarily mean that impact only derives from research of high academic quality, but it does suggest that the two activities go hand-in-hand.

Lesson 6: The systematic collection of impact data has generated an important national asset. The wide-ranging collection of information on research impact gives the UK and its universities an important strategic advantage. We now have the potential to understand research impact at an unprecedented level. We can use that insight to improve national research performance even further.

Importantly, the impact case studies also provide evidence of the value that comes from government investment in research. In the coming months I hope the case studies will be crucial in shaping the government’s decisions about future investment in research.

