To bring together disparate work on the impact agenda to critically reflect on the controversies, consequences and challenges that are arising;
To reflect on our own role, as academics, within this;
To collectively propose an alternative approach.
It was this last point that was the focus of our recent SKAPE seminar, and of this linked blog, in which we set out an alternative approach to research impact – one which we feel is more evidence-informed and sustainable and which, while inevitably still imperfect, addresses the major drawbacks of the current approach.
What might a more evidence-informed approach to supporting, assessing and incentivising research impact involve?
Since we each brought our own experiences, preferences and convictions to the book, it was not easy for us to identify a way forward that we could all support. However, we all agreed that the following changes would better reflect what we know about both the complex relationship between academic work and real-world change and the consequences of audit regimes and performance assessment.
Reward impactful environments, rather than individual achievements: The requirement of impact case studies to demonstrate and document change inevitably narrows the scope of impact activities and outcomes. We should, instead, strengthen a focus on how universities create impactful environments; workplaces that are outward-looking, open and engaged with the world beyond academia.
Value a wider range of activities, especially around public engagement: The NCCPE is leading the way in both supporting academics to do public engagement better and encouraging the research audit process to value it fairly. There is more work to do and, we suggest, it may be desirable to go further, recognising the impacts of the university’s wider role as an ‘anchor institution’.
Protect spaces and funding for critical and discovery focused academic scholarship (without obvious impacts): There are many examples of academic scholarship that is valuable for reasons other than impact. This includes critical, theoretical and experimental work (some of which, as several interviewees pointed out, contributed to major impacts at later dates).
Reject crude and simplistic classifications of ‘excellence’ (which, for example, denigrate the local): The way research quality is assessed in REF promotes the idea that wider geographical relevance equates to higher quality. Although the most recent impact case study guidance notes that there is value in ‘having a big impact on a small group of people’, our interview data suggest academics tend to believe that those focusing on local impacts are not viewed as “impact stars”. If we want universities to be active members of their local communities, this could, and should, be changed. It should not be the proximity of external communities that is key to assessing excellence, but relatability to potential research users such as communities of policy and practice.
Weaken the link between original research and impact to encourage knowledge synthesis and collaboration: There are very good reasons to support mechanisms that allow bodies of work to achieve greater influence than single studies. Yet the approach to impact taken by both REF and the UKRI funders appears to do much more to encourage the impact of individual research projects than work to pool and synthesise knowledge for external audiences. We suggest that research funders and REF assessors should be encouraged to do more to value academic scholarship that focuses on knowledge synthesis.
Develop a conversation about the ethics of impact: The approach to research impact being taken in the UK and elsewhere appears to assume that if research is ‘excellent’ then the impacts will inevitably be positive. Yet there are plenty of examples in which excellent research has had deleterious societal impacts, so we need to develop conversations and tools that allow us to meaningfully consider the ethics of research impact.
Defend and promote academic rigour and autonomy: Researchers already exist within governments, NGOs, think tanks and private companies, and often produce rapid, responsive research. We worry that some of the impact incentives encourage academics to shift towards this kind of responsive research to such a degree that it risks blurring the role of academics with that of consultants. What exactly is unique about academic research and scholarship will vary by discipline and field but, if we want to maintain a distinction, we all need to get better at articulating and valuing our USPs.
Create spaces in which valiant failures are celebrated and learned from: The current form of impact assessment in REF, and the high financial value of impact case studies, combine to prompt institutions to focus on tried-and-tested pathways to impact. We suggest funders and universities should do more to promote innovation in engagement and knowledge exchange, encouraging contributions that are about learning from challenges and failures (as well as successes).
The above suggestions are not exhaustive, but they are intended as a starting point for discussing how we might improve the current approach to research impact. We, as academics, are involved in constructing, enacting and reviewing impact’s performance indicators, and we therefore have opportunities to reshape and improve the current approach. These are opportunities we should take.
–
Kat Smith is Professor of Public Health Policy at Strathclyde University, with a longstanding interest in the relationship between evidence, expertise, policy and practice, especially for issues relating to public health and inequalities.
Justyna Bandola-Gill is a post-doctoral researcher at METRO, where she explores the production and governance of poverty indicators, with a wider interest in how knowledge is organised, governed and mobilised across different settings.