February 2022
This month’s report summarises four articles that directly or indirectly relate to the impact of universities. The first paper compares the consequences of research evaluation systems in the UK and New Zealand (Chatterjee et al., 2020). The second study looked at the influence of performance-based research assessment on artistic disciplines in Poland (Lewandowska & Kulczycki, 2021). The third piece of research examined which types of websites were used as evidence in impact case studies (Kousha et al., 2021). The last item presents an operationalisation of research impact measures in occupational health and safety (Sørensen et al., 2022).
[1]) The paper examines the effects of research assessment (RA) in the UK and New Zealand, with a special focus on the accounting discipline. The authors draw upon 19 interviews with accounting researchers at different career stages in both countries. Their findings show that researchers adopt different strategies to deal with the pressures imposed by RA, such as treating it as a form of fetishism only relevant to the managerial class, as a simulacrum in which RA becomes the de facto reality, or as a transformative mechanism necessary for the operationalisation of research. In general, all researchers expressed a range of criticisms, and the differences between the countries were only superficial. The authors warn that the academic community may tear “apart if it is not acted upon quickly as individualization continues” (p. 25), which is especially detrimental for early career researchers, who are denied the possibility to develop their own intrinsic motivations for research.
[2]) The authors quantitatively compared the output data of artistic disciplines in Poland, covering 126,894 outputs submitted by 93 arts faculties in 2013 and 2017. They found that there was a “‘scientification’ of publication practices in the arts, by which [they] mean a tendency to adopt publication practices typical for science-oriented academic disciplines” (p. 294). In practice, this meant a preference for written journal publications and a quantitative increase in their number. They contend that this metrification puts artists in Poland in a difficult position, as their utility becomes comparable to that of the science disciplines, from which they have hitherto been institutionally separate. They also question what such a change in publication pressures does to the quality of artistic outputs, whose influence is often indirect and not directly translatable into quantitative data.
[3]) The authors semi-automatically classified the 29,830 website links in the 6,637 case studies submitted to the Research Excellence Framework in 2014. They found that business, news, and government websites were used across all academic disciplines to evidence research impact. There were disciplinary differences, with, for example, the social sciences leaning heavily on government websites and the health sciences on NHS links; nevertheless, a mix of sources appeared across all disciplines. Furthermore, the cited websites were heavily UK-centric (especially the government links), suggesting that the availability of data is a bigger factor than where the impact occurred. Finally, given the prevalence of some websites over others within specific disciplines, a convention may be establishing itself of what counts as ‘natural’ evidence for certain types of impact claims.
[4]) The authors claim to have developed the first generalised and context-independent tool for quantitatively measuring the societal impact of research. They operationalised research as applied research, understanding impact as the product of “research with the purpose to answering specific questions and addressing practice and/or policy needs” (p. 119). Yet, when showcasing the functionality of their model, they had to accept that they can only measure intermediate outcomes that serve as proxies for the impact claim. Their theoretical expectation of a separation between conceptual, instrumental, and strategic impact was not borne out in practice, as practitioner feedback intermingled all these categories. They acknowledge that their model needs further specification and should be supplemented by qualitative measures such as impact case studies.
[1] Chatterjee, B., Cordery, C. J., De Loo, I., & Letiche, H. (2020). The spectacle of research assessment systems: insights from New Zealand and the United Kingdom. Accounting, Auditing & Accountability Journal.
[2] Lewandowska, K., & Kulczycki, E. (2021). Academic research evaluation in artistic disciplines: the case of Poland. Assessment & Evaluation in Higher Education, 1-13.
[3] Kousha, K., Thelwall, M., & Abdoli, M. (2021). Which types of online evidence show the nonacademic benefits of research? Websites cited in UK impact case studies. Quantitative Science Studies, 2(3), 864-881.
[4] Sørensen, O. H., Bjørner, J., Holtermann, A., Dyreborg, J., Sørli, J. B., Kristiansen, J., & Nielsen, S. B. (2022). Measuring societal impact of research—Developing and validating an impact instrument for occupational health and safety. Research Evaluation, 31(1), 118-131.