Academy of Social Sciences
IMPACT AND IMPLICATIONS: THE FUTURE OF RESEARCH EXCELLENCE AND THE SOCIAL SCIENCES
University of Liverpool in London, Finsbury Square – 14 January 2016

Dame Janet Finch (Chair, Main Panel C, REF 2014) – Reflections on REF 2014

Context
Main Panel C was the largest of the four Main Panels, dealing with:
o 614 submissions (32.1% of the total submitted to REF 2014)
o 14,413 FTE (27.7%)
o 52,212 outputs (27.3%)
o 2,040 case studies (29.2%)
Main Panel C covered 11 sub-panels, whose workload varied from 25 submissions (UoA 18 Economics and Econometrics) to 101 submissions (UoA 19 Business and Management Studies). For the REF as a whole, seventy-five per cent of institutions had at least 10% of their research activity graded at 4*; the top twenty-five per cent had at least 30% of their work at 4*.

Workload / approach
Overall, it was felt that while daunting, the work was 'quite doable in the end'. Main and sub-panel members were reported to be 'very careful' about identifying and reporting any conflicts of interest, and about making judgments based on the evidence presented to them rather than on pre-existing or other external sources of evidence. It was also an anonymised exercise – the identity of submitting institutions was not revealed until the end of the process. The key role of the Main Panel, in its Chair's opinion, was to ensure consistency of approach – not the same thing as consistency of outcome – across its sub-panels.

Outcomes – overall
Main Panel C determined that 27% of the activity it assessed was of 4* quality, slightly more than for the REF overall (22%). At 3* quality, Main Panel C identified 42%, slightly less than for the REF overall (50%). Within Main Panel C sub-panels, the range of activity rated at 4* was not pronounced.
The lowest-rated UoAs overall were UoA 26 (Sports Science, Leisure and Tourism) at 25% and UoA 19 (Business and Management Studies) at 26%; the highest-rated were UoA 18 (Economics and Econometrics) and UoA 25 (Education), both of which assessed 30% of their submissions at 4*. Overall, the panel recognised the value of diversity in excellence across the social science disciplines.

Outputs
Of the 52,212 outputs assessed, the following output types appeared:
o Journal articles – 42,217 (80.9%)
o Books – 4,263 (8.2%)
o Book chapters – 2,962 (5.7%)
o Research reports – 505 (1.0%)
o Edited books – 422 (0.8%)
o All other types – 1,843 (3.5%)
No particular output type received a higher rating than any other, and the nature of the work assessed (e.g. theoretical, empirical) did not influence the rating. Overall, the panel identified clear evidence of improvement in the methodological underpinning of the discipline between RAE 2008 and REF 2014.

Impact
In assessing impact, Main Panel C determined that 39% should be rated at 4*, slightly less than for the REF overall, which was 44%. At 3* quality, Main Panel C identified 40%, the same as for the REF overall. A wider range of 4* activity could be seen for impact than for the assessment overall: the lowest-rated UoA for 4* impact was UoA 17 (Geography, Environmental Studies and Archaeology), rising to 44% for UoA 22 (Social Work and Social Policy). The narrative structure of the impact case studies had been helpful in allowing institutions to tell a nuanced story, while recognising that the links between research and impact are not simple and not always linear. The case studies, and the wider approach taken to impact, had overall been very effective. It was acknowledged, however, that the template had nonetheless favoured case studies that did have a linear link between the underpinning research and its impact.
It did not appear to have been the case, as some commentators had feared, that the best impacts would come from supporting the status quo: there were as many good examples of research that had led to significant change as of research that had supported the existing position. The additional burden imposed on the REF by the case studies, by comparison with the RAE, was recognised, but it was felt that this burden was outweighed by the fact that the narrative approach had worked well for the social sciences, and that any impact assessment taking a different approach would have negative consequences.

There was discussion of the possibility of submitting case studies to panels other than those to which the underpinning research most obviously belonged. It was noted that the REF criteria spoke of the underpinning research having been done 'by the submitting unit'; although there might be such an opportunity for impacts arising from research done by colleagues not submitted by an institution to the REF, it would not be possible for Professor X to be submitted to UoA Y while examples of impact arising from his research were submitted to UoA Z.

Interdisciplinary Research
A number of case studies were identified that had been submitted to one of the Main Panel's sub-panels but could reasonably have been submitted to another. This was given as evidence of the fluidity of the boundaries between the sub-panels, and of the existence of strong interdisciplinary activity across the Main Panel. However, institutional use of the flags to identify 'interdisciplinary' outputs had been inconsistent: some institutions had used them frequently, others not at all, and it was not clear why practice had been so erratic. It was suggested that taking a thematic or issue-based approach to research might be a more useful way of seeing interdisciplinary activity.
The Future of the REF and implications for the social sciences
Continuing with some form of separate REF exercise in future was strongly supported. Learned societies, subject associations etc. had a crucial role to play in the assessment of the discipline by putting forward sensible, well-considered names of individuals to serve on panels. It was not helpful to put forward swathes of names, because someone else – without the same disciplinary knowledge – would end up deciding whom to approach. This would not benefit the social sciences.

It was similarly important to ensure that whatever criteria were developed would be of benefit to the social sciences. This meant, for example, continuing to ensure that all different types of research output were recognised equally. It also meant that, until the criteria were known, it was impossible to comment meaningfully on the proposal for an interim exercise. The social sciences had nothing to fear from impact, especially if the narrative approach was retained, and even if the weighting of impact were to increase.

Jonathan Grant (KCL Policy Institute): 'The Impact of Impact'

Impact in REF 2014
HEFCE commissioned RAND Europe and KCL to investigate the impact aspects of REF 2014. Extensive work was done at KCL to model and identify the topics that case studies covered, and the inter-relationships between them. Some three thousand unique pathways were identified between the underpinning research areas and the impacts – meaning that, in order to use metrics in impact assessment, three thousand metrics would be needed. Across the c.7,000 case studies submitted to the REF, the KCL study identified only 18 pairs – i.e. instances where the same case study had been submitted by two HEIs or UoAs.

The Stern Review
The only real purpose of the Stern Review of the REF is to see if the way in which the REF is conducted can be changed, to make it cheaper. Its Terms of Reference miss the most important point, however: what would be an acceptable cost for the REF?
If that were known, a system that operated within that cost could be designed. As Grant comments in an opinion piece in the THE (7 January 2016), if we want a performance-related allocation of QR, we need some form of evaluation process. In Grant's view, that could be metrics-based, despite the conclusions of The Metric Tide. What we need to recognise is that comparing an imperfect metrics review process with an imperfect peer review process does not necessarily give us the answer. A more effective means of assessing impact might be on the basis of socio-economic panels rather than discipline-based panels.

Steven Hill (HEFCE) – The Future of the REF (in HEFCE's view)

The changing landscape – knowns
o Dual support will remain (but quite what will dual support look like in future?)
o There will be a new umbrella body, RUK, to oversee the Research Councils
o Innovate UK will become part of RUK
o There will be a peer-review based REF by 2021, and it will involve impact

The changing landscape – unknowns
o The future of HEFCE
o The future of the individual Research Councils under RUK
o Who will have responsibility for QR allocation?
o What will the powers of the RUK Chair and Board be?
o The role of the proposed Ministerial Committee to oversee RUK – will this erode the Haldane Principle?
Thanks to Stern, we know less than we thought we did about how the next REF will operate.

HEFCE's position on the next REF
Until October 2015, HEFCE were anticipating a REF 2020 that would be an evolution of REF 2014, subject to refinements that would be consulted upon. HEFCE planned to consult on the following – the document was ready to go when the plug was pulled:
o Refinements to the UoA structure (for example, the configuration of the Engineering sub-panels, and linking Geography and Environmental Studies with Archaeology), with an expectation that if the number of UoAs were to rise, it would be to a maximum of 40
o Ensuring better representation on the assessing panels, in terms both of equality and diversity and of the institutions involved
o Whether staff should be selected in the next exercise, or all eligible staff included; and, if the latter, how to reconfigure the link between submitted staff and outputs (the Technopolis report demonstrated that staff selection processes were the most costly part of the REF exercise)
o Requiring ORCID identifiers
o Ensuring better reward for collaborative activities (especially between HEIs and external organisations)
o Better handling and recognition of interdisciplinary research
o Increased – but judicious – use of metrics in output assessment
o Impact – improvements to the rules and regulations; whether there were means of reducing the cost and burden of the impact component; moving the impact template to the environment section; whether to recalibrate the ratio of case studies to the volume of staff submitted; how to enable better access to evidence; whether to increase the granularity of scoring; and the extent of 'resubmission' of impacts (maintaining a balance between banning all 'old' impacts and discouraging 'new' impacts*)
o Introducing additional metrics for the environment component
o Reconsidering the component weightings, in particular where to account for the proposed increase in the impact weighting to 25%
o Better aligning assessment dates with other data collection activities
HEFCE thinks it quite probable that the Stern Review will merely conclude that the REF is an acceptable assessment process, and that the net effect will be only to delay the next REF by one year.

* In the subsequent Q&A, Hill expressed the view that it should be possible to submit the continuing impacts of research previously submitted, but that this would be self-limiting, because the underpinning research would need to remain in date and the continuing impacts would need to be new and distinct.
Interdisciplinary Research
Work done by Digital Science (see their report, published 21 December 2015) shows that a large proportion of the case studies submitted to the REF drew on social science-type impacts, even though they were not submitted to social science UoAs. However, work done by Elsevier demonstrates that interdisciplinary research attracts lower citations than non-interdisciplinary work. HEFCE was considering a number of ways to further support interdisciplinary research, and intended to consult on these in preparation for the next REF:
o Appointing interdisciplinary 'champions' to each sub-panel, who would ensure that such work received appropriate treatment
o Allowing staff and/or their outputs to be submitted to multiple sub-panels
o Making use of the 'interdisciplinary' flag mandatory
o Making coverage of interdisciplinary research part of the environment statement, asking what institutions are doing to support and enable it

Panel Discussion: The Way Forward and the implications of 'Fulfilling Our Potential'
Note: this session was rather rushed. The three speakers were each supposed to have five minutes to set out their stall, leaving plenty of time for discussion, but all three significantly over-ran and, as the conference was already running late, discussion was curtailed.

James Wilsdon (Chair, The Metric Tide review)
Five issues with metrics:
o Coverage across the disciplines remains patchy – Main Panel D is particularly problematic; a two-tier REF system would be a nightmare, adding burden and complexity
o There are no viable metrics to capture impact
o The ways in which metrics can misrepresent will have an effect in equality and diversity terms: algorithms cannot cope with the 'human dimension' (such as is represented by the individual staff circumstances processes)
o The cost-saving argument around metrics is a myth – at best the savings will be negligible, especially if institutions rush to buy new products or services to help them second-guess the metrics outcomes. (And if this is really a cost-saving exercise, then perhaps the government should look at RCUK funding, which already costs a great deal more to administer, as a proportion of the money handed out.)
o A radical switch to a metrics-based REF would mean that the government's undertaking to run the next REF by 2021 could not be met: a new system of assessment would require development, piloting and testing to ensure it is robust – and there simply isn't time
The logical outcome of the government's thinking as seen thus far (Green Paper, CSR, Nurse Review, Stern) is a non-selective, bibliometric-based exercise. This would be a disaster.

Sasha Roseneil (Birkbeck)
Three suggestions for the next REF to promote interdisciplinarity:
o Get rid of the environment template altogether – it unfairly rewards single-subject, or single-department-based, activity
o Set up a new 'interdisciplinary' UoA
o Submit all staff

Ceridwen Roberts (Oxford)
Comments on impact, as an impact assessor for UoA 23 (Sociology) in REF 2014:
o It was glaringly obvious which case studies had been written by expert writers rather than by academics
o Impact is here to stay and it is essential to engage with it, but it is just as essential to define the terms on which the social sciences do so

Slides from the event and a report from the AcSS should be made available in due course.