12 December 2022
Monica Madeley, Projects and Engagement Officer
Over September and October 2022, HESPA consulted on proposals to change the Student / Staff Ratio (SSR) methodology currently used by Jisc to supply data to league table compilers. There was a high level of engagement with the consultation across providers from all countries of the UK, representing a broad range of provider types.
Outcomes
The majority of responses favoured retaining the existing methodology. Since a strong mandate was needed before any methodological change could be made, the outcome of the consultation is that no change will be made to the current SSR methodology at this time.
Despite no changes being recommended, the consultation process has been an interesting and important one. The last review of SSRs was carried out by HESA in 2015; HESA’s role has since changed and its remit no longer stretches to such reviews, but there is a clear need for the sector to come together to ensure that definitions like this are regularly reviewed and remain fit for purpose. We feel that this consultation has achieved that objective for the current time, and we would like to thank all those who contributed.
Some other key points raised through this process may also be helpful to consider.
Teaching and research as intertwined. There was very broad consensus about the intertwined relationship between teaching and research. In many responses, this led to concern that any reduction in the FTE of staff with responsibilities for both teaching and research might create perverse incentives that would weaken this relationship – for example through increased use of teaching-only contracts, or other recoding of staff. This relationship is a real strength that we would not want to see undermined.
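To make the arithmetic behind this concern concrete, here is a purely illustrative sketch. It assumes SSR is simply student FTE divided by academic staff FTE; the figures used, including the 0.4 teaching fraction, are hypothetical and are not drawn from the Jisc methodology.

```python
# Purely illustrative sketch: SSR modelled as student FTE divided by
# academic staff FTE. The actual Jisc methodology is more detailed;
# all figures below, including the 0.4 teaching fraction, are hypothetical.

def ssr(student_fte: float, staff_fte: float) -> float:
    """Students per FTE of teaching staff (a lower ratio reads as 'better')."""
    return student_fte / staff_fte

student_fte = 1000.0
tr_staff_fte = 60.0       # staff on combined teaching-and-research (T&R) contracts
teaching_only_fte = 20.0  # staff on teaching-only contracts

# Current-style approach: T&R staff count at their full FTE.
print(ssr(student_fte, tr_staff_fte + teaching_only_fte))  # 12.5

# Apportioned approach: count only the teaching fraction of T&R FTE.
teaching_fraction = 0.4   # hypothetical; varies widely across providers
print(ssr(student_fte, tr_staff_fte * teaching_fraction + teaching_only_fte))  # ~22.7
```

Because the apportioned denominator is smaller, the headline ratio worsens, and a provider could restore the original 12.5 simply by recoding its T&R staff as teaching-only – exactly the behaviour the responses warned against.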
Gaming as an inhibitor of progress. A number of responses raised concerns about metrics driving behaviours such as changes to staff contracts, additional work on the coding of staff and students, or additional work to provide alternative data to league table compilers (for example, detailed recording of the actual time T&R staff spend on teaching), along with evidence supporting use of this alternative. This raised the question of how far the potential for ‘gaming’ metrics, or individual providers’ choices to increase their own burden, should be allowed to influence methodology decisions – we would be interested in hearing views on this.
Simplicity is key, but is consistency possible? Responses favoured keeping the methodology as simple as possible. They also showed little consistency across the sector in the proportion of time T&R staff spend on research, or the proportion of time students on professional courses spend in work-based settings, so there are few simple rules that apply across the whole sector. This raises the question of whether more could be done to aid consistency in this area while keeping things simple – or are we seeking the impossible?
Why HESPA? One response questioned the role of HEDIG/HESPA in leading this consultation, suggesting it should be for the DDB to lead. However, by its own admission, it is not the role of the DDB to decide on regulatory metrics (that is for regulators to do), and this metric is not currently used by regulators. It is, however, used in league tables, which constitute a similarly high-profile context, so there is a vacuum here: there is no overarching ‘data authority’ for the sector.
Lack of an independent HE data authority. If ownership of data definitions across the sector should not sit with HESPA, who should lead it? Through HESPA, the sector has access to a large number of people uniquely qualified to comment on the detail of such methodology, and via HEDIG we are able to engage all the most relevant sector bodies. We would be interested in hearing views about alternative organisations that could achieve this outreach and take on this role, not least because this review should not be treated as a standalone exercise, but as one iteration in a line of ongoing reviews. We would also be interested in hearing of other metrics that fall into a similar category of being widely used but not centrally owned.
What have we learned? We can question whether institutions should aim to get the best outcome for every metric, or whether acceptance of certain results is an essential part of today’s data-driven landscape, but perhaps we can all agree that regular reviews are important to ensure that metrics like this remain fit for purpose, with transparent methodology. As the revised data model is established following Data Futures implementation, Jisc will examine the impact of the new model on the current methodology, and a further review is likely to be required (probably in two years’ time).
The consultation showed that SSRs mean different things to different stakeholders. For some they are a proxy for quality, for others a proxy for student experience, and for others a proxy for the staff resource available to students. Ultimately though, for all, they will always be a proxy.
On behalf of our HEDIG Chair, Sally Turnbull, and the consultation steering group (listed below), thank you once again to those who contributed. We are also aware of a number of senior colleagues outside HESPA’s direct membership who have engaged with this work, so please do forward this on to relevant colleagues. A compilation of Q&As based on responses will be published in the new year and kept on file to aid future reviews.
Jonathan Waller, HESA
Dee Jones, Jisc
Jackie Njoroge, University of Salford
Jackie Groves, Cardiff University
Jenny Walker, Loughborough University
Kirsty Roden, Glasgow Caledonian University
Dave Radcliffe, University of Birmingham
Amy Whitmore, De Montfort University
Luan Heggarty, Manchester Metropolitan University
Jen Summerton, HESPA
Sally Turnbull, University of Wolverhampton and HEDIG Chair