In last week’s Times Higher, we had one of our rare mentions. Unfortunately, it wasn’t good news. HESA has published the most recent data on employment of graduates from the 2012-13 cohort, which showed:
“According to the Higher Education Statistics Agency, 92.1 per cent of university leavers were in employment or further study six months after graduating in 2012-13, up from 90.8 per cent in the previous year.”
“Universities with the lowest employment and further study rates are London Metropolitan University (81.4 per cent), the University of Bolton (82.4 per cent) and Staffordshire University (84 per cent).”
This doesn’t look great as a headline statistic, particularly given all the work colleagues have done to promote the Staffordshire Graduate attributes, and in particular the employability programme running on a number of champion awards.
I decided to have a look at some of the numbers for the last few years, and to consult those who know more than me. We are starting to see an improvement in graduate outcomes and employment as described in league tables, so there appears to be an anomaly.
In league tables, the career prospects score relates to the percentage in graduate-level work or higher-level postgraduate study, while the employment indicator in the HESA data relates to the percentage in (any) work or further study. So it is possible for us to have a comparatively low employment indicator alongside an improving career prospects score.
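The distinction between the two measures can be made concrete with a small sketch. The records below are invented purely for illustration (they are not real HESA data): the same set of graduate outcomes yields two different percentages depending on which definition is applied.

```python
# Illustrative graduate outcomes six months after graduating.
# These four records are invented for the example, not real HESA data.
graduates = [
    {"in_work_or_study": True,  "graduate_level": True},   # professional-level job
    {"in_work_or_study": True,  "graduate_level": False},  # non-graduate job
    {"in_work_or_study": True,  "graduate_level": True},   # postgraduate study
    {"in_work_or_study": False, "graduate_level": False},  # unemployed
]

# HESA-style employment indicator: percentage in ANY work or further study.
employment_indicator = 100 * sum(g["in_work_or_study"] for g in graduates) / len(graduates)

# League-table-style career prospects: percentage in graduate-level work
# or higher-level postgraduate study.
career_prospects = 100 * sum(g["graduate_level"] for g in graduates) / len(graduates)

print(employment_indicator)  # 75.0
print(career_prospects)      # 50.0
```

On this toy data the employment indicator is 75 per cent while the career prospects measure is 50 per cent, which is why movement in one does not guarantee movement in the other.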
So that starts to explain why scores in league tables are different.
Another really important factor, and one which is ignored in the Times Higher article, is that institutions differ in subject mix and intake, so their raw results cannot be compared directly. The benchmarks published for each institution, which allow for subject mix and entry qualifications, also need to be taken into account.
“if the benchmarks were ignored such comparisons would not take account of the effects of different subject profiles or the different entry qualifications of the students. In general, indicators from two institutions should only be compared if the institutions are similar. If the benchmarks are not similar, then this suggests that the subject / entry qualification profiles of the institutions are not the same, and so differences between the indicators could be due to these different profiles rather than to different performances by the two institutions.”
So one way to look at our own performance is to consider how our scores differ from our benchmark score. The table below shows this.
[Table: Employment indicator (including further study) — columns: +/-, total UK indicator, missed benchmark? (%)]
So although we have a relatively low benchmark compared with many in the sector, it turns out that we also miss the benchmark by a greater percentage than others.
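The benchmark comparison above amounts to looking at the gap between an institution’s indicator and its benchmark. A minimal sketch, using invented placeholder figures rather than our published HESA numbers:

```python
# Hypothetical institutions, each with an employment indicator and a
# HESA-style benchmark that allows for subject mix and entry qualifications.
# All figures are invented for illustration only.
institutions = {
    "Institution A": {"indicator": 84.0, "benchmark": 88.0},
    "Institution B": {"indicator": 91.0, "benchmark": 92.0},
    "Institution C": {"indicator": 93.5, "benchmark": 92.5},
}

# The gap (indicator minus benchmark) is the fairer comparison:
# a negative gap means the institution missed its own benchmark.
for name, scores in institutions.items():
    gap = scores["indicator"] - scores["benchmark"]
    status = "missed benchmark" if gap < 0 else "met or exceeded benchmark"
    print(f"{name}: {gap:+.1f} percentage points ({status})")
```

On these made-up figures, Institution A has the lowest raw indicator and also misses its benchmark by the widest margin, which is the pattern described above; an institution with a low benchmark can still meet it, and one with a high benchmark can still miss it.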
I was hoping that playing with the data like this might provide better news – what it provides is maybe a better understanding.
So the question we need to be asking is: how can we be better than this?
Another publication this week was from the University Alliance (a mission group of entrepreneurial universities) entitled “Job Ready” which provides a series of case studies of interactions between universities and employers, highlighting programmes co-designed with students and employers, and showcasing a range of internships and placements.
There’s possibly nothing in here that we don’t do already, but the question might be how much of these things we do. Do we develop great ideas but then run them in pockets of isolated excellence?
Inevitably there are those who say that we shouldn’t reduce higher education and the value of a degree to just being a passport to graduate employment, and that higher education provides a much greater range of benefits. I agree totally, but this is one of the key measures used by prospective students, and I’m pretty sure that many of our students still see a degree as an opportunity to better employment prospects.
So, in conclusion, why are we not performing as well as we could against this measure of student success?
Maybe it’s linked to the relatively low number of good degrees we award, making it harder for our graduates to get interviews?
Maybe it’s because of our relatively high number of local students, from an area with higher unemployment and lower aspiration?
In terms of what we could do next, here’s a few ideas:
- Extensively mine the subject level employability data when available and compare with national figures, to see if there are any trends we need to investigate
- Evaluate the effectiveness of the Staffordshire Graduate Employability Programme – will we see improved employability in these award areas compared with previous cohorts, or with other providers of the same subjects?
- Review the effectiveness of the SG attributes programme in all awards – do we have quantitative and qualitative research on its effectiveness? (We’ve published papers on the programme; have we as an institution learnt from them?)
- Use and share internal portfolio performance review data and external publications in a more systematic manner.
Everyone teaching at this university wants our students to be successful, so making sure that our graduates have the best opportunities in the job market is essential. This links to the talks I give in faculties on “we can be better than this”: graduate employment will be the final outcome of getting everything else right. And if we get it right, and can claim to be the university which gives you the best opportunity for employment, then we might find it easier to attract students to us.