One of the BIG league tables is just out, the QS World University Rankings. The BIG news this year is a change in methodology that means some BIG names drop or climb unexpectedly. Imperial drops from 2nd to 8th, and Princeton drops out of the top ten altogether, to be replaced by the Swiss Federal Institute of Technology. The change in methodology concerned the way that citations (research work that is then used by others) are counted, so as not to over-emphasise the ‘hard’ sciences, and medicine especially. This gives those institutions whose research and reputation lie more heavily in the social sciences, humanities or arts more of a chance. The new method works well for some, with the LSE, for example, popping up from 75th to 35th place!
Just to be clear, these changes of place have little or nothing to do with what the institutions concerned have done over the past twelve months: the data collected by QS is on a five-year cycle. But if changes that dramatic can occur because of a change in methodology, it does make you wonder just how valid such tables are. Another look at the QS methodology reveals some far-reaching decisions taken for no particular reason. Some of the most heavily weighted measures are clearly related to the size of an institution. This leaves the mostly smaller UK universities playing catch-up: how can even a large organisation like Bristol compete with Michigan or UCLA, both of which have 40,000 students or more? Other measures do not depend on size, but the weightings of the various factors (why is this one 40% of the score, and that one only 10%?) just seem arbitrary. See my brief discussions of a similar issue here and also here.
And another thing: with enormous real-terms cuts to funding for arts and humanities over the past five years here in the UK, this table leaves the Government with egg on its face. Presumably a calculation was made about how higher education world-wide is judged, but then someone went and changed the rules!