"What College Rankings Really Measure"
September 14, 2018, 02:31 PM

From The Conversation:

What college rankings really measure – hint: It’s not quality or value
September 12, 2018 6.49am EDT

by Jonathan Wai

… Our study also assessed the correlation — that is, how statistically similar the rankings are — between our test score rankings and the U.S. News rankings themselves, as well as other rankings that are meant to assess entirely different dimensions of colleges and universities.

A correlation of 1 indicates a perfect relationship between two variables, whereas a correlation of 0 indicates no relationship. We found across our analyses that the test score rankings correlated between 0.659 and 0.892 with other rankings. This suggests that the schools that end up at the top of the test score rankings will also end up at the top of these other rankings.
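To make the correlation figures above concrete, here is a minimal sketch of how a correlation between two rank orderings is computed. The rank orders below are made up for illustration; they are not the study's data:

```python
# Hypothetical illustration: correlating two rankings of the same five schools.
# The rank orders here are invented for the example, not taken from the study.

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Rank positions (1 = best) under two hypothetical ranking systems.
test_score_rank = [1, 2, 3, 4, 5]
other_rank      = [1, 3, 2, 4, 5]   # mostly agrees; two schools swapped

print(round(pearson_r(test_score_rank, other_rank), 3))  # prints 0.9
```

Swapping just one adjacent pair out of five still leaves the two orderings correlated at 0.9, which gives a feel for how strong the 0.659–0.892 correlations reported above are.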

We first found high correlations between our test score rankings and U.S. News national university rank – 0.892 – and liberal arts college rank – 0.890 – even though U.S. News weights these scores only about 8 percent in their formula. Times Higher Education’s U.S. school ranking was correlated 0.787 with SAT and ACT scores and Times Higher Education’s full international school ranking was correlated 0.659.

I suspect the Times Higher Ed rankings cover entire universities rather than just their undergrad components, as the US News rankings do. So, for example, University of California campuses with superstar research but mediocre undergrad teaching will do well in the THE rankings.

This suggests that the SAT/ACT rankings could function as a common factor that connects all rankings.

But what about other types of rankings that were formulated in very different ways for different purposes?

When we examined the correlation between our test score ranking and a “revealed preference ranking,” which was based on the colleges students prefer when they can choose among them, we found these rankings to be highly related at 0.757.
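The published revealed-preference ranking fits a statistical model to head-to-head admission decisions; as a crude sketch of the underlying idea, with entirely made-up matchup data, one could tally the share of head-to-head choices each school "wins":

```python
# Crude sketch of a revealed-preference tally. The matchup data are invented
# for illustration; the actual ranking uses a more sophisticated statistical
# model of these head-to-head choices.
from collections import defaultdict

# Each tuple: (school the student chose, school the student turned down).
choices = [
    ("Harvard", "Caltech"),
    ("Harvard", "Brown"),
    ("Caltech", "Brown"),
    ("Brown", "WashU"),
    ("Harvard", "WashU"),
    ("Caltech", "WashU"),
]

wins = defaultdict(int)
contests = defaultdict(int)
for winner, loser in choices:
    wins[winner] += 1
    contests[winner] += 1
    contests[loser] += 1

# Rank schools by the share of head-to-head matchups they "won".
ranking = sorted(contests, key=lambda s: wins[s] / contests[s], reverse=True)
print(ranking)  # prints ['Harvard', 'Caltech', 'Brown', 'WashU']
```

A plain win rate like this would be distorted by which schools happen to face each other, which is why the real method models the matchups rather than just counting them.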

When we compared the test score rankings to a novel set of rankings created by Lumosity, the creator of “brain games” meant to boost cognitive functioning, we found that ranking to be highly related to SAT/ACT scores as well – at 0.794.

Finally, we examined a “critical thinking” measure – the CLA+ – intended to assess critical thinking among freshman college students. We again found this to be highly related to the test score rankings – at 0.846.

Unfortunately, Wai doesn’t list outliers in this popular article. But here’s his paper, and I downloaded the supplementary materials.

For example, Caltech has the highest SAT scores, with a 25th percentile of 1490 out of 1600 and a 75th percentile of 1600, but it ranks only 10th in the USNWR rating. However, it is second in “Revealed Preference,” which I think would also be known as yield: students who get accepted to Caltech tend to choose to go to Caltech.

Harvard is #1 in revealed preference. Washington U. in St. Louis is 11th in SAT scores but only 65th in Revealed Preference. I presume that Washington has marketed itself over the years as a safety school for high scorers.

Brown U. is only 22nd in SAT scores, but 7th in Revealed Preference: i.e., it has a pretty well-liked brand among kids who like that kind of thing. BYU is 103rd in test scores, but 21st in revealed preference. Probably not too many students apply to both Brown and BYU.

[Comment at Unz.com]