Faculty salaries. Imagine ranking corporations by how much they pay their executives.
I’ve already gotten an e-mail from the chancellor at Vanderbilt explaining why they should have been ranked higher (funny thing is this is actually what alerted me to the fact that the new rankings are out; apparently I was supposed to be waiting breathlessly to see if my kid’s school had moved up or down instead of…not noticing or caring).
As a technical note, what you are describing is a type of revealed preferences ranking. They are hard to do with proper controls, but with the right data you can do something interesting:
The problem is that to do an annually updated version, you would need to do some sort of annual national survey of college matriculants.
As I recall they quickly launched a counter-campaign last year as well. I am not sure that is really creating the impression they want to create.
Still, the fact that Vandy jumped right on it is a somewhat disturbing sign of the times.
The journalists have a lot of power over the schools because alums seeing their school tumble in the rankings will get really angry with the school.
What’ll be sorta interesting to see is how many years of Merced being ranked in its current, surprisingly high-ish range it will take before those perceptions change.
He also wrote an editorial today.
Although if Merced got a lot more popular with the sort of families typically conducting a national college search (as opposed to just going to their favorite affordable in-state university which admits them), and actually enrolled many of those kids, would it remain as highly ranked in social mobility measures?
For what it’s worth, Vanderbilt is ranked among its natural peers, imo – about where they should be. I have them rated very closely in overall quality with CMU, Emory, Georgetown, Notre Dame, Rice, and Washington U. Those schools are all ranked 18-24 – pretty tightly.
I mean…I don’t think he’s wrong…I just thought the immediate e-mail about it came across as a little defensive. It is certainly true that there’s no value whatsoever in getting people to care or squabble over whether a school is only the 18th best in the country instead of the 14th or whatever (and to reassess their institutional values based on what USNWR says they should care about).
The rankings are based on an arbitrary weighting of various stats. That weighting was manufactured to get subscriptions/magazines/clicks/profit for US News. Like in previous years, the selected weighting keeps HYPSM on top, which makes it look correct to their base, but it also has enough differences to give their base a reason to see what’s changed instead of just using the USNWR ranking they already purchased in previous years.
If you want to understand the details of why college A is ranked higher than college B, or why changes occurred from last year, you need to pay $60 to see all the subcomponent rankings; but you can get some general clues from the listed methodology and the parts that are public. The highest weighted categories are as follows.
- Graduation Rate + Performance – 37% (>50% submit scores) or 42% (<50% submit scores)
- Peer Reputation Survey – 20%
- Financial Resources Per Student – 8%
- Faculty Salary – 6%
- Salary of Federal Loan Recipients – 5%
- Average Federal Loan Debt – 5%
- First Year Retention – 5%
- Test Scores – 5% (>50% submit scores) or 0% (<50% submit scores)
- … <= 3% Weighting
By far the most influential category is graduation rate. The 37% to 42% in this grouping listed above includes direct graduation rate, graduation rate performance, and graduation rate of Pell recipients. The Pell graduation rate component gives more points to schools with larger % Pell, in addition to looking at graduation rate directly.
Graduation is largely a function of admitting students who are likely to graduate and giving them good enough grant-based aid that they won’t need to leave for financial reasons. Colleges that are selective tend to do well in the former, and colleges with high endowment per student tend to do well in the latter.
I’m not going to pay $60 to see the details, but one can look up most of the stats listed above for the colleges that interest them, and weight them however they think is meaningful. For example, if you are really big on having a top 6-year graduation rate, like USNWR, the selective “national” colleges that did best on this metric in the most recent IPEDS are below:
98% – Harvard / Yale
97% – Duke / Northwestern / Princeton
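To make the arbitrary-weighting point concrete: a score like this is just a weighted sum of normalized metrics, so the order can flip when the weights move. A minimal sketch, where both colleges and every stat value are made up, and the default weights mirror the public category list above (categories at ≤ 3% omitted, so the weights sum to less than 1):

```python
# USNWR-style composite score as a weighted sum of normalized metrics.
# "College A" / "College B" and all stat values are invented; the default
# weights mirror the publicly listed top categories (<= 3% items omitted).

WEIGHTS = {
    "grad_rate_group": 0.37,      # graduation rate + performance
    "peer_reputation": 0.20,
    "financial_resources": 0.08,
    "faculty_salary": 0.06,
    "grad_salary": 0.05,
    "avg_debt": 0.05,             # assume already inverted: higher = less debt
    "retention": 0.05,
    "test_scores": 0.05,
}

# Hypothetical schools with each metric pre-normalized to a 0-1 scale.
COLLEGES = {
    "College A": {"grad_rate_group": 0.88, "peer_reputation": 0.98,
                  "financial_resources": 0.90, "faculty_salary": 0.90,
                  "grad_salary": 0.90, "avg_debt": 0.90,
                  "retention": 0.90, "test_scores": 0.90},
    "College B": {"grad_rate_group": 0.98, "peer_reputation": 0.88,
                  "financial_resources": 0.90, "faculty_salary": 0.90,
                  "grad_salary": 0.90, "avg_debt": 0.90,
                  "retention": 0.90, "test_scores": 0.90},
}

def composite(metrics, weights):
    """Dot product of category weights and a school's normalized metrics."""
    return sum(weights[k] * metrics[k] for k in weights)

def rank(weights):
    """Schools sorted best-first under the given weighting."""
    return sorted(COLLEGES, key=lambda c: composite(COLLEGES[c], weights),
                  reverse=True)

# Swap the two biggest weights and the order reverses.
ALT_WEIGHTS = dict(WEIGHTS, grad_rate_group=0.20, peer_reputation=0.37)

print(rank(WEIGHTS))      # College B first when graduation rate dominates
print(rank(ALT_WEIGHTS))  # College A first once reputation outweighs it
```

Neither school is “better” here; the ranking is entirely an artifact of which category gets the bigger weight.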
The next highest weighted category is the “peer assessment survey”, which asks college admins to rate other colleges (most of which they are not especially familiar with) on a scale of 1 to 5. The highest rated colleges in a previous year were:
4.9 – Harvard / MIT / Stanford
4.8 – Princeton / Yale
After that is financial resources per student. “National” colleges with highest endowment per student are:
- Princeton
- Yale
- Stanford
- MIT
- Harvard
- Caltech
Next highest is faculty salary. Highest faculty salary colleges may be similar to below (not sure how overall vs professor is weighted):
- Harvard
- Caltech
- Princeton
- Chicago
- Yale
- MIT
I also don’t think he is wrong about the first part, the criticisms of rankings that he cites to NORC.
I am more skeptical about the proposal to replace rankings with what he calls “ratings”, however. Indeed, an enormous amount of data is already available to people in forms like the Common Data Set, NCES College Navigator, audited financial statements, and so on.
The real problem, therefore, is not too little data to compare, it is too much!
And at this point, that is the “problem” US News and the like are “solving”. Parents and kids and such want a simple answer as to whether X is better than Y, which are the T-N colleges, and so on. And so US News is giving them one.
And so I don’t think “ratings” are going to replace “rankings” in the market if they don’t compete with an equally simple answer.
I think the vast majority of college applicants and their parents have never even heard of these databases, so they are rarely used. I think most people, including myself, who aren’t involved in the education industry have largely used circumstantial evidence and hearsay to make decisions. It’s not possible to claim that this method is any better than any other, including ranking systems.
Here is another ranking algorithm that ostensibly can be more customized. It may be of interest to those who are interested in overall rankings. It shares a lot of the same weaknesses as any system that purports to value a College #45 or College #55 over a College #65. But “best” is the most common search on Google, right?
Let’s take a trip down memory lane…Vandy’s statement at this time last year. A few snippets:
- Criteria related to social mobility have been given more weight, such as the percentage of Pell students. Students from all backgrounds succeed at Vanderbilt at a higher rate than at many other institutions, but because Vanderbilt’s overall percentage of Pell and first-generation students is lower than at many state institutions, U.S. News’ metric for Vanderbilt is lower, affecting our ranking.
And this (Vandy’s Pell grant proportion impacts these data too)
Data about earnings, indebtedness and first-generation students are being sourced for the first time from the U.S. Department of Education’s College Scorecard. The scorecard only captures about one-third of Vanderbilt undergraduates—those who receive Pell grants or federal loans.
This is similar to one metric that we looked at when choosing a school for our kid. We looked at the progress grade for schools, especially for their economically disadvantaged students (we do not qualify as economically disadvantaged). The selfish reason of why we cared was because if a school is moving its economically disadvantaged kids well, then it’s moving all of its kids well. In contrast, if a school’s economically advantaged population is progressing well while the economically disadvantaged students are not, then the progress may be more of an indication of the supports being provided at home rather than the quality of the instruction in the building.
Thus, in a world where ranking and rating systems all have issues, this was an area where it was easier to see which schools were providing better instruction & supports than others.
So graduation rate performance is one metric that attempts to correlate with this “progress” factor that our state determines based on state tests, but there are definitely bigger holes in the methodology at the collegiate level. Still, in the absence of great data, one uses the data available.
Indeed, but that to me is a form of market “answer”. In fact, I know that various college help websites, including this one, have periodically informed parents and students about the existence of these tools, and in fact promoted their use. But they have not taken off, and I think it is because they are not really what most parents/kids are looking for, even if academics and such think they are what parents/kids SHOULD be looking for.
I do sometimes fantasize about someone who takes all these different free information sources and folds them together into a big user-friendly site where you can just manipulate different easily-understood toggles and sliders and such, and it then spits out a customized ranking, which you can then play around with further by adjusting the toggles/sliders/etc. You could even allow inputs like your NPC results. Basically the automated version of a chance me/match me thread.
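Mechanically, the toggles-and-sliders idea reduces to: let the user set raw slider values, normalize them into weights, and re-sort on each change. A minimal sketch, with two invented schools and made-up metric names and values standing in for the real data sources:

```python
# Sketch of a slider-driven custom ranking: raw slider positions (any
# nonnegative numbers) are normalized into weights, then schools are
# re-sorted. School names, metric names, and values are all placeholders.

def normalize(sliders):
    """Turn raw slider positions into weights that sum to 1."""
    total = sum(sliders.values())
    return {k: v / total for k, v in sliders.items()}

def rank(schools, sliders):
    """Schools sorted best-first under the user's current slider settings."""
    weights = normalize(sliders)
    score = lambda s: sum(weights[k] * schools[s][k] for k in weights)
    return sorted(schools, key=score, reverse=True)

SCHOOLS = {  # metrics pre-scaled to 0-1; values invented for illustration
    "Big State U":   {"grad_rate": 0.85, "net_price": 0.90, "outcomes": 0.75},
    "Small Private": {"grad_rate": 0.95, "net_price": 0.60, "outcomes": 0.90},
}

# A cost-focused family cranks the net-price slider...
print(rank(SCHOOLS, {"grad_rate": 1, "net_price": 5, "outcomes": 1}))
# ...while an outcomes-focused family gets a different order.
print(rank(SCHOOLS, {"grad_rate": 1, "net_price": 1, "outcomes": 5}))
```

The hard part of the fantasy site isn’t this arithmetic, of course; it’s folding the free sources (Common Data Set, College Navigator, Scorecard, etc.) into one clean, comparable dataset.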
But that sounds like a lot of work, and I am not volunteering. Who knows, though, maybe AI will take over that task . . . .
The funny thing is, it’s USNWR itself that has solidified people’s impressions about where a school should be ranked. When someone tells me that the current rankings list is crap because “college X” should be at least 20 spots higher, and I ask why, their arguments ultimately come down to the fact that “college X” has indeed been 20 spots higher on this list for many years.
Of course, USNWR is very careful to ensure the top 10-20 remain relatively stable so people don’t outright dismiss the whole list (as happened with WSJ’s latest list).
Perception is reality - one reason why people like Vanderbilt’s chancellor care so much about this issue.
Agreed. But it’s just funny (to me at least, and also a bit sad), to see how much sway a magazine holds over our higher education system.
Of course, I don’t mean to imply that all schools are the same or provide the same outcomes to everyone. My comment is merely about the weight so many people place on rankings.