The biggest problem with all these lists is that this is NOT a zero-sum game! Just because one school is good doesn’t mean another school can’t be good. It’s hair-splitting at times.
The most meaningful/realistic way to “rate” schools, IMHO, would be to group them into tiers or levels of selectivity/performance. But that doesn’t sell magazines 8-|
Just because USNews was first doesn’t mean that its methodology should be the gold standard.
Anyway, @harvestmoon, yep, grad school is a related but different ballgame. It does mean that there are some (hidden) gems for kids who know absolutely what they want to study/do. The quality of the student body in those programs may be quite different from the overall student body at those places, however. Most teenagers don’t have a clue what fields really fit them, but for those who do the research, there are terrific opportunities, both in the US and abroad.
More constructive is to assess the impact of the methodology and to consider why some schools under- or over-perform on this ranking. Caltech, for example, has the greatest proportion of students who go on to earn a Ph.D. (see http://www.thecollegesolution.com/the-colleges-where-phds-get-their-start/), but Forbes weights that at only 2.5% of its scoring. Likewise, Caltech (and tech schools in general) has somewhat lower retention and graduation rates due to the rigor of its curriculum. This hurts them in these rankings (as well as in the USNews ones).
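To see why a 2.5% weight barely matters, here’s a toy weighted-sum sketch. The weights and component scores are entirely hypothetical, not Forbes’s actual formula; the point is only the arithmetic of small weights:

```python
# Hypothetical two-component composite score (0-100 scale) to show how
# little a 2.5%-weighted metric can move a ranking. NOT Forbes's real model.

weights = {"phd_production": 0.025, "everything_else": 0.975}

def composite(scores):
    # Weighted sum of component scores.
    return sum(weights[k] * scores[k] for k in weights)

# Same school profile, except best-case vs. worst-case PhD production:
best = composite({"phd_production": 100, "everything_else": 80})
worst = composite({"phd_production": 0, "everything_else": 80})

print(best - worst)  # 2.5
```

So even going from the worst to the best possible PhD-production score shifts the composite by only 2.5 points out of 100, which is why a school like Caltech gets little credit for dominating that metric.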
Data-based rankings are welcome as long as they publish their methodologies and refine their models year after year. The rankings give us more information about a particular school, even if they don’t offer a perfect, complete picture.
If a school makes the top X (e.g., 25) on any ranking, then it’s likely a decent school for a top HS student.
“Hopkins? #66
When it’s something like #10 on the US News list. Big disconnect there.”
All it means is that the rankings were based on different characteristics. That’s all.
If I created a ranking of colleges most likely to get my wedding announcement in the NYTimes, it would bear some resemblance to USN or Forbes, but with a lot of dissonance.
If you can’t “accept” a different ranking unless it looks just like the first ranking you were aware of, then you’re not really interested in looking at a different way to rank schools.
One ranking I’ve always been curious about but have never seen (and which would admittedly be of limited value) would be something like a standardized “GRE minus SAT” delta, to see how much each school “improves” its students. Same for the ACT, LSAT, MCAT, etc. It might be interesting.
However, it’s harder to improve at the highest end.
And frankly, the “better” school would be the one that takes students from (say) 500/500 to 650/650, not the one that takes students from (say) 710/710 to 750/750.
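One way to make the ceiling-effect point concrete is a “normalized gain” that divides the raw improvement by the headroom left on the test. This is just a sketch with the toy numbers from the post (800 assumed as the per-section max), not an established ranking methodology:

```python
# Toy "value-added" metric: normalized gain = fraction of the remaining
# headroom a student actually gained. All numbers are hypothetical.

MAX_SCORE = 800  # assumed per-section score ceiling

def raw_gain(pre, post):
    return post - pre

def normalized_gain(pre, post):
    # Divide the raw gain by the room left to improve, so high scorers
    # aren't penalized just for starting near the ceiling.
    return (post - pre) / (MAX_SCORE - pre)

# School A takes 500-scorers to 650; School B takes 710-scorers to 750.
print(raw_gain(500, 650), round(normalized_gain(500, 650), 3))  # 150 0.5
print(raw_gain(710, 750), round(normalized_gain(710, 750), 3))  # 40 0.444
```

Even after adjusting for the ceiling, the 500-to-650 school comes out ahead here (0.5 vs. 0.444), though the gap is much smaller than the raw 150-vs-40 comparison suggests.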
@bclintonk In order to do a proper comparison, the schools in the Forbes ranking first need to be divided as they are in the USNews list (i.e., into National Universities, National Liberal Arts Colleges, etc.) and then a new rank assigned. I think some of your examples of apparently big discrepancies (CMU, USC, Brandeis, BU, GA Tech) would then largely disappear.
On a thread like this, posters reveal more about their personal biases than insight into the metrics of a list. All the schools on the list are excellent institutions. It is more important for a student to find the right fit than to argue numbers on a list.
It still doesn’t matter, except at the very grossest of levels, because personal preferences are subjective. It’s like ranking flavors of ice cream. If more people like vanilla than chocolate, what does that have to do with anything if I’m into pistachio? That’s what I find funniest when people start talking about who “wins” cross-admit contests, as if I’m supposed to base my decisions on what other people do.
If somebody publishes a ranking of “best countries to live in” and explains their methodology in copious detail, but the list has Bangladesh and Somalia 20 ranks higher than Sweden and Canada, then who, other than the folks living in Bangladesh and Somalia, would find the list credible?
You see, at the end of the day, if a majority of the folks reading your list don’t buy your ranking (i.e., you have not reinforced their perceptions), then your ranking is not going to be very credible.
I think the reason the USNews ranking carries so much weight is that it reinforces people’s preconceived notions about who the top 10-15 universities in the US are (it doesn’t have to be the exact ordering, just the same list). Once you reinforce that bias, you can claim that XYZ University is #37 or PQR University is #84 and people will believe you, because you have credibility in their minds.
I think the Forbes ranking fails by this criterion. In its effort to be thought-provoking, it has just managed to make people frown at the list.
8 schools west of the Mississippi in the top 50? Other than here on CC, the Claremont schools don’t seem to be well known in California. I’m a native who has gone to school in both NorCal and SoCal, yet I wasn’t aware of them until a couple of years ago, when I was researching college stuff on this site. I’d bet a majority of people would think of Cal Poly Pomona if someone told them they went to Pomona. Maybe LA-area people are more aware of the consortium? Any other Californians here have a similar experience of not knowing much about the Claremont schools? That said, I guess that’s one good thing about these lists: you can learn about some good schools you’ve never heard of.