@TomSrOfBoston They must adjust for the size of the school, otherwise how could Caltech be 3rd? (OK, I haven’t looked at the methodology…)
“Ask hiring managers in Philadelphia about Reed or in LA about Bates…and you (generally) get blanks. The Flagship schools will climb to the top of every chart, as their graduates (with larger numbers) will reinforce the value of their own educations.”
Reed and Bates are primarily grad school prep colleges. Ask the hiring managers where grads got their MA/MS/PhD.
@pupflier
Northwestern is ranked 20th by the Times. Why are you so hell-bent that this is all about Schapiro being unhappy with the ranking? To me, he’s just trying to raise public awareness that the USN ranking should be taken with a grain of salt.
The Times rankings are always interesting, but they have their own issues, including being “gamed” by universities.
33% of the rankings are based on Reputation Surveys, and everything is getting “normalized”.
The worst gaming happens with the Citations metric (30% of the overall score).
The Times ranking, along with many other international rankings, is a joke, particularly in the research category.
First of all, universities are in the business of scholarship, not just research, and more specifically not just journal article publication. Scholarship includes at least grant generation, journal article publication, books (think English departments), and performance (think music and arts departments). The Times ranking counts only journal articles (and note that the value of a journal article varies a lot across fields), so it biases against universities with a big humanities component. As a result, you see MIT, Caltech, etc. ranked at the very top. I am not saying these STEM-focused universities are not world-class (they are), but the ranking provides no value for meaningful comparisons among universities that come in different sizes and flavors.
Another funny thing about the Times ranking is that 6% goes to research income. Grants are probably the only meaningful component of research income. In other words, if a university encourages its English professors to publish novels and poems, that will not be counted; if a university encourages its business professors to publish in elite business journals (almost all business research yields no grants), that will not be counted either. The entire Times ranking is basically a quasi-research ranking, mostly of STEM, and no more than that.
I used to serve on my university’s scholarship committee. If my university did this kind of research/scholarship comparison among its own units, I can tell you there would be a big revolt, because this is BS to outsiders who do not know the entire scope of a university’s offerings.
Also, go to any elite university’s department website and count an engineering (or other STEM) full professor’s publications. Then go to the same university’s business department website and count a business full professor’s publications. You will notice that an engineering professor often has more than 100 publications, but a business professor often has just a few dozen. Also note the average number of pages per publication: an engineering article is usually much shorter than a business article. A typical lead time for publication in an elite business journal (from the birth of an idea to actual publication) is about 4-6 years today.
I am not saying journal articles in one field are more rigorous than in another. What I want to point out is that across fields, journal article publication is perceived and done differently. Lumping them all together makes little sense.
2018 ranking released. UChicago again tied for third with Yale.
@Chrchill please stop doing this
“The Op-ed (which is excerpted from a book by Northwestern’s president and a professor there) comes off not so much as an attack on US News but more like Northwestern whining about all the other colleges being dishonest and gaming the system.”
NW fills 50% (!!) of its seats through binding early decision. That’s the single strongest tool you can use to game your USNWR ranking.
People in glass houses…
@northwesty The percentage is in the same ballpark as those at the Ivies, Duke, JHU, etc. The focus of the article was dishonest accounting and misreporting. There’s no misreporting in ED.
Well, I think Northwestern University may have valid questions they’re bringing up regarding the highly popular USNWR rankings and methodology. However, US News does seem to take great pains to accurately assess and report their data. I’m not saying the data analysts at US News never make errors, but if they do, I’m sure they are unintentional!
But Northwestern University’s published criticisms are concerning because I’m not sure I entirely agree with their stance on this issue and how it impacts the university. For instance, Northwestern reports that it meets 100% of demonstrated need for all admitted applicants, but I’ve heard information to the contrary from a relative and others.
Still, it’s a valuable discussion to have on College & University Rankings because it helps everyone to do their own homework (i.e., comparisons) between the institutions their S or D is interested in possibly attending & what those places report about their institutions of higher ed.
It also may not work well for places like Brown, where courses are pass/fail.
Sorry my edited post didn’t make it! I meant to write that Brown University, which is ranked slightly lower than Northwestern University in the USNWR 2018 rankings, has a different format for grading, etc., which may or may not influence factors like retention and graduation rates. So it is a bit like comparing apples to oranges. I also think Brown should be much higher in the overall rankings than NU (even though I attended Northwestern and not Brown).
lovesoldbooks, I am not sure how you came to the conclusion that Brown should be ranked “much higher than NU”. How do you figure? Let us say Brown were somehow ranked #6. For Brown to be ranked much higher than Northwestern, Northwestern would have to be ranked out of the top 20. I cannot think of more than a dozen universities that can claim to be better than Northwestern, and even that would be a stretch. According to the experts, Brown and Northwestern are peers. I suppose it depends on one’s personal preferences (academic and non-academic). For some (students seeking a liberal arts education, who like the ultra-liberal vibe or the absence of a core curriculum, or who wish to live and work in the Northeast after college), Brown is the better option, while for others (Engineering, Economics, Chemistry, Journalism, or Music majors, students who just want to be close to a major city, or those who wish to live in the Midwest after college), Northwestern makes better sense.
^If you are saying that Brown should be ranked much higher than NU, then you mean either that Brown should be a top-10 school or that NU should be ranked much lower than it currently is. I don’t think most people see Brown as a real top-10 school, or that it in fact belongs in the top 10. I don’t think Brown would deserve to take the place of any of the schools that consistently rank in the US News top 10. I also don’t see why NU should be ranked lower than the #11-#15 tier. It is a very strong school, in fact stronger than Brown in many regards.
If I were a college administrator, I would want to collect accurate data about my school’s performance.
(If you can’t measure it, you can’t improve it.)
This can’t be done by one or two people alone. Having gathered accurate information, to then deliberately report it incorrectly would probably require the collusion of multiple people. To avoid suspicion it would have to be done consistently, year after year. How likely is this to happen, on a wide scale, in large communities of people who take pride in analyzing information accurately (especially if many of the same numbers are reported to Moody’s, the DoE, etc.)? I think what is more likely to introduce errors (or confusion) is imprecision in the CDS/USNWR instructions. If the instructions leave even a little room for more than one interpretation (or reporting option), then some schools will take advantage by following the most self-serving approach.
Does this affect the ranking results very much? With respect to many individual ranking positions, maybe. With respect to the overall set of top colleges, though, it might affect an entire class of schools (like top state universities) enough to shift all their rankings one way or another, in or out of an arbitrarily defined tier. Has anyone here found any data, other than research production numbers, that clearly bump Berkeley into the top 10 or Michigan/Virginia/UNC-CH into the top 20 (where the USNWR peer assessments place them)? Regardless, for most people who can attend one of these schools at in-state rates, it’s rational to think of them as top colleges when building an application list.
For many excellent students, the most important set of top N colleges shouldn’t be the USNWR T20, but the set of ~60 that claim to meet 100% of demonstrated financial need. Maybe those claims deserve at least as much scrutiny as the magazine rankings get.
I agree with you! And speaking of colleges and universities that meet 100% of need, which seems to be a very significant issue for many families, I fear that list may shrink rather than grow in the coming years (based on articles about endowments, etc.). But I hope I’m wrong on this.