How true is US News Ranking?

During a Yale college visit, I stumbled upon a New Haven high school teacher who shared a perspective on the University of Connecticut, Storrs (the CT flagship public university): according to the US News rating it ranks higher than the University of Massachusetts, Amherst (the neighboring state of MA's flagship public university), but it is not necessarily seen as better, at least among CT public school teachers. For the past few years, UConn has been “mandated” to admit more local students, which is seen as lowering the admission standard and making UConn less prestigious. On the other hand, CT teachers applaud high school graduates who are admitted to UMass, since that college is known for its tough admission process and high standards for OOS applicants.

Do you have a similar story from your state to share?

True isn't the word to describe rankings; like any list, they are subjective.

I would break down your topic question into several sub-questions, such as:

  1. how well does the USNWR (or other) methodology capture college quality in general?
  2. how appropriate is the USNWR (or other) methodology for my own college-comparison needs in particular?
  3. how reliable is the data (or other evidence) that supports the USNWR (or other) ranking?

Here’s a link to the USNWR ranking factors and weights:
For national universities and national liberal arts colleges, the most important factors include the peer assessment survey and the average graduation rates. The peer assessment survey is basically an opinion poll of college presidents, provosts, and deans of admissions.
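To make the idea of “factors and weights” concrete, here is a minimal sketch of how a composite ranking score might be computed. The factor names, weights, and school numbers below are entirely made up for illustration; they are not the actual USNWR factors or weights.

```python
# Illustrative weighted-composite score, loosely in the spirit of
# rankings like USNWR. Every factor, weight, and data point here is
# hypothetical, chosen only to show the arithmetic.

# Hypothetical weights; they sum to 1.0.
WEIGHTS = {
    "peer_assessment": 0.20,     # opinion survey, scaled 0-100
    "graduation_rate": 0.35,     # six-year rate, scaled 0-100
    "faculty_resources": 0.20,
    "financial_resources": 0.10,
    "student_selectivity": 0.15,
}

def composite_score(metrics: dict) -> float:
    """Weighted sum of normalized (0-100) factor scores."""
    return sum(WEIGHTS[k] * metrics[k] for k in WEIGHTS)

# Two hypothetical schools with made-up numbers.
school_a = {"peer_assessment": 80, "graduation_rate": 90,
            "faculty_resources": 70, "financial_resources": 60,
            "student_selectivity": 85}
school_b = {"peer_assessment": 85, "graduation_rate": 82,
            "faculty_resources": 75, "financial_resources": 65,
            "student_selectivity": 80}

print(composite_score(school_a))
print(composite_score(school_b))
```

Note how sensitive the ordering is to the chosen weights: school B “wins” the peer survey but school A's higher graduation rate dominates because that factor is weighted most heavily. Small, essentially arbitrary weight changes can reorder closely scored schools, which is one reason to treat small rank differences with skepticism.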

For supporting evidence, USNWR gets much of its data (other than the opinion poll numbers) from the Common Data Set numbers submitted annually by most colleges.

College Confidential has had many discussions about related issues.
Anyone is free to start a new discussion, but you might want to do a search through some of the ones we’ve already had. While some of us do consider these rankings useful, hardly any experienced CC poster accepts the USNWR rankings completely and without reservation (or so it seems to me). IMO they can be a useful tool in building an initial reach-match-safety list, by exposing you to schools you otherwise might not have considered. However, they are not a very good basis for making final application or enrollment choices among ~closely ranked peer schools. Even if a ranking could be 100% accurate and reliable in distinguishing the overall academic quality of, say, UMass v. UConn, those differences might be far outweighed by net cost differences, or by the differences in specific program quality and suitability.

IMO, the US News ranking is nearly useless unless one discards the ranked order entirely and simply uses the list as a way to take a close look at schools. But this is not what the OP is asking. The unstated question is whether the US News rankings are accurate.

The rankings are silly. I will spare you all the ways in which the data is either misleading (examining faculty salaries, for instance, as a stand-in for an assessment of education quality) or easily manipulable. A simple Google search will reveal these flaws.

No, for me the most obvious flaw, which never gets mentioned, is this: of all the criteria that US News uses, none of them examine what happens in the classroom. This seems absurd, no? How can one even begin to create a list of best colleges but never try to evaluate what gets inside a student's head? Do students learn? Do they find their classes useful? Are faculty good teachers? Do they enjoy teaching? Do they actually teach more than they conduct research? Are faculty accessible? Do students find that their teachers are committed as educators and potential mentors? None of this is considered. How can one come up with a list of best colleges when the education offered is not examined?

This would be like someone putting together a ranking of best restaurants, and to come up with the ranked list, the person examines the cost of a meal, the difficulty of getting a reservation, the salary paid to the chef, the restaurant's operating costs, the restaurant's profits, etc. Then, after crunching numbers, the person publishes her “Best Restaurants” list, and everyone treats it as objective and trustworthy! Of course, eventually, someone asks the obvious: “Um, what about the food? Is the food any good? Did anybody think to evaluate–you know–the most important aspect of a restaurant?”

How would you assess “what happens in the classroom”, or “what gets inside a student’s head”, more directly?

There have been some attempts to assess instructional quality and student satisfaction. Niche, for example, runs surveys. Maybe the best attempt I've seen is the National Survey of Student Engagement, which gets into the nitty-gritty of student reading/writing workloads and such.

In general, the USNWR is very “true” if your school is ranked high or where it should be in the ranking, and it’s “useless” and “silly” if it isn’t.

Every ranking is subjective. I always suggest that people look at and understand the criteria used to determine any particular ranking. I also recommend that you decide on which criteria are important to you (including finances) and develop your own ranking of schools based on what you want from your college experience.

I would guess the rankings are reasonably accurate in reflecting what informed people believe IN GENERAL. Like are the 1-20 schools probably IN GENERAL doing a lot of things at least a little better than the 40-60 schools are doing them? Probably, but with plenty of exceptions. Where it goes completely off the rails is when people disregard the “IN GENERAL” part, and assume the rankings are accurate and PRECISE…like when they assume the #40 school is necessarily, absolutely, unquestionably & tangibly superior in all ways to the #50 school.

To be fair, USNWR has separate rankings for undergraduate teaching. Their description:

Many colleges have a strong commitment to teaching undergraduates instead of conducting graduate-level research. In a survey conducted in spring 2017, the schools on these lists received the most votes from top college administrators for putting a particular focus on undergraduate teaching.

Top 10 LACs

  1. Carleton
  2. Williams
  3. Swarthmore
  4. Amherst / Grinnell
  5. Pomona
  6. Davidson
  7. Wellesley
  8. Berea
  9. Colorado College / Haverford / Middlebury

Top 10 Universities

  1. Princeton
  2. Dartmouth
  3. Brown / Rice
  4. Miami U
  5. Michigan
  6. William & Mary
  7. Georgia State / Stanford
  8. Duke / Yale

Why would the ranking for undergraduate teaching vs. grad research for LACs vary at all from their overall ranking, since most LACs exclusively have undergrads and don't do grad-level research?

Indeed, nothing in the USNews methodology actually measures what is learned, outcomes (financial and otherwise), or how efficient the school is at providing education. That is unfortunate, because it is probably influencing billions in spending and debt.

There have been some attempts to rank on financial outcomes, but those are probably pretty heavily influenced by mix of majors and differences in cost of living by geography. A Georgetown University study attempted to control for mix of majors, but not cost of living. Perhaps suspiciously, it ranked Georgetown tops.

The US News is not concerned with the truth; it is concerned with sales. If it were concerned with truth, it would do a far better job of ensuring data accuracy, integrity, and consistency. As it stands, the data inputs are so poorly collected and tabulated that the output is completely unreliable.

Ranking colleges is like comparing apples and Snickers bars and calling one “better.” I have just as much authority to publish a list as they do. The only difference is that I don’t own a media publishing company. Do your research and find a college that works for you. Schools are more similar than you realize.

One thing gives it away: the fact that it is published yearly. No complex institution changes that fast. At best, all USNews is really doing is measuring wealth and spending per student. Which is okay, as a blunt measure. However, IMO, more people are interested in affordability than how much the college spends on faculty and administration.

They are in the magazine sales business. Not the college business.

But as a general rule, it's directionally OK. The top 10 percent of students at any of the “top” 100 universities are stars anywhere they go.

@circuitrider Yes, they measure “spending per student” as a pure mathematical exercise, but what is lost is that this doesn't mean the money is actually being “spent” on those undergraduates. The reality of higher ed is that it could really be supporting graduate students, faculty research, or obscene bureaucratic bloat – all things that don't really benefit undergraduates in any way.

That's why it's critical that people don't just look at the ranking numbers but also consider other undergrad-relevant factors. When I was helping my son select colleges, our top priority was an “undergraduate focused” institution, so liberal arts colleges that are exclusively undergrad or have very small grad programs were high on our list. We ultimately chose Princeton because 1) it has one of the smallest undergrad populations among top schools AND more undergrads than grad students; by its institutional design, it doesn't have “glamorous” grad schools like law, medicine, and business; 2) its student-to-faculty ratio is one of the lowest, if not the lowest, and students are taught directly by professors, not grad students; 3) it's #1 in endowment per student; and 4) it does well on other criteria that USNWR uses in its methodology, such as reputation and other subjective factors.

USNWR is not without flaws, but I find it useful as long as the consumer does additional in-depth research to go along with the rankings to find that individual “fit.”

The US News rankings are no worse than any other set of rankings. They’re all subjective. The flaws people point out with the US News rankings are just as applicable to any other set of rankings. For example, someone mentioned that the US News rankings are suspect because they come out every year - yet the other rankings come out every year, too.

US News gets picked on both because it’s the most well-known ranking, so it’s the most visible target, and lots of people who went to big, public flagships hate it because their schools don’t score as high with US News as they do in other sets of rankings. US News is more undergraduate focused, while the other rankings tend to be based more on graduate programs and research, which are the strengths of the public flagships.

The rankings are fun to look at and add a little bit of insight, but I certainly wouldn’t suggest anyone pick a school based purely on rankings.

“On the other hand, CT teachers applaud high school graduates who are admitted to UMass, since that college is known for its tough admission process and high standards for OOS applicants.”

Lol. UMASS has one of the worst-organized admission processes of any college; just read the threads here on CC.

I think the teacher you spoke to has an ax to grind against UCONN. The in-state pool in CT is highly educated, and more and more students are choosing the in-state option because of the astronomical cost of private schools. UCONN also offers merit scholarships, including full tuition, to many top in-state students.

UCONN and UMASS are both fine flagships, and depending on the major, one may be better quality than the other or vice versa.

As suggested above, the premises of the question should be examined. In my year-old copy of U.S. News, UConn registers a higher SAT score range than UMass as well as a higher percentage of students from the top 10% of their HS classes. Consistent with these other figures, UConn’s overall acceptance rate for that year was lower. Though the question pertains as well to in-state vs. out-of-state admissions and the associated perceptions they create, the general statistics show UConn as the more selective school.