USNWR Circular Ratings

<p>New research raises additional questions about the "reputational" survey that is worth 25 percent (more than any other factor) in the U.S. News & World Report rankings of colleges.</p>

<p>What the research found is that the reputational scores don't correlate with changes in factors such as resources or graduation rates, but correlate with the previous year's rankings. In other words, the way you get a good reputational score -- and in turn a good ranking -- is to already have a good ranking.</p>



<p>News: Circular Ratings - Inside Higher Ed</p>

<p>Haha - wasn't this something noted by some of us as far back as 05-16-2005! :)</p>




<p>Yes, the long-standing belief has now been proven. :) It is indeed a duh for many. ;)</p>

<p>^ Proven, perhaps not, but it is indeed a big "duh". I prefer to think of it as a negative feedback system that helps keep rankings from jumping too much from year to year.</p>

<p>Why is it interesting that prestige begets prestige?</p>

<p>Are you talking about the Peer Assessment survey? That is one of the most useful factors in the US News rankings. About 90% of the Peer Assessment rating can be accounted for by hard data. There is a sound basis for the Peer Assessment ratings.</p>

<p>Interestingly, collegehelp, most of those factors are already included in USNWR. Gotta love some multicollinearity.</p>

<p>Consequently, all data is 100% accounted for by data.</p>

<p>How else could one explain the amazingly high scores that have no correlation WHATSOEVER with their selectivity numbers?</p>


<p>Oh goody, another PA thread.</p>

<p>Well, xiggi, perhaps it correlates to other factors that bring distinction to an academic program?...such as the professors.</p>

<p>There is a high correlation between PA and selectivity.</p>

<p>The point is that the highest correlation for this year's PA is with last year's ranking, rendering PA superfluous.</p>

<p>Not true. It's superfluous if the same data is in the predictor twice. So this proves PA would be superfluous only if you would include the previous year's ranking AND PA. What this says instead is that PA is measuring something which, correlated as it may be to other elements of the ranking, is mostly correlated with itself. This is a good thing -- it means PA is capturing something which is not found in other numbers. It also means PA is somewhat static, which is perfectly reasonable and expected since reputation is slower to change and is largely a lagging indicator relative to changing quality. </p>

<p>Seriously, this is not news and can quite easily and reasonably be spun to sound positive for PA, and I'm no big fan of the PA or its methodology.</p>

<p>Certainly you would expect PA to remain relatively constant. Why would you expect it to change much? It's not as though colleges change dramatically year to year. They barely change decade to decade.</p>

<p>The correlation between PA and the SAT 25th percentile is very high, about +.74 among the top 100 universities in US News that report SAT. You can calculate it yourself using an Excel spreadsheet. Just enter the data from US News and do a Pearson correlation between the two columns of data. Seeing is believing. You don't have to rely on anyone else's research. Another way to say this is that selectivity accounts for about 55% of the variance in PA (.74 squared).</p>

<p>The research showed that if you want to know what next year's PA values will be, just look at this year's final overall ranking:</p>

<p>"But the theory behind the study was that if these are key measures of quality in the magazine's view, institutions that change in these categories should also experience reputational changes over time. But they didn't -- while the correlation that was clear was reputation with the previous year's rankings."</p>

<p>The key is over time, rather than looking at one static year's values.</p>

<p>Caveat: I admit to being influenced by scientific-method peer-reviewed research.</p>

<p>Correlation is not causation. It's somewhat obvious that better students tend to select what are thought to be better colleges, which you would expect to have a higher PA. But there are exceptions too.</p>

<p>If you want to know what your income will be you can get a pretty good idea from the last year's income. Shocking.</p>

<p>I can think of a number of schools that have moved up or down a few tenths of a point over 10 years.</p>

<p>Nobody wants to go to Ivy League schools anymore, they're too hard to get in. (Apologies to Yogi Berra).</p>

<p>So not only do changes in key measures of quality not cause a rise in PA, there's not even a correlation.</p>

<p>It is a good thing that PA does not change dramatically with random fluctuations in a few statistics. A good analogy is a college student's overall cumulative gpa. It doesn't change much as a result of one grade or one semester. It captures the big picture over time. The more history, the more it resists change. This is the way it should be. But a long stretch of poor performance WILL cause it to change.</p>

<p>So what is causing the identified changes in PA? It is not changes in key measures of quality, since there is no correlation. Since the correlation is with the previous year's ranking, the reader is left to decide if it's a reasonable cause. The subject of the next research? ;)</p>