US News in September - Implications

<p>US News will be issuing its college rankings a month later this year, in September. Since this is the case, does it mean that it will be using 2010-11 selectivity data rather than one year behind, 2009-10, as has been the custom in the past?</p>

<p>The big implication of this is that Columbia, with its 30%-plus increase in applicants, may now be poised to move past one or two or all three of HYP by improving its selectivity measures over where it ranked last year, which was third.</p>

<p>Food for thought.</p>

<p>I would not have thought it feasible to pack so many faulty assumptions into one single post. But you did it! </p>

<p>Here are a few hints:</p>

<ol>
<li>Ask yourself which data USNews used in its PAST edition (Best Colleges 2011).</li>
<li>Ask yourself what weight the admission ratio carries in the selectivity index.</li>
<li>Check the rate of increase between 2009-10 and 2010-11 - it is less than THREE percent.</li>
<li>Ask yourself why the data for the students entering Fall 2011 is not available yet. Hint ... the school year has not started and summer melt is still ... on!</li>
<li>Will Bob Morse allow another mistake to impact the ranking of Columbia this year and allow the school to present smoke-and-mirrors data again?</li>
</ol>

<p>OP, so what?</p>

<p>
[quote]
Here are a few hints:

[/quote]
</p>

<ol>
<li>What will make the most money?</li>
<li>What will make the most money?</li>
<li>What will make the most money?</li>
<li>What will make the most money?</li>
<li>What will make the most money?!</li>
</ol>

<p>US News hasn't let a non-HYP school into the top 3 in the past 10 years. Over the past 20 years, only three schools have done it - Stanford twice, and MIT and Caltech once each, but the person who put Caltech at #1 and MIT at #3 one year was fired the next year, so that might not count. Discounting that anomalous year, in the life of the US News rankings, Stanford is the only school that's done it (5 times in total), which works out to roughly once every ten years or so. Since it's about time they did that again, does it seem likely that Columbia will be the choice? No. US News was already being generous to Columbia by putting it at #4 last year, since Columbia didn't start routinely breaking into the top 5-10 until very recently.</p>

<p>U.S. News Rankings Through the Years</p>

<p>In other words, 10- or 20-year trends seem more important to US News than the 'trend' (if you can call it that) of the past 2-3 years.</p>

<p>Maybe USN can go straight to print with just the selectivity data for 10-11 inputted. Waiting ... waiting ... waiting... those quarter schools are holding things up! ;)</p>

<p>And besides, the selectivity index is hardly affected by the acceptance rate. The key factors are SATs and the % of students in the top 10% of their high school class, which everyone fudges anyway. Many fudge the SATs too, on top of the apples-and-oranges comparison between schools that superscore and those that don't.</p>

<p>USN is still garbage...</p>

<p>
[QUOTE]
Over the past 20 years, only three schools have done it - Stanford twice, and MIT and Caltech once each.

[/QUOTE]
</p>

<p>Duke was also ranked #3 by US News in 1998 - tied with Yale. Not that I'm suggesting it means anything, just that four non-HYP schools have been ranked in the top 3 at least once. US News has flat out stated that they "know" their rankings are valid because HYP come out on top - a justification they've used when questioned about validity. They choose criteria and weights to ensure that outcome, and adjust them as needed so HYP are at the top every year - it fits their preconceived notions.</p>

<p>When deciding what will be #1 in certain categories, remember who has the power. The founding owner went to Princeton (often #1), and the current owner went to Penn's Wharton (#1 for business) and Harvard Law School (always near the top for law, currently #2) - and Harvard is often #1 for undergrad, too.</p>

<p>ooops, sorry vienna man...</p>

<p>Let's just say the collection of data by the u's trails significantly. And I don't think USNews would skip over a year just to be as up to date as possible, especially since admissions data doesn't change that drastically. I don't think Columbia's significant increase in apps will change the stats of those who matriculate.</p>

<p>You cannot have enrolled data until after students enroll. That's what is in the CDS. Most schools calculate that several weeks into the semester. Schools on the quarter system do not even start until late September.</p>

<p>Let's fix this thread:</p>

<p>vienna man asked:</p>

<p>
[quote]
Since this is the case [that USN will be released in Sept], does it mean that it will be using 2010-11 selectivity data rather than one year behind, 2009-10, as has been the custom in the past?

[/quote]
</p>

<p>which should be corrected to:</p>

<p>
[quote]
Since this is the case, does it mean that it will be using [2011-12] selectivity data rather than one year behind, [2010-11], as has been the custom in the past?

[/quote]
</p>

<p>xiggi followed with something to the effect that 'summer melt could still affect the enrollments for 2010-11' but meant 2011-12.</p>

<p>I glanced at xiggi's post and said what I said in post #6, with no time to edit, obviously meaning 2011-12 also. ;)</p>

<p>Overall, the 2012 USNWR Rankings of National U's, at its latest and best, has to settle for 2010-11 selectivity data, because 2011-12 matriculation data - and therefore the stats for each u's class of 2015 - will not be available at the time of publication in September 2011.</p>

<p>
[quote]
Maybe USN can go straight to print with just the selectivity data for 10-11 inputted. Waiting ... waiting ... waiting... those quarter schools are holding things up!

[/quote]
</p>

<p>Quarter schools have nothing to do with the decision by USNews to release the rankings in September. Delaying the release by a few weeks is actually a very good thing, as it makes the title of the special edition more meaningful. The Best Colleges of 2012 is indeed meant to help the students who apply in 2012 - with a few applying in the fall/winter of 2011.</p>

<p>The misnomer of USNews does not come from using older data, but from the title itself. The data used in the upcoming edition IS the most recent available. Schools have been releasing the CDS and filling out the surveys from October 2010 to ... this date. </p>

<p>As far as "inputting" data for the Fall of 2011 cohort, that is pure heresy. The numbers presented by the OP might be correct, but are subject to verification. It is not because TheChoice blog reported it that such numbers come close to the reality. Schools are still culling students from their waiting lists as students chenge their minds. </p>

<p>As far as Columbia goes, it will have its day in the 2013 edition. As it stands, it reaped enormous benefits from the USNews mistake in the report of the Peer Assessment. The rise in application numbers for the Fall of 2011 can be traced directly to last year's wrong ranking. So this will be a double whammy for Columbia: the year of the overly generous ranking and the year of the increase in applications. </p>

<p>This said, the rate of admission does not account for more than 1.5 percent of the total score. The obviously creative (read: fabricated) class-rank numbers reported by Columbia have a much larger impact.</p>
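
<p>For reference, the arithmetic behind that 1.5 percent - assuming the weights US News published around that time, with selectivity worth 15% of the total score and the acceptance rate worth 10% of the selectivity index: 0.15 x 0.10 = 0.015, i.e. 1.5% of the overall score. The SAT and class-rank components carry the rest of the 15%.</p>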

<p>
[quote]
Let's fix this thread:

[/quote]
</p>

<p>My posts did not need "fixing." </p>

<p>
[quote]
xiggi followed with something to the effect that 'summer melt could still affect the enrollments for 2010-11' but meant 2011-12.

[/quote]
</p>

<p>You need to pay closer attention. I did not write, nor did I mean, that the summer melt of 2011 impacts the class of 2010-11. </p>

<p>Try again! Read before commenting. It is much better than glancing! </p>

<p>Here it is:</p>

<p>
[quote]
Ask yourself why the data for the students entering Fall 2011 is not available yet. Hint ... the school year has not started and summer melt is still ... on!

[/quote]
</p>

<p>
[quote]
As far as "inputting" data for the Fall of 2011 cohort, that is pure heresy.

[/quote]

Wow, you take this seriously...</p>

<p>^^
liberal use of a word!</p>

<p>xiggi, what did you mean by "the USNews mistake in the report of the Peer Assessment"? Did Columbia have a big jump in PA, so that's why it got bumped up?</p>

<p>I had thought the jump in applications had more to do with their switch to the Common App, but its higher ranking (above schools like Stanford and MIT) probably also played a role.</p>

<p>Bob Morse and his staff messed up Stanford's calculations for one of the new Peer Assessments. This catapulted Columbia past Stanford.</p>

<p>USNews "corrected" the online report but, for obvious reasons, could not change the printed version nor retabulate the rankings.</p>

<p>Interesting - so Stanford's PA score dropped initially. Do you happen to know whether the corrected PA score is the same as in the past, or how far it had dropped before it was corrected?</p>

<p>I also just read that US News completely changed the PA (I'd heard about the counselors' survey, but not that it was incorporated by reducing the academics' weight to 15%), so that probably helped Columbia.</p>

<p>Last year, USNews introduced a few changes, including a change in the Peer Assessment. The resulting Academic Index seems to exhibit rather strange behavior. </p>

<p>The definition of the new Undergraduate academic reputation index intimates that it is a weighted average of the Peer Assessment score and the High School Counselor score. </p>

<p>But is it? </p>

<p>School - PA (out of 5) - GC (out of 5) - Undergraduate academic reputation index (100=highest)
Harvard - 4.9 - 4.9 - 98
Princeton - 4.9 - 4.9 - 98
MIT - 4.9 - 4.9 - 98
Yale - 4.8 - 4.9 - 97
Stanford - 4.9 - 4.9 - 93
Columbia - 4.6 - 4.8 - 93
Caltech - 4.6 - 4.6 - 92
Penn - 4.5 - 4.6 - 91</p>

<p>Inquiring minds would like to know what kind of weighting could be applied to the average to yield such results.</p>

<p>Here's a simple table showing that the results could come from simply adding the numbers:</p>

<p>School - PA - GC - TOTAL
MIT - 49 - 49 - 98
Yale - 48 - 49 - 97
Stanford - 49 - 49 - 93
Columbia - 46 - 48 - 93
Caltech - 46 - 46 - 92
Penn - 45 - 46 - 91</p>

<p>However, we know that the PA, at 15%, carries twice the weight of the GC score, which is 7.5%.</p>

<p>So, let's present a very simple table that doubles the weight of the PA.</p>

<p>School - PA - PA - GC - WEIGHTED TOTAL - ORIGINAL SCORE
MIT - 49 - 49 - 49 - 98.00 - 98
Yale - 48 - 48 - 49 - 96.67 - 97
Stanford - 49 - 49 - 49 - 98.00 - 93
Columbia - 46 - 46 - 48 - 93.33 - 93
Caltech - 46 - 46 - 46 - 92.00 - 92
Penn - 45 - 45 - 46 - 90.67 - 91</p>

<p>The numbers work pretty well, except for one school. One has to wonder: what did Stanford do to Morse?</p>
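
<p>For anyone who wants to replicate the check, here is a minimal Python sketch of the same computation. It assumes the index is simply round(100 * (2*PA + GC) / 15) - the PA double-weighted against the GC score, rescaled to 100 - which is my reverse-engineered guess, not any formula US News has published:</p>

<p>
[code]
# Recompute the reputation index as a weighted average of the Peer
# Assessment (PA, counted twice) and the Guidance Counselor (GC) score.
# The formula is an assumption reverse-engineered in this thread,
# not US News's published method.
scores = {
    # school: (PA out of 5, GC out of 5, published index)
    "Harvard":   (4.9, 4.9, 98),
    "Princeton": (4.9, 4.9, 98),
    "MIT":       (4.9, 4.9, 98),
    "Yale":      (4.8, 4.9, 97),
    "Stanford":  (4.9, 4.9, 93),
    "Columbia":  (4.6, 4.8, 93),
    "Caltech":   (4.6, 4.6, 92),
    "Penn":      (4.5, 4.6, 91),
}

for school, (pa, gc, published) in scores.items():
    # double-weight the PA, average the three "votes", rescale 5 -> 100
    index = round((2 * pa + gc) / 3 / 5 * 100)
    note = "" if index == published else "  <-- does not match"
    print(f"{school:9s} computed {index}  published {published}{note}")
[/code]
</p>

<p>Run as-is, every school matches its published index except Stanford, where the formula gives 98 against the printed 93 - exactly the five-point gap in the tables above.</p>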

<p>^ wow, you'd have thought that they'd double-check their numbers before tabulating their ranking. And when there's something that catches people by surprise (even Columbia people!), e.g. Columbia above Stanford, they'd parse it out to find what led to the difference. That's just strange.</p>

<p>Google didn't turn up any 'notice' about this, so I assume it wasn't a publicized error (a statement of "oops, my bad"), but rather some email exchange or another.</p>

<p>Either way, that doesn't bode well for Columbia next year. They couldn't possibly make the mistake again... unless Morse really does have it out for Stanford.</p>