<p>I thought that it would actually go up</p>
<p>Why is it falling?</p>
<p>It’s not falling; it has been fairly stable in USNWR for a few years, mostly because breaking into the top 20 is difficult and takes time. If you’re referring to how Emory is 20th now and used to be 18th for a few years, that doesn’t mean much: Emory’s score is 79 while ND and Vanderbilt (tied for 18th) have a score of 80. Nothing to be concerned about. If you’re referring to another ranking, let me know.</p>
<p>We actually should have been 16th in the rankings last year, but the administration misreported our alumni giving rate. No joke!</p>
<p>I think Emory was ranked 9th in 1998(!)
How was that possible back then?</p>
<p>Not sure about that, but really, the rankings are based on so many different factors that there may not be a real, tangible reason for why a school rises or falls in the rankings.</p>
<p>I just saw that article about the misreported alumni giving rate.
If that's true, will Emory go up a little bit this year?
BTW, when is the 2007 ranking coming out?</p>
<p>In August.</p>
<p>1998 was a fluke year in which they made graduate research a ridiculously highly weighted component. All of a sudden Caltech shot to #1, Hopkins was #7, and Emory was #9. They realized the ranking flaw, and the ranks quickly settled back to where they have usually been.</p>
<p>In the first year that USNWR published their rankings, Stanford was #1. The ups and downs for different colleges in the past had to do with the way that USNWR kept changing its formula rather than actual changes by the colleges. USNWR has stabilized lately.</p>
<p>You're way off base on this, slipper1234. What makes you think that research was ridiculously highly weighted? I'm pretty sure the only justification for that is that HYP weren't the top three, not any a priori argument from the weights themselves. Indeed, the system in 2000 (your year is off: <a href="http://thecenter.ufl.edu/usnewsranking.xls">http://thecenter.ufl.edu/usnewsranking.xls</a>) simply changed to using the total spending amount instead of the relative ranking of schools for spending.</p>
<p>I.e., how it was before 2000 versus how it was after:
<a href="http://www.washingtonmonthly.com/features/2000/0009.thompson.html">http://www.washingtonmonthly.com/features/2000/0009.thompson.html</a></p>
<p>In 2000, they simply gave Caltech full credit for its dollars per student instead of simply crediting it as the top spender. Now, is that scientifically a more valid way of ranking institutions? It certainly is more intuitive: why should the rank order between schools matter as opposed to the actual dollars spent per student? You could still argue that the weights are off, but the point is that there isn't a scientific, methodical system of weighting; it includes quite a bit of arbitrariness. I would argue that the weights are chosen in order to make sure the "correct" schools end up on top.</p>
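<p>To make the methodology change concrete, here is a toy sketch with entirely made-up spending figures (the school names and dollar amounts are hypothetical, not real USNWR data). It contrasts rank-based credit, where only the ordering matters, with value-based credit, where the score is proportional to dollars spent, so an outlier top spender pulls far ahead:</p>

```python
# Made-up per-student spending for three hypothetical schools.
spending = {"A": 120000, "B": 60000, "C": 50000}

# Rank-based credit (pre-2000 style): only the ordering matters.
# Top school gets 1.0, next gets 2/3, then 1/3 -- evenly spaced.
ranked = sorted(spending, key=spending.get, reverse=True)
n = len(ranked)
rank_score = {school: (n - i) / n for i, school in enumerate(ranked)}

# Value-based credit (post-2000 style): score proportional to dollars,
# so a school spending twice as much gets twice the credit.
top = max(spending.values())
value_score = {school: amount / top for school, amount in spending.items()}

# Under rank-based scoring, B sits a fixed 1/3 below A;
# under value-based scoring, B drops to half of A's credit
# because A spends twice as much per student.
print(rank_score)
print(value_score)
```

<p>The gap this opens up is exactly the effect described above: a big outlier spender gains much more under the dollar-proportional scheme than under the rank-based one, while everyone else's scores get compressed downward.</p>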
<p>From the same article:
So my point isn't that the ratings are useless, but that you should be aware that the first 5-10 spots are essentially used to calibrate the rankings to common perceptions. Consequently, when you say that they "realized the rankings were flawed," it had little to do with the methodology being unscientific or flawed, but rather with the results of the rankings not aligning with the general reader's perceptions.</p>