<p>Siserune, I’m not sure what was so vague in my comment.</p>
<p>For starters, it was a direct reply to your statement: “Most PA-free rankings correlate with US News, so it would hardly be earth-shaking to drop PA or change its weight.” My words were, “Siserune, the correlation of the PA and the ranking is circular, as the impact on the final rankings is so important.”</p>
<p>Is there really a need to explain this further? How hard is it to see that the PA, with its 25% weight, represents a major component and, unfortunately I may add, obliterates much of the more subtle differences that originate from the remaining 75%? The result is that a higher PA yields a higher total ranking, making the correlation circular, since it is exactly that higher PA that caused the higher ranking. </p>
<p>For what it is worth, the choice of weighting the PA at 25% is not an accident. It is a deliberate choice by US News, since it allows them to retool the final rankings to stabilize (or destabilize) them at will. The fact that the so-called “data” is compiled by an outside firm is not as important as the fact that US News decides how to numerically handicap the schools. Again, the reasons were clearly expressed by Morse: the PA’s purpose is to HELP the public schools that lag their “peer” private schools in almost all other categories. Morse calls it rewarding the intangibles! </p>
<p>Regarding your view that the PA aggregates a host of factors not otherwise included in the US News formula: I would be a lot happier with the PA if that is what it actually DID! All we know is that US News sends out a survey and collates results that range from honest and complete to entirely whimsical, as many respondents have admitted. </p>
<p>Further, since you believe that the selectivity ranking should be revised (although the acceptance rate accounts for only 10% of the selectivity score and 1.5% of the total), why not apply the same logic to the PA? Why not break down the darn 25% into five to fifteen categories and let the WORLD see how the schools stack up, WHAT exactly is measured, and how? This would be best accomplished by splitting the rankings into two categories. The “new” PA ranking could be presented in the same manner as the rankings are done today: a score and many neat columns that illustrate how the final score is built. This way we would not have to debate ad nauseam whether the PA measures quality of education, dedication to teaching, or simply a bunch of fuzzy factors ranging from the surveyee’s nostalgic thoughts to the latest performance in the NCAA championship, or the fact that Aunt Irma attended the school in the fifties. </p>
<p>While I would prefer two SEPARATE rankings, I realize that this will never happen, as Morse won’t relinquish his “stabilizing” tool. I’d be happy just to see the breakdown of the PA and have sufficient information to “back it out” of the overall rankings myself. </p>
<p>A good start would be to offer this ability to subscribers of the online version. There is little doubt that the technology exists. All that would be needed is the ability to assign a dummy value to the PA and re-rank the schools. </p>
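<p>To illustrate what “assign a dummy value to the PA and re-rank” would take, here is a minimal sketch. The school names, scores, and the simplified two-part composite (25% PA plus 75% everything else) are entirely made up for illustration; the real US News formula has many more components.</p>

```python
# Hypothetical sketch: recompute a US News-style composite with the
# 25%-weighted PA replaced by a uniform dummy value, then re-rank.
# All names and numbers below are invented for illustration.

PA_WEIGHT = 0.25

# (name, PA score on a 0-100 scale, composite of the other 75% of criteria)
schools = [
    ("School A", 95.0, 78.0),
    ("School B", 60.0, 88.0),
    ("School C", 75.0, 84.0),
]

def rank(pa_override=None):
    """Rank schools by 0.25*PA + 0.75*rest. If pa_override is given,
    every school gets that same PA value, neutralizing its effect."""
    def score(school):
        name, pa, rest = school
        if pa_override is not None:
            pa = pa_override
        return PA_WEIGHT * pa + (1 - PA_WEIGHT) * rest
    return [name for name, _, _ in sorted(schools, key=score, reverse=True)]

print(rank())                   # ranking with the real PA included
print(rank(pa_override=50.0))   # ranking with the PA neutralized
```

<p>With these invented numbers, School A leads when its high PA counts, but drops to last once the PA is held constant for everyone, which is exactly the kind of before-and-after view a subscriber tool could expose.</p>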
<p>Wouldn’t that make everyone happy?</p>
<p>PS: “PA reviewers answering based on the previous year’s US News ratings in filling their questionnaires” is exactly what I think happens. Of course, that assumes the reviewers who sign the survey are the ones actually filling it in, and not a secretary.</p>