<p>I just looked at the Princeton Review ranking of schools. My T just left for a school that was ranked as most beautiful campus last year and now is nowhere on the list. It looks to me as if this is all very manipulative. I don't understand how one year it is number one and the next it doesn't make the list. </p>
<p>Now that I am preparing for my J (a senior in HS now), I am very curious as to what everyone else thinks of these lists.</p>
<p>The lists are helpful in that they can give you a ballpark estimate of what the typical admitted student looks like, and how many of these typical students are admitted. (example: 1 in 3 or 33%) We were more interested in things like freshman retention rate and class size. Other people are more interested in "most Greeks" or "best parties." </p>
<p>The problem is that the data they use is over a year old, so the figures you see in this year's lists reflect the applicants that are now the class of 2008. For some schools, this data has changed quite a bit. A school that gets "hot" and is inundated with applications may be more selective one year, and it may drop off the next.</p>
<p>The data for all ranking systems are over a year old, including U.S. News. So that's nothing unique to PR. What I think produces some of the volatility in PR's rankings in the various categories is that they draw information from surveys of small samples of students, and sampling error alone can produce a lot of seeming change. It would be better if they combined results for several years, in fact, because frankly schools just don't change that much from year to year.</p>
<p>They are one source of information....the individual sets of data are perhaps more useful than the rankings, since rankings can and do move so often....it is unlikely that colleges get so much better or so much worse in the same period of time....a school's general placement in the rankings is about as much validity as you can get. UC Davis staff made a clerical error reporting staff statistics to US News and the school dropped several places in the rankings. I notice the Princeton Review statistics on Davis are wrong; for example, they have Davis as DII, but it is now a DI athletic program.</p>
<p>Not making the list didn't make your T's school ugly all of a sudden (has there been a twister on campus lately?).</p>
<p>Nothing is better than a good visit. When we looked at schools I also searched newspaper articles, police reports, and resumes of people who went to the school...so we could have a pretty good idea of the range of offerings. Good luck.</p>
<p>The subjective rankings are based on student responses; I suspect the category rankings are often fluid for that reason--it's a matter of whoever feels like filling out a form, and in some years schools may have more form-fillers than others. I have noticed abrupt changes in rankings in the quality-of-life categories that have little or nothing to do with the reality of the experience and presumably a lot to do with who (and how many) responded.</p>
<p>Somewhere last year there was an article on how the PR reviews are done, which is basically by grabbing a few students at each place and asking their opinions. So, maybe they found some aesthetically challenged students at your son's college last year?</p>
<p>One of the favorites is the ranking of best college town or some such, which ranked Barnard much higher than Columbia. They're right across the street from each other.</p>
<p>Right, that's my point. They use small samples, and the methodology is, at best, suspect. The only reason I would place much stock in the results is that my alma mater came out so high this year. But basically those student ratings should be read as "just fun" and not as scientifically based.</p>
<p>No, the campus is still beautiful. But that is my point: why do these lists vacillate so much? Having to go through the college process two years in a row just makes it so much more apparent.</p>